Dec 03 06:48:57 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 06:48:57 crc restorecon[4813]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:48:57 crc restorecon[4813]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 06:48:57 crc restorecon[4813]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc 
restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:57 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 
crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:48:58 crc restorecon[4813]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 
crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc 
restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:48:58 crc restorecon[4813]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 06:48:58 crc kubenswrapper[4947]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 06:48:58 crc kubenswrapper[4947]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 06:48:58 crc kubenswrapper[4947]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 06:48:58 crc kubenswrapper[4947]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 06:48:58 crc kubenswrapper[4947]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 06:48:58 crc kubenswrapper[4947]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.871970 4947 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874862 4947 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874888 4947 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874893 4947 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874897 4947 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874902 4947 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874906 4947 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874910 4947 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874916 4947 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874922 4947 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874928 4947 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874933 4947 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874938 4947 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874941 4947 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874945 4947 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874949 4947 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874953 4947 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874957 4947 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874961 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874964 4947 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874968 4947 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874973 4947 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874977 4947 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874981 4947 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874985 4947 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874989 4947 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874993 4947 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.874996 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875000 4947 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875003 4947 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875007 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875012 4947 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875015 4947 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875019 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875023 4947 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875027 4947 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875030 4947 feature_gate.go:330] 
unrecognized feature gate: NetworkLiveMigration Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875034 4947 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875047 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875053 4947 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875057 4947 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875061 4947 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875064 4947 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875068 4947 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875072 4947 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875075 4947 feature_gate.go:330] unrecognized feature gate: Example Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875079 4947 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875082 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875086 4947 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875094 4947 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875098 4947 feature_gate.go:330] unrecognized feature gate: 
AdminNetworkPolicy Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875101 4947 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875105 4947 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875109 4947 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875113 4947 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875117 4947 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875120 4947 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875124 4947 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875127 4947 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875130 4947 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875134 4947 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875137 4947 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875141 4947 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875144 4947 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875148 4947 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 06:48:58 crc kubenswrapper[4947]: 
W1203 06:48:58.875151 4947 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875156 4947 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875159 4947 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875162 4947 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875166 4947 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875169 4947 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.875178 4947 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875421 4947 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875431 4947 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875452 4947 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875457 4947 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875462 4947 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875467 4947 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875472 4947 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875477 4947 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875481 4947 flags.go:64] FLAG: 
--authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875508 4947 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875515 4947 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875520 4947 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875543 4947 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875548 4947 flags.go:64] FLAG: --cgroup-root="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875552 4947 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875556 4947 flags.go:64] FLAG: --client-ca-file="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875560 4947 flags.go:64] FLAG: --cloud-config="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875563 4947 flags.go:64] FLAG: --cloud-provider="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875567 4947 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875576 4947 flags.go:64] FLAG: --cluster-domain="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875580 4947 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875584 4947 flags.go:64] FLAG: --config-dir="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875587 4947 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875592 4947 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875597 4947 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875601 4947 flags.go:64] 
FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875605 4947 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875609 4947 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875613 4947 flags.go:64] FLAG: --contention-profiling="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875617 4947 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875621 4947 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875627 4947 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875631 4947 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875637 4947 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875641 4947 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875646 4947 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875650 4947 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875654 4947 flags.go:64] FLAG: --enable-server="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875664 4947 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875672 4947 flags.go:64] FLAG: --event-burst="100" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875676 4947 flags.go:64] FLAG: --event-qps="50" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875680 4947 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 
06:48:58.875685 4947 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875689 4947 flags.go:64] FLAG: --eviction-hard="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875694 4947 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875698 4947 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875701 4947 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875706 4947 flags.go:64] FLAG: --eviction-soft="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875709 4947 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875713 4947 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875718 4947 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875722 4947 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875725 4947 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875729 4947 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875733 4947 flags.go:64] FLAG: --feature-gates="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875738 4947 flags.go:64] FLAG: --file-check-frequency="20s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875742 4947 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875747 4947 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875752 4947 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 06:48:58 crc 
kubenswrapper[4947]: I1203 06:48:58.875757 4947 flags.go:64] FLAG: --healthz-port="10248" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875762 4947 flags.go:64] FLAG: --help="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875767 4947 flags.go:64] FLAG: --hostname-override="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875772 4947 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875779 4947 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875784 4947 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875789 4947 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875794 4947 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875799 4947 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875803 4947 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875808 4947 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875813 4947 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875818 4947 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875824 4947 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875829 4947 flags.go:64] FLAG: --kube-reserved="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875842 4947 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875847 4947 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 06:48:58 crc 
kubenswrapper[4947]: I1203 06:48:58.875853 4947 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875857 4947 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875861 4947 flags.go:64] FLAG: --lock-file="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875865 4947 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875869 4947 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875873 4947 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875881 4947 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875886 4947 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875890 4947 flags.go:64] FLAG: --log-text-split-stream="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875894 4947 flags.go:64] FLAG: --logging-format="text" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875899 4947 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875903 4947 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875907 4947 flags.go:64] FLAG: --manifest-url="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875911 4947 flags.go:64] FLAG: --manifest-url-header="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875917 4947 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875921 4947 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875926 4947 flags.go:64] FLAG: --max-pods="110" Dec 03 06:48:58 crc 
kubenswrapper[4947]: I1203 06:48:58.875930 4947 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875934 4947 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875940 4947 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875944 4947 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875948 4947 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875952 4947 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875956 4947 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875965 4947 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875969 4947 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875974 4947 flags.go:64] FLAG: --oom-score-adj="-999" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875978 4947 flags.go:64] FLAG: --pod-cidr="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875981 4947 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875988 4947 flags.go:64] FLAG: --pod-manifest-path="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875992 4947 flags.go:64] FLAG: --pod-max-pids="-1" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.875996 4947 flags.go:64] FLAG: --pods-per-core="0" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876000 4947 
flags.go:64] FLAG: --port="10250" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876004 4947 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876016 4947 flags.go:64] FLAG: --provider-id="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876021 4947 flags.go:64] FLAG: --qos-reserved="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876025 4947 flags.go:64] FLAG: --read-only-port="10255" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876029 4947 flags.go:64] FLAG: --register-node="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876033 4947 flags.go:64] FLAG: --register-schedulable="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876037 4947 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876043 4947 flags.go:64] FLAG: --registry-burst="10" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876047 4947 flags.go:64] FLAG: --registry-qps="5" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876051 4947 flags.go:64] FLAG: --reserved-cpus="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876055 4947 flags.go:64] FLAG: --reserved-memory="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876060 4947 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876064 4947 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876068 4947 flags.go:64] FLAG: --rotate-certificates="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876072 4947 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876076 4947 flags.go:64] FLAG: --runonce="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876080 4947 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876084 4947 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876088 4947 flags.go:64] FLAG: --seccomp-default="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876093 4947 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876097 4947 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876102 4947 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876106 4947 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876110 4947 flags.go:64] FLAG: --storage-driver-password="root" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876114 4947 flags.go:64] FLAG: --storage-driver-secure="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876117 4947 flags.go:64] FLAG: --storage-driver-table="stats" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876121 4947 flags.go:64] FLAG: --storage-driver-user="root" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876125 4947 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876129 4947 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876133 4947 flags.go:64] FLAG: --system-cgroups="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876137 4947 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876143 4947 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876164 4947 flags.go:64] FLAG: --tls-cert-file="" Dec 03 06:48:58 crc 
kubenswrapper[4947]: I1203 06:48:58.876169 4947 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876177 4947 flags.go:64] FLAG: --tls-min-version="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876181 4947 flags.go:64] FLAG: --tls-private-key-file="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876185 4947 flags.go:64] FLAG: --topology-manager-policy="none" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876195 4947 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876199 4947 flags.go:64] FLAG: --topology-manager-scope="container" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876203 4947 flags.go:64] FLAG: --v="2" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876209 4947 flags.go:64] FLAG: --version="false" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876214 4947 flags.go:64] FLAG: --vmodule="" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876219 4947 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.876224 4947 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877441 4947 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877544 4947 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877556 4947 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877566 4947 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877575 4947 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 
06:48:58.877588 4947 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877601 4947 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877623 4947 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877637 4947 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877647 4947 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877657 4947 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877668 4947 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877692 4947 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877703 4947 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877713 4947 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877723 4947 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877733 4947 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877743 4947 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877752 4947 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877763 4947 feature_gate.go:330] 
unrecognized feature gate: EtcdBackendQuota Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877772 4947 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877782 4947 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877792 4947 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877803 4947 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877814 4947 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877833 4947 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877851 4947 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877869 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877881 4947 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877892 4947 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877906 4947 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877918 4947 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877929 4947 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877941 4947 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877951 4947 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877961 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877974 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.877993 4947 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878003 4947 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878013 4947 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878023 4947 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 06:48:58 crc 
kubenswrapper[4947]: W1203 06:48:58.878034 4947 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878044 4947 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878054 4947 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878064 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878078 4947 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878092 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878105 4947 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878116 4947 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878126 4947 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878145 4947 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878159 4947 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878171 4947 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878186 4947 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878228 4947 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878245 4947 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878258 4947 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878310 4947 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878327 4947 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878336 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878666 4947 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878736 4947 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878749 4947 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878764 4947 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878777 4947 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878790 4947 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878801 4947 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878811 4947 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878822 4947 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878833 4947 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.878846 4947 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.878868 4947 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.890088 4947 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.890121 4947 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890263 4947 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890275 4947 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890285 4947 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890293 4947 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890302 4947 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890312 4947 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890321 4947 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890329 4947 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890340 4947 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890351 4947 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890361 4947 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890371 4947 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890381 4947 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890392 4947 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890400 4947 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890409 4947 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890417 4947 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890426 4947 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890434 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890442 4947 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890450 4947 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890458 4947 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890466 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890474 4947 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890482 4947 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890523 4947 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890535 4947 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890545 4947 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890555 4947 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890564 4947 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890576 4947 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890587 4947 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890595 4947 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890604 4947 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890613 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890621 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890630 4947 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890639 4947 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890647 4947 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890657 4947 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890665 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890673 4947 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890681 4947 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890689 4947 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890697 4947 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890704 4947 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890712 4947 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890720 4947 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890727 4947 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890735 4947 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890743 4947 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890750 4947 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890758 4947 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890766 4947 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890773 4947 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890781 4947 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890788 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890796 4947 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890807 4947 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890818 4947 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890828 4947 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890837 4947 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890848 4947 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890857 4947 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890866 4947 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890875 4947 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890885 4947 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890894 4947 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890902 4947 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890910 4947 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.890917 4947 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.890931 4947 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891155 4947 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891167 4947 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891175 4947 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891184 4947 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891192 4947 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891200 4947 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891209 4947 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891217 4947 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891225 4947 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891232 4947 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891240 4947 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891249 4947 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891256 4947 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891264 4947 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891272 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891279 4947 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891287 4947 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891295 4947 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891302 4947 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891311 4947 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891318 4947 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891326 4947 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891335 4947 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891343 4947 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891351 4947 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891359 4947 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891367 4947 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891375 4947 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891382 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891390 4947 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891398 4947 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891408 4947 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891419 4947 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891427 4947 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891437 4947 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891445 4947 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891453 4947 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891464 4947 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891472 4947 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891481 4947 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891522 4947 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891531 4947 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891540 4947 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891548 4947 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891556 4947 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891563 4947 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891571 4947 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891578 4947 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891586 4947 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891594 4947 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891601 4947 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891609 4947 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891616 4947 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891625 4947 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891633 4947 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891642 4947 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891649 4947 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891657 4947 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891665 4947 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891673 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891680 4947 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891691 4947 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891700 4947 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891710 4947 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891720 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891728 4947 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891737 4947 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891745 4947 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891753 4947 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891762 4947 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.891771 4947 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.891784 4947 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.892048 4947 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.897109 4947 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.897265 4947 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.898106 4947 server.go:997] "Starting client certificate rotation"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.898143 4947 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.898367 4947 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-11 08:25:08.186511949 +0000 UTC
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.898460 4947 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 193h36m9.288056468s for next certificate rotation
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.905552 4947 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.909354 4947 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.924891 4947 log.go:25] "Validated CRI v1 runtime API"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.946286 4947 log.go:25] "Validated CRI v1 image API"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.948746 4947 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.952905 4947 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-06-41-26-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.952951 4947 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.986848 4947 manager.go:217] Machine: {Timestamp:2025-12-03 06:48:58.984590678 +0000 UTC m=+0.245545164 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:0621c3b3-388e-465f-be28-bb0fb0b39611 BootID:c3feb53c-ad63-4e2a-a506-87d0a873eb31 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:62:ae:f9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:62:ae:f9 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:ed:bc:5a Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:7c:e1:41 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:29:f9:6b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:60:72:3f Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:26:0b:60 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:ff:5b:54 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ce:e4:57:1e:53:a9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8e:ca:96:6b:88:be Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.987234 4947 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.987443 4947 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.988192 4947 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.988581 4947 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.988646 4947 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.988970 4947 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.988988 4947 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.989289 4947 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.989354 4947 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.989817 4947 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.990419 4947 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.991488 4947 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.991616 4947 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.991656 4947 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.991680 4947 kubelet.go:324] "Adding apiserver pod source"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.991698 4947 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.994276 4947 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.994961 4947 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.995808 4947 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused
Dec 03 06:48:58 crc kubenswrapper[4947]: W1203 06:48:58.995806 4947 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused
Dec 03 06:48:58 crc kubenswrapper[4947]: E1203 06:48:58.995954 4947 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError"
Dec 03 06:48:58 crc kubenswrapper[4947]: E1203 06:48:58.995985 4947 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError"
Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.996335 4947 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 03 06:48:58 
crc kubenswrapper[4947]: I1203 06:48:58.997236 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997280 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997295 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997311 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997334 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997349 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997366 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997393 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997414 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997429 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997448 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997462 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.997786 4947 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.998620 4947 server.go:1280] "Started kubelet" Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 
06:48:58.998803 4947 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.999234 4947 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 06:48:58 crc kubenswrapper[4947]: I1203 06:48:58.999371 4947 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.000734 4947 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 06:48:59 crc systemd[1]: Started Kubernetes Kubelet. Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.002321 4947 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187da1cee651bb44 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:48:58.998553412 +0000 UTC m=+0.259507898,LastTimestamp:2025-12-03 06:48:58.998553412 +0000 UTC m=+0.259507898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.002918 4947 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.002994 4947 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 
06:48:59.003216 4947 server.go:460] "Adding debug handlers to kubelet server" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.003258 4947 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 13:54:05.229509963 +0000 UTC Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.007363 4947 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.007402 4947 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.009737 4947 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.012353 4947 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.013127 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="200ms" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.013360 4947 factory.go:55] Registering systemd factory Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.013448 4947 factory.go:221] Registration of the systemd container factory successfully Dec 03 06:48:59 crc kubenswrapper[4947]: W1203 06:48:59.013624 4947 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.013717 4947 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to 
list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.014288 4947 factory.go:153] Registering CRI-O factory Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.014348 4947 factory.go:221] Registration of the crio container factory successfully Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.014445 4947 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.014516 4947 factory.go:103] Registering Raw factory Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.014574 4947 manager.go:1196] Started watching for new ooms in manager Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.024449 4947 manager.go:319] Starting recovery of all containers Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.027331 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.027982 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.028149 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.028257 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.028369 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.028468 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.028618 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.028733 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.028846 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.028955 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029061 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029169 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029270 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029381 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029476 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" 
seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029616 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029693 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029786 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029866 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.029957 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030051 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 
06:48:59.030130 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030217 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030303 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030411 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030510 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030597 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030687 4947 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030764 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030849 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.030937 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031037 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031131 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031208 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031289 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031366 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031443 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031592 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031687 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031769 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031845 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.031922 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032013 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032127 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032212 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032288 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032364 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032439 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032547 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032628 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032701 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032777 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" 
seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032857 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.032940 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033047 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033132 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033207 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033315 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033397 4947 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033507 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033601 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033678 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033755 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033834 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.033921 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034013 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034093 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034174 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034285 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034369 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034455 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034553 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034631 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034706 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034781 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034869 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.034955 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035053 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035133 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035207 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035296 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035375 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035458 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035558 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035638 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035726 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035812 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.035910 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036003 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036118 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036196 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036283 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036368 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036443 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036537 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036617 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036706 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036786 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036862 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.036944 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037051 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037153 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037233 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037312 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037395 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037473 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037576 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037671 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037751 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037835 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.037916 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038007 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038097 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038179 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038256 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038337 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038415 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038524 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038606 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038680 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038754 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038827 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.038909 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.039013 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.039095 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.039173 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.039247 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040076 4947 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040202 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040288 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040364 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040438 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040535 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040627 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040716 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040792 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040869 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.040951 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041050 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041129 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041204 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041279 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041355 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041431 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041534 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041627 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041714 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041791 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041869 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.041963 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042054 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042132 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042207 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042280 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042363 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042446 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042544 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042624 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042699 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042775 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042863 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.042947 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043023 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043098 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043183 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043268 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043346 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043422 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043516 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043631 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043848 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.043947 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.044055 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.044144 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.044239 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.044335 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.044452 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.044633 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.044742 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.044844 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.044949 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.045069 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.045165 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.045242 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.046458 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.046610 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.046735 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.046851 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config"
seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.046948 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047052 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047142 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047229 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047308 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047382 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 03 06:48:59 
crc kubenswrapper[4947]: I1203 06:48:59.047460 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047573 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047656 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047739 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047820 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047902 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.047997 
4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.048150 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.048248 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.048344 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.048451 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.048669 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.048777 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.048897 4947 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.049002 4947 reconstruct.go:97] "Volume reconstruction finished" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.049096 4947 reconciler.go:26] "Reconciler: start to sync state" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.061803 4947 manager.go:324] Recovery completed Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.075904 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.079567 4947 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.080866 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.080928 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.080947 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.081700 4947 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.081745 4947 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.081768 4947 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.081820 4947 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 06:48:59 crc kubenswrapper[4947]: W1203 06:48:59.085157 4947 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.085235 4947 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.085425 4947 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.085447 4947 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.085472 4947 state_mem.go:36] "Initialized new in-memory state store" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.100043 4947 policy_none.go:49] "None policy: Start" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.100800 4947 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.100827 4947 state_mem.go:35] "Initializing new in-memory state store" Dec 03 06:48:59 crc 
kubenswrapper[4947]: E1203 06:48:59.112970 4947 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.157961 4947 manager.go:334] "Starting Device Plugin manager" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.158678 4947 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.158728 4947 server.go:79] "Starting device plugin registration server" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.159565 4947 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.159601 4947 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.159909 4947 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.160148 4947 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.160169 4947 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.171304 4947 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.182568 4947 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.182685 4947 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.185183 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.185259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.185301 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.185545 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.185936 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.185994 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.187029 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.187079 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.187098 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.187208 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.187253 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc 
kubenswrapper[4947]: I1203 06:48:59.187277 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.187314 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.187732 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.187783 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.189029 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.189058 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.189069 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.189156 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.189174 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.189212 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.189395 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.189846 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.189947 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.191291 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.191345 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.191365 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.191569 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.191705 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.191755 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.192427 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.192519 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.192547 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.193235 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.193274 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.193292 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.193244 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.193359 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.193370 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.193543 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.193584 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.194469 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.194519 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.194542 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.214009 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="400ms" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252132 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252186 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252225 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252258 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252317 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252432 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252559 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252609 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252656 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252710 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252786 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252826 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252888 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252926 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.252965 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.260483 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.261925 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.261991 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.262015 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.262054 4947 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.262664 4947 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Dec 03 06:48:59 crc 
kubenswrapper[4947]: I1203 06:48:59.353767 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.353833 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.353866 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.353888 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.353911 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.353946 4947 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.353966 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.353985 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354005 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354024 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354044 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354066 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354088 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354107 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354125 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354131 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354251 4947 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354289 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354344 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354034 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354406 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354455 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354534 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354538 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354576 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354599 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354607 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354105 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354630 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.354729 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.463238 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.464654 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.464705 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.464717 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.464744 4947 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.465117 4947 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Dec 03 
06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.520974 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.528397 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.548767 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: W1203 06:48:59.559805 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e06db73482dd5360788de5fa8c8cf39954d51152591e866b44b577d89a5b3754 WatchSource:0}: Error finding container e06db73482dd5360788de5fa8c8cf39954d51152591e866b44b577d89a5b3754: Status 404 returned error can't find the container with id e06db73482dd5360788de5fa8c8cf39954d51152591e866b44b577d89a5b3754 Dec 03 06:48:59 crc kubenswrapper[4947]: W1203 06:48:59.562737 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-10a3b17ac3d96aa45e7be91efd010c75a1b68e30d2564e94d0ee123f8bbe360f WatchSource:0}: Error finding container 10a3b17ac3d96aa45e7be91efd010c75a1b68e30d2564e94d0ee123f8bbe360f: Status 404 returned error can't find the container with id 10a3b17ac3d96aa45e7be91efd010c75a1b68e30d2564e94d0ee123f8bbe360f Dec 03 06:48:59 crc kubenswrapper[4947]: W1203 06:48:59.567576 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6f9f54dcac9351ca266392e033e00086cfcef5e168e0e4fdf51496d0d013a9ee WatchSource:0}: Error finding container 
6f9f54dcac9351ca266392e033e00086cfcef5e168e0e4fdf51496d0d013a9ee: Status 404 returned error can't find the container with id 6f9f54dcac9351ca266392e033e00086cfcef5e168e0e4fdf51496d0d013a9ee Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.586020 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.591220 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.614743 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="800ms" Dec 03 06:48:59 crc kubenswrapper[4947]: W1203 06:48:59.615907 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-dfd946afa4bdb0bf42b4dccc258664ac25fab036631e4489ba9b7fddb9d38328 WatchSource:0}: Error finding container dfd946afa4bdb0bf42b4dccc258664ac25fab036631e4489ba9b7fddb9d38328: Status 404 returned error can't find the container with id dfd946afa4bdb0bf42b4dccc258664ac25fab036631e4489ba9b7fddb9d38328 Dec 03 06:48:59 crc kubenswrapper[4947]: W1203 06:48:59.617120 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-d8d810345f0260d413902298b8446c54f6de478d91e3ee62f36d292ef604b4a3 WatchSource:0}: Error finding container d8d810345f0260d413902298b8446c54f6de478d91e3ee62f36d292ef604b4a3: Status 404 returned error can't find the container with id d8d810345f0260d413902298b8446c54f6de478d91e3ee62f36d292ef604b4a3 
Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.865432 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.866863 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.866924 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.866942 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:48:59 crc kubenswrapper[4947]: I1203 06:48:59.866978 4947 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:48:59 crc kubenswrapper[4947]: E1203 06:48:59.867536 4947 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.000263 4947 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.004322 4947 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 17:10:41.393114291 +0000 UTC Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.004382 4947 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 562h21m41.388734557s for next certificate rotation Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.093311 4947 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433" exitCode=0 Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.093436 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.093636 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"10a3b17ac3d96aa45e7be91efd010c75a1b68e30d2564e94d0ee123f8bbe360f"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.093785 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.095991 4947 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae" exitCode=0 Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.096089 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.096138 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e06db73482dd5360788de5fa8c8cf39954d51152591e866b44b577d89a5b3754"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.096219 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:00 crc kubenswrapper[4947]: 
I1203 06:49:00.096582 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.096617 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.096631 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.097724 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.097756 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.097767 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.099670 4947 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85" exitCode=0 Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.099840 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.099875 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d8d810345f0260d413902298b8446c54f6de478d91e3ee62f36d292ef604b4a3"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.100038 4947 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.102027 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.102669 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.102702 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.102723 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.104341 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.104431 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.104478 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.105469 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.105567 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dfd946afa4bdb0bf42b4dccc258664ac25fab036631e4489ba9b7fddb9d38328"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.108461 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9d474285ad8de913716408fee87d9f1b033e7675b8ad9f6eabc8c4e240db800e"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.108851 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.107639 4947 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9d474285ad8de913716408fee87d9f1b033e7675b8ad9f6eabc8c4e240db800e" exitCode=0 Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.108956 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6f9f54dcac9351ca266392e033e00086cfcef5e168e0e4fdf51496d0d013a9ee"} Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.110755 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.110790 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.110800 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:00 crc kubenswrapper[4947]: W1203 06:49:00.175970 4947 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 03 06:49:00 crc kubenswrapper[4947]: E1203 06:49:00.176063 4947 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:49:00 crc kubenswrapper[4947]: W1203 06:49:00.216346 4947 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 03 06:49:00 crc kubenswrapper[4947]: E1203 06:49:00.216451 4947 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:49:00 crc kubenswrapper[4947]: W1203 06:49:00.369195 4947 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 03 06:49:00 crc kubenswrapper[4947]: E1203 06:49:00.369299 4947 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:49:00 crc kubenswrapper[4947]: E1203 06:49:00.415852 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="1.6s" Dec 03 06:49:00 crc 
kubenswrapper[4947]: W1203 06:49:00.438482 4947 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.196:6443: connect: connection refused Dec 03 06:49:00 crc kubenswrapper[4947]: E1203 06:49:00.438619 4947 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.196:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.667816 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.669895 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.669938 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.669979 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:00 crc kubenswrapper[4947]: I1203 06:49:00.670009 4947 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:49:00 crc kubenswrapper[4947]: E1203 06:49:00.671468 4947 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.196:6443: connect: connection refused" node="crc" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.117326 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.117377 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.117391 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.117507 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.118636 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.118679 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.118689 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.121539 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.121570 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.121581 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.121658 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.124311 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.124356 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.124365 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.138882 4947 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ad721644e4e7ccf3bd711c5934ad7d964a9f39c3125c6762ec4319a1a6a31ab3" exitCode=0 Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.138970 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ad721644e4e7ccf3bd711c5934ad7d964a9f39c3125c6762ec4319a1a6a31ab3"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.139185 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 
06:49:01.140467 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.140526 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.140542 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.142688 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.142748 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.142764 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.142778 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.145265 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3e67e837d991d917faf834bdf4fc01bbec144ff7cd174f5c185e2813904beb2f"} Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.145416 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.146266 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.146296 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.146308 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:01 crc kubenswrapper[4947]: I1203 06:49:01.882801 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.151690 4947 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cf86847a1fef617f426feb864faaceb5e48e76addd7dc5d6f3045bbb8b710b3f" exitCode=0 Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.151778 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cf86847a1fef617f426feb864faaceb5e48e76addd7dc5d6f3045bbb8b710b3f"} Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.151859 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.152994 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.153055 4947 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.153082 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.157119 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336"} Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.157176 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.157154 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.158734 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.158796 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.158819 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.158912 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.158968 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.158988 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 
06:49:02.272108 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.273942 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.273991 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.274008 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.274040 4947 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.450519 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.450861 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.452757 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.452848 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:02 crc kubenswrapper[4947]: I1203 06:49:02.452870 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.167156 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd8ed45e16ee5d22d7bd0cb336c843935b31f8f86ecddb4a57b83b150aec42f4"} Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 
06:49:03.167235 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.167267 4947 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.167328 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.167246 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7762a0aed7586e18fe4e69e884ee4fe9348761b7845c774ff90bfb8095c16819"} Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.167919 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f9214e3b77d8eec03a770d8944f9822ead347e4b6be2cc8cf298344dfd3be15f"} Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.169373 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.169416 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.169441 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.169457 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.169464 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.169480 4947 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.389517 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:03 crc kubenswrapper[4947]: I1203 06:49:03.826635 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.178079 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"367ed1505ea1722051c7d96a913df20f97b4bce170b1c4b35372c8af6a24f9bd"} Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.178167 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bf76d19cc298f94b47e42dc99e4d48bb8955348586e3c39ed1fa7be9f02d3c6d"} Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.178125 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.178196 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.179998 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.180019 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.180055 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.180076 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.180060 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.180261 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.347821 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.883893 4947 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.884185 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.942385 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.942604 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.944192 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 
06:49:04.944241 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.944259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:04 crc kubenswrapper[4947]: I1203 06:49:04.952783 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.138884 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.180725 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.180832 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.180844 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.182433 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.182516 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.182543 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.182594 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.182623 4947 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.182641 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.182433 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.182696 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:05 crc kubenswrapper[4947]: I1203 06:49:05.182712 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:06 crc kubenswrapper[4947]: I1203 06:49:06.183433 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:06 crc kubenswrapper[4947]: I1203 06:49:06.183433 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:06 crc kubenswrapper[4947]: I1203 06:49:06.185321 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:06 crc kubenswrapper[4947]: I1203 06:49:06.185367 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:06 crc kubenswrapper[4947]: I1203 06:49:06.185384 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:06 crc kubenswrapper[4947]: I1203 06:49:06.185445 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:06 crc kubenswrapper[4947]: I1203 06:49:06.185481 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:06 crc kubenswrapper[4947]: I1203 06:49:06.185537 4947 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.061042 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.061703 4947 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.061762 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.063347 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.063410 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.063439 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.069231 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:09 crc kubenswrapper[4947]: E1203 06:49:09.171456 4947 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.191124 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.192646 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.192705 4947 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.192725 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:09 crc kubenswrapper[4947]: I1203 06:49:09.820274 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:10 crc kubenswrapper[4947]: I1203 06:49:10.193892 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:10 crc kubenswrapper[4947]: I1203 06:49:10.199155 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:10 crc kubenswrapper[4947]: I1203 06:49:10.199215 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:10 crc kubenswrapper[4947]: I1203 06:49:10.199236 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:11 crc kubenswrapper[4947]: I1203 06:49:11.001818 4947 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 03 06:49:11 crc kubenswrapper[4947]: W1203 06:49:11.889853 4947 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 06:49:11 crc kubenswrapper[4947]: I1203 06:49:11.890000 4947 trace.go:236] Trace[1958484832]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:49:01.887) (total time: 10002ms): Dec 03 06:49:11 crc kubenswrapper[4947]: Trace[1958484832]: ---"Objects listed" error:Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (06:49:11.889) Dec 03 06:49:11 crc kubenswrapper[4947]: Trace[1958484832]: [10.002232832s] [10.002232832s] END Dec 03 06:49:11 crc kubenswrapper[4947]: E1203 06:49:11.890039 4947 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 06:49:12 crc kubenswrapper[4947]: E1203 06:49:12.016883 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 03 06:49:12 crc kubenswrapper[4947]: E1203 06:49:12.275883 4947 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.484777 4947 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.484865 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed 
with statuscode: 403" Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.490971 4947 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.491358 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.890749 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.890992 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.892242 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.892278 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.892289 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:12 crc kubenswrapper[4947]: I1203 06:49:12.922778 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 06:49:13 crc kubenswrapper[4947]: I1203 06:49:13.201809 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:13 crc 
kubenswrapper[4947]: I1203 06:49:13.203007 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:13 crc kubenswrapper[4947]: I1203 06:49:13.203086 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:13 crc kubenswrapper[4947]: I1203 06:49:13.203112 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:13 crc kubenswrapper[4947]: I1203 06:49:13.222252 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 06:49:14 crc kubenswrapper[4947]: I1203 06:49:14.204946 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:14 crc kubenswrapper[4947]: I1203 06:49:14.206301 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:14 crc kubenswrapper[4947]: I1203 06:49:14.206369 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:14 crc kubenswrapper[4947]: I1203 06:49:14.206396 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:14 crc kubenswrapper[4947]: I1203 06:49:14.883992 4947 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 06:49:14 crc kubenswrapper[4947]: I1203 06:49:14.884098 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.147685 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.147961 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.149851 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.149924 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.149942 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.154041 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.207969 4947 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.208046 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.209609 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.209708 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.209728 4947 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.477091 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.479300 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.479387 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.479410 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:15 crc kubenswrapper[4947]: I1203 06:49:15.479450 4947 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:49:15 crc kubenswrapper[4947]: E1203 06:49:15.497258 4947 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 03 06:49:16 crc kubenswrapper[4947]: I1203 06:49:16.018217 4947 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.496535 4947 trace.go:236] Trace[921215074]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:49:02.946) (total time: 14549ms): Dec 03 06:49:17 crc kubenswrapper[4947]: Trace[921215074]: ---"Objects listed" error: 14549ms (06:49:17.496) Dec 03 06:49:17 crc kubenswrapper[4947]: Trace[921215074]: [14.549608294s] [14.549608294s] END Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.496597 4947 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 
06:49:17.497086 4947 trace.go:236] Trace[2132889738]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:49:02.565) (total time: 14931ms): Dec 03 06:49:17 crc kubenswrapper[4947]: Trace[2132889738]: ---"Objects listed" error: 14931ms (06:49:17.496) Dec 03 06:49:17 crc kubenswrapper[4947]: Trace[2132889738]: [14.931471719s] [14.931471719s] END Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.497110 4947 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.498156 4947 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.500760 4947 trace.go:236] Trace[1520172481]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:49:03.538) (total time: 13962ms): Dec 03 06:49:17 crc kubenswrapper[4947]: Trace[1520172481]: ---"Objects listed" error: 13961ms (06:49:17.500) Dec 03 06:49:17 crc kubenswrapper[4947]: Trace[1520172481]: [13.962011552s] [13.962011552s] END Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.500809 4947 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.542247 4947 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:48758->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.542335 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read 
tcp 192.168.126.11:48758->192.168.126.11:17697: read: connection reset by peer" Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.542766 4947 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.542801 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.543191 4947 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38972->192.168.126.11:17697: read: connection reset by peer" start-of-body= Dec 03 06:49:17 crc kubenswrapper[4947]: I1203 06:49:17.543551 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38972->192.168.126.11:17697: read: connection reset by peer" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.002916 4947 apiserver.go:52] "Watching apiserver" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.008416 4947 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.009419 4947 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.010004 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.010110 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.010203 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.010430 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.010511 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.010630 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.010684 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.010710 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.010990 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.015017 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.015057 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.015622 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.015738 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.015733 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.015936 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.017562 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.018522 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.023292 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.051242 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.066891 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.082597 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.099796 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.100224 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.100310 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.100368 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.100397 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.100432 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.101851 4947 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.110526 4947 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.114612 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.114666 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.114692 4947 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.114784 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:18.614754572 +0000 UTC m=+19.875709008 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.115093 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.118524 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.123041 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.123072 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.123087 4947 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.123159 4947 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:18.623142208 +0000 UTC m=+19.884096654 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.129371 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.130883 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.140207 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.152545 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.201387 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.201455 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.201539 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.202065 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.201565 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.202192 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.202220 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.202574 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.202462 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.202477 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.202672 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.202932 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203070 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203161 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203083 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.203235 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:49:18.703210716 +0000 UTC m=+19.964165152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203093 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203141 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203277 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203317 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203331 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203344 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203375 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203400 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203424 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203451 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203475 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" 
(UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203520 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203546 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203571 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203594 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203605 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203617 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203649 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203654 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203683 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203711 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203711 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203736 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203761 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203772 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203786 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203813 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203848 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203862 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203891 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203914 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203937 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.203985 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204035 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" 
(UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204138 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204171 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204200 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204227 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204248 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204271 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204295 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204317 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204341 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204370 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204392 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204414 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204439 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204460 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204483 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204523 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 
06:49:18.204553 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204601 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204625 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204651 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204678 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204706 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204731 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204755 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204777 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204798 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204822 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204849 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204872 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204894 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204919 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204944 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204965 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204987 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205020 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205063 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205087 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205112 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205143 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205168 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205191 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205216 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205240 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205271 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: 
\"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205296 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205319 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205340 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205362 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205385 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205410 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205432 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205454 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205510 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205535 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205560 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 
06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205584 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205606 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205879 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205909 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205934 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205957 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205984 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206009 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206032 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206107 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206132 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 
06:49:18.203955 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206156 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204047 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204108 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206180 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204174 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204302 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204395 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204438 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204468 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204623 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204624 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204674 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204814 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.204847 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205010 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205009 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205067 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205191 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205469 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205597 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205763 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205773 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.205938 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206029 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206134 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206142 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206341 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206354 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.207774 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.208254 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.208346 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.208387 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.208668 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.208851 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.208885 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.208920 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209205 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209303 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209315 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.206186 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209406 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209476 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209463 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209574 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209613 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209651 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209681 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209702 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209725 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209752 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209775 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209798 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209821 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209844 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209854 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209869 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209895 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209917 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209941 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209964 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 
06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.209986 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210007 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210031 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210053 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210075 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210099 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210122 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210144 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210166 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210190 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210213 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 06:49:18 crc 
kubenswrapper[4947]: I1203 06:49:18.210236 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210262 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210282 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210302 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210323 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210344 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210366 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210389 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210412 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210434 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210456 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210481 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210522 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210545 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210565 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210567 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210648 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210658 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210795 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.211130 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.211358 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.211372 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.211434 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.211736 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.211854 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.211906 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.211995 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.212106 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.212293 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.212870 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.212906 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.212960 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.212993 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.213217 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.213458 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.213733 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.213800 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.213986 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.214078 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.214123 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.214211 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.214463 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.214818 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.214956 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.214991 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.215254 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.215258 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.215897 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.215986 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.216135 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.216401 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.216685 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.216736 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.216834 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.217313 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.217526 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.217617 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.217752 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.217820 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.217843 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.218152 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.218187 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.218432 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.218768 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.219286 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.219530 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.219543 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.208888 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.219980 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.219934 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.220194 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.220213 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.220306 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.220627 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.220690 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.220711 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.220703 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.220947 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.221133 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.221154 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.221142 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.221574 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.221957 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222001 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222003 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222067 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222159 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222593 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222826 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222873 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.210584 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222945 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222986 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 
03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223025 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223061 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223100 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223134 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223218 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223256 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223384 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223421 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223453 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223559 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223598 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 06:49:18 crc 
kubenswrapper[4947]: I1203 06:49:18.223634 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223671 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223704 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223738 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223772 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224153 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224209 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224248 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224282 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224317 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224350 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 
06:49:18.224387 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224421 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224459 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224530 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224567 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224602 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224639 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224679 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224743 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224777 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224813 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 
06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224851 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224896 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224930 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224964 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225000 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225035 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225070 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225108 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225145 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225179 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225219 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225256 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225292 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225393 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225552 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225602 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225641 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225810 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225856 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225893 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225955 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226019 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226060 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226221 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226247 4947 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226269 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226290 4947 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 
crc kubenswrapper[4947]: I1203 06:49:18.226311 4947 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226331 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226351 4947 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226371 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226391 4947 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226413 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226434 4947 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226453 4947 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226473 4947 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226519 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226540 4947 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226560 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226580 4947 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226602 4947 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226624 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 
06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226644 4947 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226664 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226692 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226711 4947 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228030 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228065 4947 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228086 4947 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228106 4947 reconciler_common.go:293] 
"Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228126 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228147 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228167 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228189 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228210 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228230 4947 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228256 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228276 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228297 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228318 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228340 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228359 4947 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228380 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228402 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228422 4947 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228442 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228464 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228484 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228532 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228554 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228574 4947 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228594 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228614 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228656 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228676 4947 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228696 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228720 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228739 4947 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" 
DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228758 4947 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228779 4947 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228802 4947 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228823 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228843 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228862 4947 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228884 4947 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228904 4947 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228923 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228944 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228963 4947 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228984 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229004 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229024 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229054 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229074 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229097 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229116 4947 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229134 4947 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229155 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229175 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229195 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229216 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229235 4947 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231297 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231329 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231356 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231378 4947 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231643 4947 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231671 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231693 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231714 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231734 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231753 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231773 4947 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231793 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231814 4947 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231833 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231854 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231874 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231893 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231913 4947 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231972 4947 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231992 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232013 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232102 4947 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232125 4947 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232144 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232163 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232183 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232251 4947 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232286 4947 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232295 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232307 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232329 4947 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232349 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232369 4947 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232389 4947 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232409 4947 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232430 4947 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232449 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232469 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232518 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232539 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232578 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233045 4947 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233078 4947 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233101 4947 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233164 4947 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233185 4947 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233204 4947 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233224 4947 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223047 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" 
(OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223228 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233267 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223241 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.222943 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223807 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223829 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223863 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.223904 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224426 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224650 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.224938 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225022 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.225077 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.226220 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.228797 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229160 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229180 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229722 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229808 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.229836 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.230073 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.230057 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.230118 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233364 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233479 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.230378 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.230362 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.230689 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.230724 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.230943 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.230986 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231064 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231096 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.231260 4947 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231457 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.231893 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.232922 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.233470 4947 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233728 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233759 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.233945 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.234166 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.234312 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:18.734270163 +0000 UTC m=+19.995224779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.234332 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.234541 4947 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.234564 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:18.734405697 +0000 UTC m=+19.995360333 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.234653 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.234691 4947 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.234752 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") 
on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.234804 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.234930 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.234968 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.235951 4947 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336" exitCode=255 Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.236052 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336"} Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.236174 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") 
pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.236635 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.240121 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.249285 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.249364 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.249810 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.249853 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.249824 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.250373 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.250760 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.251170 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.278909 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.281162 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.284641 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.285320 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.285427 4947 scope.go:117] "RemoveContainer" containerID="6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.285718 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.285967 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.287836 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.288002 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.288070 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.291811 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.294516 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.296797 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.300402 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.302435 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.305676 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.307564 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.310456 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.313755 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.317184 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.325841 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.325876 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.334925 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335309 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335377 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335436 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335473 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335506 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335520 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335570 4947 reconciler_common.go:293] "Volume detached for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335584 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335595 4947 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335813 4947 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335824 4947 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335834 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335534 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335843 4947 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335905 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335924 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335940 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335957 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335757 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.335971 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336141 4947 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336164 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336182 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336199 4947 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336215 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336231 4947 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336246 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336260 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336273 4947 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336287 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336300 4947 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336315 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336328 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336340 4947 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336354 4947 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node 
\"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336368 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336381 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336394 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336406 4947 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336418 4947 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336429 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336442 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336456 4947 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336468 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336480 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336510 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336524 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336536 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336548 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336563 4947 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336575 4947 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336587 4947 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336599 4947 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336611 4947 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336623 4947 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336635 4947 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336648 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336661 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336673 4947 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336685 4947 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336697 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336713 4947 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336724 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336740 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336751 4947 
reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.336764 4947 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:49:18 crc kubenswrapper[4947]: W1203 06:49:18.343157 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-bf40dc768e432d1f1ffe1be562cec2c9c1d91259104d46e3397742fd88feca74 WatchSource:0}: Error finding container bf40dc768e432d1f1ffe1be562cec2c9c1d91259104d46e3397742fd88feca74: Status 404 returned error can't find the container with id bf40dc768e432d1f1ffe1be562cec2c9c1d91259104d46e3397742fd88feca74 Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.345897 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.347464 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:49:18 crc kubenswrapper[4947]: W1203 06:49:18.349637 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9fbbf6622091fe5fae3c5dd6e0f933f647670f482ff46ee7c9d2c932d0b6e2ca WatchSource:0}: Error finding container 9fbbf6622091fe5fae3c5dd6e0f933f647670f482ff46ee7c9d2c932d0b6e2ca: Status 404 returned error can't find the container with id 9fbbf6622091fe5fae3c5dd6e0f933f647670f482ff46ee7c9d2c932d0b6e2ca Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.640800 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:18 crc 
kubenswrapper[4947]: I1203 06:49:18.641244 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.641072 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.641364 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.641409 4947 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.641485 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.641521 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:19.641473361 +0000 UTC m=+20.902427957 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.641538 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.641556 4947 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.641646 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:19.641626065 +0000 UTC m=+20.902580481 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.742743 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.742885 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.742984 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:49:19.742944906 +0000 UTC m=+21.003899362 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.743028 4947 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.743130 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:19.74310223 +0000 UTC m=+21.004056686 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: I1203 06:49:18.743187 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.743259 4947 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:18 crc kubenswrapper[4947]: E1203 06:49:18.743316 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:19.743303306 +0000 UTC m=+21.004257762 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.082691 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.083295 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.088084 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.088798 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.090457 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.091302 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.092664 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.093320 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.094152 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.095414 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.096261 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.097531 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.100875 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.103554 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.104332 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.105317 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.106526 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.107078 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.108207 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.108675 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.109266 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.110323 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.110832 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.112013 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.112484 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.113533 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.114222 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.114928 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.115986 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.116462 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.117464 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.117976 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.118942 4947 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.119045 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.120541 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.120876 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.121382 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.122364 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.123962 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.124652 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.125746 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" 
path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.126527 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.127581 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.128169 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.129149 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.129784 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.130755 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.131214 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.132124 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.132648 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.134002 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.134472 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.135341 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.135858 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.136758 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.137397 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.138048 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" 
path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.141921 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.162884 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.183875 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.212592 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.233885 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.240215 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.242014 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d"} Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.242258 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.243267 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3642c57388847acd288938c0d6486673676e364a4a668b72544c775ac17b42f1"} Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.244836 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda"} Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.244866 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce"} Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.244877 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9fbbf6622091fe5fae3c5dd6e0f933f647670f482ff46ee7c9d2c932d0b6e2ca"} Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.246201 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04"} Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.246230 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"bf40dc768e432d1f1ffe1be562cec2c9c1d91259104d46e3397742fd88feca74"} Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.250707 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.269692 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.290241 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.312314 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.331746 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.355023 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.374896 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.393606 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.654612 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.655741 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.655786 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered 
Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.655800 4947 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.655857 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:21.655840236 +0000 UTC m=+22.916794652 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.654656 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.658129 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.658197 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.658217 4947 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.658302 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:21.658278612 +0000 UTC m=+22.919233038 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.757418 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.757677 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 06:49:21.757648011 +0000 UTC m=+23.018602477 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.757737 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:19 crc kubenswrapper[4947]: I1203 06:49:19.757878 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.757905 4947 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.758073 4947 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.758549 4947 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:21.758030481 +0000 UTC m=+23.018985097 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:19 crc kubenswrapper[4947]: E1203 06:49:19.758596 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:21.758584726 +0000 UTC m=+23.019539152 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:20 crc kubenswrapper[4947]: I1203 06:49:20.082660 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:20 crc kubenswrapper[4947]: I1203 06:49:20.082685 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:20 crc kubenswrapper[4947]: E1203 06:49:20.082867 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:20 crc kubenswrapper[4947]: E1203 06:49:20.082983 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.082626 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.082775 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.674054 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.674142 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.674263 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.674291 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.674307 4947 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.674376 4947 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:25.674355012 +0000 UTC m=+26.935309438 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.674409 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.674457 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.674472 4947 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.674562 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:25.674541287 +0000 UTC m=+26.935495713 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.774947 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.775039 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.775063 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:49:25.775040616 +0000 UTC m=+27.035995042 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.775089 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.775147 4947 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.775196 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:25.7751825 +0000 UTC m=+27.036136936 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.775205 4947 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.775228 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:25.775222571 +0000 UTC m=+27.036176997 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.889473 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.895298 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.898364 4947 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.899317 4947 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-dns/node-resolver-g8hh7"] Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.900684 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g8hh7" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.900851 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.900887 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.900897 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.900972 4947 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.904092 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.904380 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-97tnc"] Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.904619 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.904819 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.904916 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.906732 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.906836 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.906967 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.908411 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.908737 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.911201 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.911466 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.917383 4947 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.917840 4947 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.919612 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.919656 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.919667 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.919686 4947 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.919699 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:21Z","lastTransitionTime":"2025-12-03T06:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.939077 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.953038 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.956133 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.961957 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.962076 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.962170 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.962254 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.962315 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:21Z","lastTransitionTime":"2025-12-03T06:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.970286 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.975640 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.977813 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf2kv\" (UniqueName: \"kubernetes.io/projected/a0cbf0ab-15d8-4ec1-b889-c31a347923f4-kube-api-access-nf2kv\") pod \"node-resolver-g8hh7\" (UID: \"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\") " pod="openshift-dns/node-resolver-g8hh7" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.977863 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-var-lib-cni-multus\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.977890 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-etc-kubernetes\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: 
I1203 06:49:21.977914 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-system-cni-dir\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.977933 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-var-lib-cni-bin\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.977965 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a0cbf0ab-15d8-4ec1-b889-c31a347923f4-hosts-file\") pod \"node-resolver-g8hh7\" (UID: \"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\") " pod="openshift-dns/node-resolver-g8hh7" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.977985 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-socket-dir-parent\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978006 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-run-netns\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978028 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-hostroot\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978052 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-os-release\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978073 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-conf-dir\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978097 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c90ac94-365a-4c82-b72a-41129d95a39e-cni-binary-copy\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978119 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-daemon-config\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978141 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-run-multus-certs\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978182 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-var-lib-kubelet\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978200 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-cnibin\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978231 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-cni-dir\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978249 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-run-k8s-cni-cncf-io\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.978268 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsb2l\" (UniqueName: 
\"kubernetes.io/projected/1c90ac94-365a-4c82-b72a-41129d95a39e-kube-api-access-lsb2l\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.979715 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.979748 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.979757 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.979786 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.979796 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:21Z","lastTransitionTime":"2025-12-03T06:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.983550 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:21 crc kubenswrapper[4947]: E1203 06:49:21.994410 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.997735 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.997759 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.997767 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.997782 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:21 crc kubenswrapper[4947]: I1203 06:49:21.997792 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:21Z","lastTransitionTime":"2025-12-03T06:49:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.001657 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:21Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.016382 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.017598 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.020206 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.020236 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.020247 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.020262 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.020272 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.035345 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.036376 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.036515 4947 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.038315 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.038354 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.038362 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.038376 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.038386 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.049144 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b5
5a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.068248 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079256 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf2kv\" (UniqueName: \"kubernetes.io/projected/a0cbf0ab-15d8-4ec1-b889-c31a347923f4-kube-api-access-nf2kv\") pod \"node-resolver-g8hh7\" (UID: \"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\") " pod="openshift-dns/node-resolver-g8hh7" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079307 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-var-lib-cni-multus\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079329 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-etc-kubernetes\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079358 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-system-cni-dir\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079377 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-var-lib-cni-bin\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 
06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079398 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-hostroot\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079426 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a0cbf0ab-15d8-4ec1-b889-c31a347923f4-hosts-file\") pod \"node-resolver-g8hh7\" (UID: \"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\") " pod="openshift-dns/node-resolver-g8hh7" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079444 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-socket-dir-parent\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079463 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-run-netns\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079486 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-os-release\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079527 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/1c90ac94-365a-4c82-b72a-41129d95a39e-cni-binary-copy\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079549 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-conf-dir\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079574 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-daemon-config\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079578 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-hostroot\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079592 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-run-multus-certs\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079633 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-run-multus-certs\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " 
pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079655 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-var-lib-kubelet\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079675 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-cni-dir\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079690 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-cnibin\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079699 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a0cbf0ab-15d8-4ec1-b889-c31a347923f4-hosts-file\") pod \"node-resolver-g8hh7\" (UID: \"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\") " pod="openshift-dns/node-resolver-g8hh7" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079718 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-run-k8s-cni-cncf-io\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079736 4947 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-lsb2l\" (UniqueName: \"kubernetes.io/projected/1c90ac94-365a-4c82-b72a-41129d95a39e-kube-api-access-lsb2l\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079740 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-socket-dir-parent\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079804 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-os-release\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079838 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-var-lib-cni-multus\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079861 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-etc-kubernetes\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079889 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-system-cni-dir\") pod \"multus-97tnc\" (UID: 
\"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079910 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-var-lib-cni-bin\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.079930 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-var-lib-kubelet\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.080070 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-cni-dir\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.080092 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-conf-dir\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.080119 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-cnibin\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.080134 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-run-k8s-cni-cncf-io\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.080273 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1c90ac94-365a-4c82-b72a-41129d95a39e-host-run-netns\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.080442 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1c90ac94-365a-4c82-b72a-41129d95a39e-cni-binary-copy\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.080646 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1c90ac94-365a-4c82-b72a-41129d95a39e-multus-daemon-config\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.082266 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.082301 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.082395 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.082503 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.084604 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.097669 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf2kv\" (UniqueName: \"kubernetes.io/projected/a0cbf0ab-15d8-4ec1-b889-c31a347923f4-kube-api-access-nf2kv\") pod \"node-resolver-g8hh7\" (UID: \"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\") " pod="openshift-dns/node-resolver-g8hh7" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.100534 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lsb2l\" (UniqueName: \"kubernetes.io/projected/1c90ac94-365a-4c82-b72a-41129d95a39e-kube-api-access-lsb2l\") pod \"multus-97tnc\" (UID: \"1c90ac94-365a-4c82-b72a-41129d95a39e\") " pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.100669 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.118766 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.139990 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.141206 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.141258 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.141271 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.141290 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.141301 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.154756 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.167850 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.182030 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.228229 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g8hh7" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.243107 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-97tnc" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.251793 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.252204 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.252216 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.252235 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.252249 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.263945 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g8hh7" event={"ID":"a0cbf0ab-15d8-4ec1-b889-c31a347923f4","Type":"ContainerStarted","Data":"7cd5b87baf7798219bda5265cc43861eb833dad244d8bc7c7ad4f70697922b1a"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.277721 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.322251 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c
026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.328157 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qv8tj"] Dec 03 
06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.328650 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mc5l9"] Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.329148 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.329517 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.336398 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.336614 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.336687 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.336408 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.336814 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.336927 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.336948 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.336952 4947 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pt9n6"] Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.337969 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: W1203 06:49:22.345107 4947 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 03 06:49:22 crc kubenswrapper[4947]: W1203 06:49:22.345150 4947 reflector.go:561] object-"openshift-ovn-kubernetes"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.345177 4947 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:49:22 crc kubenswrapper[4947]: W1203 06:49:22.345199 4947 reflector.go:561] object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.345205 4947 
reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:49:22 crc kubenswrapper[4947]: W1203 06:49:22.345242 4947 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-config": failed to list *v1.ConfigMap: configmaps "ovnkube-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.345244 4947 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.345260 4947 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:49:22 crc kubenswrapper[4947]: W1203 06:49:22.345317 4947 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovnkube-script-lib": failed to list *v1.ConfigMap: configmaps "ovnkube-script-lib" is forbidden: User 
"system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 03 06:49:22 crc kubenswrapper[4947]: W1203 06:49:22.345327 4947 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl": failed to list *v1.Secret: secrets "ovn-kubernetes-node-dockercfg-pwtwl" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.345333 4947 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"ovnkube-script-lib\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:49:22 crc kubenswrapper[4947]: E1203 06:49:22.345342 4947 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pwtwl\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-node-dockercfg-pwtwl\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:49:22 crc kubenswrapper[4947]: W1203 06:49:22.345357 4947 reflector.go:561] object-"openshift-ovn-kubernetes"/"env-overrides": failed to list *v1.ConfigMap: configmaps "env-overrides" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 03 06:49:22 crc 
kubenswrapper[4947]: E1203 06:49:22.345375 4947 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"env-overrides\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"env-overrides\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.358940 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.370435 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.370467 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.370476 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.370513 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.370524 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.387012 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-etc-openvswitch\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.387194 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-node-log\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.387331 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqnqr\" (UniqueName: \"kubernetes.io/projected/19542618-7a4e-44bc-9297-9931dcc41eea-kube-api-access-fqnqr\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.387430 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.387570 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-cnibin\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: 
\"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.387690 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-config\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.387803 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-kubelet\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.387945 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-env-overrides\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.388523 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/12b4de70-50e1-40ba-836d-41c953077b50-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.388658 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-slash\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.388730 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-netns\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.388796 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6s4r\" (UniqueName: \"kubernetes.io/projected/8384efba-0256-458d-8aab-627ad76e643e-kube-api-access-h6s4r\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.388969 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-os-release\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389084 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389172 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-script-lib\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389253 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-systemd\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389321 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-netd\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389391 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12b4de70-50e1-40ba-836d-41c953077b50-cni-binary-copy\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389467 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8384efba-0256-458d-8aab-627ad76e643e-proxy-tls\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 
06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389564 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-openvswitch\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389631 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-bin\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389705 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19542618-7a4e-44bc-9297-9931dcc41eea-ovn-node-metrics-cert\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389790 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgflr\" (UniqueName: \"kubernetes.io/projected/12b4de70-50e1-40ba-836d-41c953077b50-kube-api-access-lgflr\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389866 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-systemd-units\") pod \"ovnkube-node-pt9n6\" (UID: 
\"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.389929 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-var-lib-openvswitch\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.390005 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-log-socket\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.390097 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-ovn\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.390169 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-ovn-kubernetes\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.390249 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-system-cni-dir\") pod 
\"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.390378 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8384efba-0256-458d-8aab-627ad76e643e-rootfs\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.390463 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8384efba-0256-458d-8aab-627ad76e643e-mcd-auth-proxy-config\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.406104 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.445155 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.469980 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.471963 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 
06:49:22.471998 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.472005 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.472018 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.472027 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.486242 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.491975 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-etc-openvswitch\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492014 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-node-log\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492032 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqnqr\" (UniqueName: \"kubernetes.io/projected/19542618-7a4e-44bc-9297-9931dcc41eea-kube-api-access-fqnqr\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492050 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492074 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-cnibin\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492089 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-config\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492115 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-kubelet\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492132 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-env-overrides\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492148 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/12b4de70-50e1-40ba-836d-41c953077b50-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492147 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-etc-openvswitch\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492163 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-slash\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 
06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492178 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-netns\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492197 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6s4r\" (UniqueName: \"kubernetes.io/projected/8384efba-0256-458d-8aab-627ad76e643e-kube-api-access-h6s4r\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492195 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-node-log\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492250 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-slash\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492252 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc 
kubenswrapper[4947]: I1203 06:49:22.492272 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-netns\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492225 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492332 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-kubelet\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492427 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-script-lib\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492453 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-os-release\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: 
I1203 06:49:22.492472 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492487 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-systemd\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492513 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-cnibin\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492527 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-netd\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492583 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12b4de70-50e1-40ba-836d-41c953077b50-cni-binary-copy\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492621 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-systemd\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492675 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-os-release\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492728 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-openvswitch\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492753 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-bin\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492754 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-netd\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492774 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19542618-7a4e-44bc-9297-9931dcc41eea-ovn-node-metrics-cert\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492829 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-openvswitch\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492844 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgflr\" (UniqueName: \"kubernetes.io/projected/12b4de70-50e1-40ba-836d-41c953077b50-kube-api-access-lgflr\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492852 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-bin\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492869 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8384efba-0256-458d-8aab-627ad76e643e-proxy-tls\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492897 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-systemd-units\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492913 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-var-lib-openvswitch\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492929 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-log-socket\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492948 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-ovn\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492956 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-systemd-units\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492971 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492989 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-system-cni-dir\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.492991 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-log-socket\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493016 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-var-lib-openvswitch\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493033 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8384efba-0256-458d-8aab-627ad76e643e-rootfs\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493042 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-ovn-kubernetes\") pod \"ovnkube-node-pt9n6\" (UID: 
\"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493050 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8384efba-0256-458d-8aab-627ad76e643e-mcd-auth-proxy-config\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493065 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-ovn\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493145 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8384efba-0256-458d-8aab-627ad76e643e-rootfs\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493169 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12b4de70-50e1-40ba-836d-41c953077b50-system-cni-dir\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493531 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/12b4de70-50e1-40ba-836d-41c953077b50-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: 
\"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493573 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12b4de70-50e1-40ba-836d-41c953077b50-cni-binary-copy\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.493840 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8384efba-0256-458d-8aab-627ad76e643e-mcd-auth-proxy-config\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.499252 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8384efba-0256-458d-8aab-627ad76e643e-proxy-tls\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.511941 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.514146 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgflr\" (UniqueName: \"kubernetes.io/projected/12b4de70-50e1-40ba-836d-41c953077b50-kube-api-access-lgflr\") pod \"multus-additional-cni-plugins-mc5l9\" (UID: \"12b4de70-50e1-40ba-836d-41c953077b50\") " pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.514475 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6s4r\" (UniqueName: \"kubernetes.io/projected/8384efba-0256-458d-8aab-627ad76e643e-kube-api-access-h6s4r\") pod \"machine-config-daemon-qv8tj\" (UID: \"8384efba-0256-458d-8aab-627ad76e643e\") " pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc 
kubenswrapper[4947]: I1203 06:49:22.524636 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.541005 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.556623 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.568024 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.574525 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.574566 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.574575 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 
06:49:22.574590 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.574600 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.580605 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.596012 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.612011 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.621064 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.636063 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.659246 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.659471 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: W1203 06:49:22.674407 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b4de70_50e1_40ba_836d_41c953077b50.slice/crio-e497c873e81eaaf4aaf30963c226da14808971c5ecee956a12c5f0c00d03f42a WatchSource:0}: Error finding container e497c873e81eaaf4aaf30963c226da14808971c5ecee956a12c5f0c00d03f42a: Status 404 returned error can't find the container with id e497c873e81eaaf4aaf30963c226da14808971c5ecee956a12c5f0c00d03f42a Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.675050 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.676753 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.676827 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.676883 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.676905 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.676933 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.676952 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.690967 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: W1203 06:49:22.691424 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8384efba_0256_458d_8aab_627ad76e643e.slice/crio-08a21930562ccf465ce777e053fb790dc6ee4dee7f51221d6ec029ec486fd92b WatchSource:0}: Error finding container 08a21930562ccf465ce777e053fb790dc6ee4dee7f51221d6ec029ec486fd92b: Status 404 returned error can't find the container with id 08a21930562ccf465ce777e053fb790dc6ee4dee7f51221d6ec029ec486fd92b Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.713904 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.731520 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.764070 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.780466 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.780523 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.780535 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.780550 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.780561 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.788148 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:22Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.882320 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.882360 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.882372 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.882392 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.882406 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.985598 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.985631 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.985643 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.985658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:22 crc kubenswrapper[4947]: I1203 06:49:22.985667 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:22Z","lastTransitionTime":"2025-12-03T06:49:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.082465 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:23 crc kubenswrapper[4947]: E1203 06:49:23.082620 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.087888 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.087928 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.087939 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.087955 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.087966 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:23Z","lastTransitionTime":"2025-12-03T06:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.156029 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.191805 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.191853 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.191866 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.191885 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.191900 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:23Z","lastTransitionTime":"2025-12-03T06:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.256255 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.274665 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19542618-7a4e-44bc-9297-9931dcc41eea-ovn-node-metrics-cert\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.274690 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" event={"ID":"12b4de70-50e1-40ba-836d-41c953077b50","Type":"ContainerDied","Data":"f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.274661 4947 generic.go:334] "Generic (PLEG): container finished" podID="12b4de70-50e1-40ba-836d-41c953077b50" containerID="f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812" exitCode=0 Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.274828 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" event={"ID":"12b4de70-50e1-40ba-836d-41c953077b50","Type":"ContainerStarted","Data":"e497c873e81eaaf4aaf30963c226da14808971c5ecee956a12c5f0c00d03f42a"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.279134 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g8hh7" event={"ID":"a0cbf0ab-15d8-4ec1-b889-c31a347923f4","Type":"ContainerStarted","Data":"84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.285384 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tnc" 
event={"ID":"1c90ac94-365a-4c82-b72a-41129d95a39e","Type":"ContainerStarted","Data":"b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.285433 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tnc" event={"ID":"1c90ac94-365a-4c82-b72a-41129d95a39e","Type":"ContainerStarted","Data":"ccd022d1585108ddad61071ab7de10f38556e3edfba89239b31d02140e9f7427"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.291526 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.291575 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.291589 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"08a21930562ccf465ce777e053fb790dc6ee4dee7f51221d6ec029ec486fd92b"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.292083 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.295593 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.295636 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.295649 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.295868 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.295880 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:23Z","lastTransitionTime":"2025-12-03T06:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.309168 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.322434 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.341976 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.362308 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.374076 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.386913 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.399535 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.399589 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.399601 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.399626 4947 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.399639 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:23Z","lastTransitionTime":"2025-12-03T06:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.400477 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.413722 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.425970 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.426684 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.435477 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-config\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.439265 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.449129 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.459940 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.472350 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.486678 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: E1203 06:49:23.492355 4947 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/env-overrides: failed to sync configmap cache: timed out waiting for the condition Dec 03 06:49:23 crc kubenswrapper[4947]: E1203 06:49:23.492524 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-env-overrides podName:19542618-7a4e-44bc-9297-9931dcc41eea nodeName:}" failed. No retries permitted until 2025-12-03 06:49:23.992507716 +0000 UTC m=+25.253462142 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "env-overrides" (UniqueName: "kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-env-overrides") pod "ovnkube-node-pt9n6" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea") : failed to sync configmap cache: timed out waiting for the condition Dec 03 06:49:23 crc kubenswrapper[4947]: E1203 06:49:23.492754 4947 configmap.go:193] Couldn't get configMap openshift-ovn-kubernetes/ovnkube-script-lib: failed to sync configmap cache: timed out waiting for the condition Dec 03 06:49:23 crc kubenswrapper[4947]: E1203 06:49:23.492873 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-script-lib podName:19542618-7a4e-44bc-9297-9931dcc41eea nodeName:}" failed. No retries permitted until 2025-12-03 06:49:23.992862545 +0000 UTC m=+25.253816971 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovnkube-script-lib" (UniqueName: "kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-script-lib") pod "ovnkube-node-pt9n6" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea") : failed to sync configmap cache: timed out waiting for the condition Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.502946 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.503427 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.503546 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.503620 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.503691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.503747 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:23Z","lastTransitionTime":"2025-12-03T06:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: E1203 06:49:23.514549 4947 projected.go:288] Couldn't get configMap openshift-ovn-kubernetes/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.517218 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.531873 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.546359 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.565232 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.578579 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.597713 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.608391 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.608595 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.608774 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.608924 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.609013 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:23Z","lastTransitionTime":"2025-12-03T06:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.611052 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z 
is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.625289 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.640029 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.663057 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.683891 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.711156 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.711205 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.711217 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.711236 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.711248 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:23Z","lastTransitionTime":"2025-12-03T06:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.813465 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.813531 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.813544 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.813570 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.813583 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:23Z","lastTransitionTime":"2025-12-03T06:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.878163 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 06:49:23 crc kubenswrapper[4947]: E1203 06:49:23.885523 4947 projected.go:194] Error preparing data for projected volume kube-api-access-fqnqr for pod openshift-ovn-kubernetes/ovnkube-node-pt9n6: failed to sync configmap cache: timed out waiting for the condition Dec 03 06:49:23 crc kubenswrapper[4947]: E1203 06:49:23.885622 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19542618-7a4e-44bc-9297-9931dcc41eea-kube-api-access-fqnqr podName:19542618-7a4e-44bc-9297-9931dcc41eea nodeName:}" failed. No retries permitted until 2025-12-03 06:49:24.385599772 +0000 UTC m=+25.646554208 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-fqnqr" (UniqueName: "kubernetes.io/projected/19542618-7a4e-44bc-9297-9931dcc41eea-kube-api-access-fqnqr") pod "ovnkube-node-pt9n6" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea") : failed to sync configmap cache: timed out waiting for the condition Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.915767 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.915801 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.915811 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.915824 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.915833 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:23Z","lastTransitionTime":"2025-12-03T06:49:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.923360 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 06:49:23 crc kubenswrapper[4947]: I1203 06:49:23.935637 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.011633 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-env-overrides\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.011687 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-script-lib\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.012335 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-script-lib\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.012965 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-env-overrides\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.018429 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.018670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.018809 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.018941 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.019066 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.083043 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:24 crc kubenswrapper[4947]: E1203 06:49:24.083163 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.083059 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:24 crc kubenswrapper[4947]: E1203 06:49:24.083886 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.124305 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.124379 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.124419 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.124453 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.124480 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.227177 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.227213 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.227223 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.227237 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.227246 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.296897 4947 generic.go:334] "Generic (PLEG): container finished" podID="12b4de70-50e1-40ba-836d-41c953077b50" containerID="32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07" exitCode=0 Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.296956 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" event={"ID":"12b4de70-50e1-40ba-836d-41c953077b50","Type":"ContainerDied","Data":"32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.311268 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.326123 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.331032 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.331080 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.331098 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.331121 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.331137 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.338667 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.350858 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.367687 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.381376 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.397271 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.415735 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqnqr\" (UniqueName: \"kubernetes.io/projected/19542618-7a4e-44bc-9297-9931dcc41eea-kube-api-access-fqnqr\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.418363 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.421641 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqnqr\" (UniqueName: \"kubernetes.io/projected/19542618-7a4e-44bc-9297-9931dcc41eea-kube-api-access-fqnqr\") pod \"ovnkube-node-pt9n6\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.435437 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.435474 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.435511 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.435534 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.435547 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.436467 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.472407 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.485034 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.489641 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.496648 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: W1203 06:49:24.502077 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19542618_7a4e_44bc_9297_9931dcc41eea.slice/crio-dabca9ad190807bc920c390111b266c4671349adb1266225a129315e8ef3bc5c WatchSource:0}: Error finding container dabca9ad190807bc920c390111b266c4671349adb1266225a129315e8ef3bc5c: Status 404 returned error can't find the container with id dabca9ad190807bc920c390111b266c4671349adb1266225a129315e8ef3bc5c Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.518221 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.538237 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.538269 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.538278 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.538292 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.538302 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.605453 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-w866n"] Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.606140 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.608667 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.610124 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.610313 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.610356 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.620702 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.633334 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.640932 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.640992 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.641002 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.641019 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.641029 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.647039 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.659640 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.672157 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.684888 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.701445 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.718431 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/150157a8-5a65-4142-9088-0ab46998fc9d-serviceca\") pod \"node-ca-w866n\" (UID: \"150157a8-5a65-4142-9088-0ab46998fc9d\") " pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.718480 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2mdn\" (UniqueName: \"kubernetes.io/projected/150157a8-5a65-4142-9088-0ab46998fc9d-kube-api-access-d2mdn\") pod \"node-ca-w866n\" (UID: \"150157a8-5a65-4142-9088-0ab46998fc9d\") " pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.718557 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/150157a8-5a65-4142-9088-0ab46998fc9d-host\") pod \"node-ca-w866n\" (UID: \"150157a8-5a65-4142-9088-0ab46998fc9d\") " pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.724550 4947 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.735121 4947 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.744295 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.744339 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.744348 4947 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.744369 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.744382 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.751342 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.772263 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.787124 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.800241 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.815134 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.819521 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/150157a8-5a65-4142-9088-0ab46998fc9d-serviceca\") pod \"node-ca-w866n\" (UID: \"150157a8-5a65-4142-9088-0ab46998fc9d\") " pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.819571 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d2mdn\" (UniqueName: \"kubernetes.io/projected/150157a8-5a65-4142-9088-0ab46998fc9d-kube-api-access-d2mdn\") pod \"node-ca-w866n\" (UID: \"150157a8-5a65-4142-9088-0ab46998fc9d\") " pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.819607 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/150157a8-5a65-4142-9088-0ab46998fc9d-host\") pod \"node-ca-w866n\" (UID: \"150157a8-5a65-4142-9088-0ab46998fc9d\") " pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.819672 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/150157a8-5a65-4142-9088-0ab46998fc9d-host\") pod \"node-ca-w866n\" (UID: \"150157a8-5a65-4142-9088-0ab46998fc9d\") " pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.820833 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/150157a8-5a65-4142-9088-0ab46998fc9d-serviceca\") pod \"node-ca-w866n\" (UID: \"150157a8-5a65-4142-9088-0ab46998fc9d\") " pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.834948 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2mdn\" (UniqueName: \"kubernetes.io/projected/150157a8-5a65-4142-9088-0ab46998fc9d-kube-api-access-d2mdn\") pod \"node-ca-w866n\" (UID: \"150157a8-5a65-4142-9088-0ab46998fc9d\") " pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.846240 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.846275 4947 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.846285 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.846301 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.846311 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.949302 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.949606 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.949707 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.949795 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:24 crc kubenswrapper[4947]: I1203 06:49:24.949879 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:24Z","lastTransitionTime":"2025-12-03T06:49:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.004128 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w866n" Dec 03 06:49:25 crc kubenswrapper[4947]: W1203 06:49:25.018697 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod150157a8_5a65_4142_9088_0ab46998fc9d.slice/crio-22a539e2d55fac169d7fd2783e21ca80372fbe629cbec819b4e861d58694b6fa WatchSource:0}: Error finding container 22a539e2d55fac169d7fd2783e21ca80372fbe629cbec819b4e861d58694b6fa: Status 404 returned error can't find the container with id 22a539e2d55fac169d7fd2783e21ca80372fbe629cbec819b4e861d58694b6fa Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.052659 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.052686 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.052693 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.052707 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.052716 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.082644 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.082790 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.155258 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.155667 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.155680 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.155715 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.155728 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.258439 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.258483 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.258512 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.258528 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.258539 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.303132 4947 generic.go:334] "Generic (PLEG): container finished" podID="12b4de70-50e1-40ba-836d-41c953077b50" containerID="b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b" exitCode=0 Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.303198 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" event={"ID":"12b4de70-50e1-40ba-836d-41c953077b50","Type":"ContainerDied","Data":"b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.305278 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w866n" event={"ID":"150157a8-5a65-4142-9088-0ab46998fc9d","Type":"ContainerStarted","Data":"e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.305351 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w866n" event={"ID":"150157a8-5a65-4142-9088-0ab46998fc9d","Type":"ContainerStarted","Data":"22a539e2d55fac169d7fd2783e21ca80372fbe629cbec819b4e861d58694b6fa"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.307002 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab" exitCode=0 Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.307047 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.307076 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" 
event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"dabca9ad190807bc920c390111b266c4671349adb1266225a129315e8ef3bc5c"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.320801 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.340974 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.358882 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.362593 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.362644 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.362657 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 
06:49:25.362678 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.362690 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.370979 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.385769 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.400330 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.413244 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.424593 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.439964 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.457333 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.465781 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.465806 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.465814 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.465828 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.465838 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.473461 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.488844 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.507953 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.519366 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.533448 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.544001 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.561295 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.568132 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.568159 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.568168 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.568181 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.568191 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.570626 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.580460 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.593978 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.608348 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.623635 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.638854 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa
5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.653228 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.671312 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.671608 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.671649 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.671657 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.671671 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.671683 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.689293 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 
06:49:25.704417 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.719198 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.727709 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.727805 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.727992 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.728025 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.728045 4947 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.728122 4947 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:33.728100533 +0000 UTC m=+34.989054999 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.728368 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.728513 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.728612 4947 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.728751 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:33.72873131 +0000 UTC m=+34.989685746 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.774312 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.774346 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.774355 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.774369 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.774380 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.828390 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.828568 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.828655 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:49:33.828621393 +0000 UTC m=+35.089575839 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.828719 4947 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.828758 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.828791 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:33.828770867 +0000 UTC m=+35.089725303 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.828895 4947 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:25 crc kubenswrapper[4947]: E1203 06:49:25.828965 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:33.828949791 +0000 UTC m=+35.089904337 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.877079 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.877349 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.877460 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.877631 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc 
kubenswrapper[4947]: I1203 06:49:25.877723 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.981320 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.981388 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.981406 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.981431 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:25 crc kubenswrapper[4947]: I1203 06:49:25.981449 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:25Z","lastTransitionTime":"2025-12-03T06:49:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.082604 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.082697 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:26 crc kubenswrapper[4947]: E1203 06:49:26.083989 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:26 crc kubenswrapper[4947]: E1203 06:49:26.084305 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.085616 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.085682 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.085702 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.085728 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.085748 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:26Z","lastTransitionTime":"2025-12-03T06:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.189129 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.189193 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.189212 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.189238 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.189255 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:26Z","lastTransitionTime":"2025-12-03T06:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.292198 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.292282 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.292296 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.292312 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.292322 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:26Z","lastTransitionTime":"2025-12-03T06:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.323984 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.324054 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.324066 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.324078 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.324088 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.324098 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} Dec 03 06:49:26 crc kubenswrapper[4947]: 
I1203 06:49:26.328096 4947 generic.go:334] "Generic (PLEG): container finished" podID="12b4de70-50e1-40ba-836d-41c953077b50" containerID="5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3" exitCode=0 Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.328134 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" event={"ID":"12b4de70-50e1-40ba-836d-41c953077b50","Type":"ContainerDied","Data":"5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.346291 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.364984 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.387479 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.394882 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.394942 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.394958 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.395001 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.395018 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:26Z","lastTransitionTime":"2025-12-03T06:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.405549 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.420742 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.435592 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.451825 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.467687 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.481787 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.496715 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.499650 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.499764 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.499782 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.499844 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.499866 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:26Z","lastTransitionTime":"2025-12-03T06:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.523098 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.540034 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.558365 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.582427 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.603287 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.603337 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.603350 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.603374 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.603391 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:26Z","lastTransitionTime":"2025-12-03T06:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.706423 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.706522 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.706538 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.706559 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.706575 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:26Z","lastTransitionTime":"2025-12-03T06:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.809845 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.810397 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.810411 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.810436 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.810451 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:26Z","lastTransitionTime":"2025-12-03T06:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.913383 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.913416 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.913424 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.913437 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:26 crc kubenswrapper[4947]: I1203 06:49:26.913446 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:26Z","lastTransitionTime":"2025-12-03T06:49:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.017299 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.017348 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.017361 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.017379 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.017394 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.082833 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:27 crc kubenswrapper[4947]: E1203 06:49:27.083313 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.120648 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.120717 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.120734 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.120760 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.120777 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.224213 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.224273 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.224283 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.224303 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.224313 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.327583 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.327644 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.327662 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.327686 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.327703 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.334976 4947 generic.go:334] "Generic (PLEG): container finished" podID="12b4de70-50e1-40ba-836d-41c953077b50" containerID="8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec" exitCode=0 Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.335028 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" event={"ID":"12b4de70-50e1-40ba-836d-41c953077b50","Type":"ContainerDied","Data":"8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.360755 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.386231 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.401539 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.417935 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.431695 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.432928 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.432993 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.433012 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.433081 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.433100 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.450179 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.464541 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.479917 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.491918 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.505661 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.520032 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.532069 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.535152 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.535220 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.535239 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc 
kubenswrapper[4947]: I1203 06:49:27.535266 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.535285 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.550400 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.566010 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.638662 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.638696 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.638705 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.638718 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.638727 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.741592 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.741633 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.741644 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.741659 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.741671 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.844573 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.844606 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.844614 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.844627 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.844636 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.947619 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.947687 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.947705 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.947731 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:27 crc kubenswrapper[4947]: I1203 06:49:27.947749 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:27Z","lastTransitionTime":"2025-12-03T06:49:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.051005 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.051040 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.051048 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.051062 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.051071 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.082523 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.082552 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:28 crc kubenswrapper[4947]: E1203 06:49:28.082779 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:28 crc kubenswrapper[4947]: E1203 06:49:28.082797 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.153413 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.153462 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.153473 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.153518 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.153531 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.257743 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.257809 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.257828 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.257863 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.257886 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.344664 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.346993 4947 generic.go:334] "Generic (PLEG): container finished" podID="12b4de70-50e1-40ba-836d-41c953077b50" containerID="be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744" exitCode=0 Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.347029 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" event={"ID":"12b4de70-50e1-40ba-836d-41c953077b50","Type":"ContainerDied","Data":"be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.361078 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.361123 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.361137 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.361154 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.361166 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.374722 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.394954 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.417555 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.434862 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.451307 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.470014 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.472801 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.472847 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.472858 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.472877 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.472888 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.486939 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.504286 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.519225 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.536387 4947 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.557781 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.571666 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.575600 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.575655 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.575670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.575691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.575705 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.591722 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.604307 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.679381 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.679437 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc 
kubenswrapper[4947]: I1203 06:49:28.679451 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.679474 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.679508 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.782794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.782881 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.782905 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.782937 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.782958 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.887050 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.887445 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.887528 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.887598 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.887669 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.990998 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.991034 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.991045 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.991067 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:28 crc kubenswrapper[4947]: I1203 06:49:28.991081 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:28Z","lastTransitionTime":"2025-12-03T06:49:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.083068 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:29 crc kubenswrapper[4947]: E1203 06:49:29.083255 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.094816 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.094884 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.094908 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.094934 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.094951 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:29Z","lastTransitionTime":"2025-12-03T06:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.101828 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.123051 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.141362 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.163680 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.191195 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.197104 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.197244 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.197324 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.197437 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.197555 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:29Z","lastTransitionTime":"2025-12-03T06:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.210617 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.231923 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.245658 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.264855 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.278448 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.290454 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.300581 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.300606 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.300615 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.300632 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.300644 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:29Z","lastTransitionTime":"2025-12-03T06:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.303077 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.315433 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.330411 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.354106 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" event={"ID":"12b4de70-50e1-40ba-836d-41c953077b50","Type":"ContainerStarted","Data":"ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94"} Dec 
03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.371731 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d4
4ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed082
87faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.389771 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.402524 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.402575 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.402589 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.402606 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.402620 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:29Z","lastTransitionTime":"2025-12-03T06:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.408938 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.428222 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.441003 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.454281 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.472208 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.486136 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.497437 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.505721 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.505760 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.505772 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.505794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.505807 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:29Z","lastTransitionTime":"2025-12-03T06:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.534343 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.590281 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.607344 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.608209 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.608250 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.608259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 
06:49:29.608275 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.608287 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:29Z","lastTransitionTime":"2025-12-03T06:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.619914 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.634716 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae
3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.710815 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.710861 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.710874 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.710892 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.710905 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:29Z","lastTransitionTime":"2025-12-03T06:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.814304 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.814363 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.814379 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.814404 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.814422 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:29Z","lastTransitionTime":"2025-12-03T06:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.917146 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.917687 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.917849 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.918022 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:29 crc kubenswrapper[4947]: I1203 06:49:29.918216 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:29Z","lastTransitionTime":"2025-12-03T06:49:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.021192 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.021463 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.021579 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.021665 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.021746 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.082511 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:30 crc kubenswrapper[4947]: E1203 06:49:30.082921 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.082520 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:30 crc kubenswrapper[4947]: E1203 06:49:30.083184 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.124841 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.124887 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.124907 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.124930 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.124947 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.227224 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.227290 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.227310 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.227338 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.227356 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.330717 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.331480 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.331627 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.331709 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.331782 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.435216 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.435262 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.435275 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.435294 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.435306 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.539314 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.539394 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.539413 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.539444 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.539463 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.643630 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.643698 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.643718 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.643749 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.643770 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.746929 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.747016 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.747040 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.747070 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.747090 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.851251 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.851315 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.851338 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.851373 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.851399 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.955093 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.955145 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.955170 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.955200 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:30 crc kubenswrapper[4947]: I1203 06:49:30.955221 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:30Z","lastTransitionTime":"2025-12-03T06:49:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.059136 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.059203 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.059219 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.059249 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.059266 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.082743 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:31 crc kubenswrapper[4947]: E1203 06:49:31.082959 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.161997 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.162039 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.162055 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.162078 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.162094 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.264783 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.264825 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.264834 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.264850 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.264861 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.366835 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.367460 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.368261 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.368310 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.368321 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.368342 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.368353 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.386628 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.401339 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.407611 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.428092 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa
5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.446846 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.471234 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.471294 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.471313 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.471339 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.471384 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.477860 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.500656 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.525554 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.552627 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.568421 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.573822 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.573857 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.573866 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.573881 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.573892 4947 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.584801 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.603082 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.616519 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.631377 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.651268 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.670733 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.676676 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.676714 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.676730 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.676751 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.676767 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.694116 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.715541 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.733049 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.752664 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.772868 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.778958 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.779010 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.779020 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.779042 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.779053 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.792465 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.808483 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.823996 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.841193 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.868404 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.881902 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.882039 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.882065 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.882100 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.882122 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.889642 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.905648 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.918919 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.985719 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.985791 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.985804 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.985830 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:31 crc kubenswrapper[4947]: I1203 06:49:31.985845 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:31Z","lastTransitionTime":"2025-12-03T06:49:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.061350 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.061475 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.061523 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.061555 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.061576 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: E1203 06:49:32.076919 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.082123 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.082187 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:32 crc kubenswrapper[4947]: E1203 06:49:32.082326 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.082353 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.082383 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.082400 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.082424 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.082439 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: E1203 06:49:32.082611 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:32 crc kubenswrapper[4947]: E1203 06:49:32.098352 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.103956 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.104010 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.104022 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.104043 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.104055 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} [... identical "Error updating node status, will retry" / node-event cycle repeated verbatim at 06:49:32.118220 and 06:49:32.137703 (same expired-certificate webhook failure, same status-patch payload, same NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady events; only timestamps differ), elided ...] Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.142979 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: E1203 06:49:32.157266 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: E1203 06:49:32.157452 4947 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.159756 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.159848 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.159883 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.159936 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.159961 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.263586 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.263644 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.263660 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.263692 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.263708 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.366120 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.366165 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.366176 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.366199 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.366219 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.370048 4947 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.370552 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.410461 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.424816 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.450613 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.468557 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.475914 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.475996 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.476006 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.476039 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.476055 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.491758 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.508257 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.527259 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.542074 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.555804 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.566986 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.578668 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.579165 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.579215 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.579226 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.579253 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.579265 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.592655 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.603609 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.618232 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.634356 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:32Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.682138 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc 
kubenswrapper[4947]: I1203 06:49:32.682199 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.682219 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.682246 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.682267 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.785059 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.785111 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.785129 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.785158 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.785175 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.888298 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.888349 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.888367 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.888392 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.888435 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.992205 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.992296 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.992322 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.992354 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:32 crc kubenswrapper[4947]: I1203 06:49:32.992378 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:32Z","lastTransitionTime":"2025-12-03T06:49:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.082129 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.082315 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.095121 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.095188 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.095206 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.095228 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.095244 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:33Z","lastTransitionTime":"2025-12-03T06:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.617644 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.617675 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.617685 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.617701 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.617714 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:33Z","lastTransitionTime":"2025-12-03T06:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.618005 4947 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.625613 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.642079 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugin
s\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2dae
d8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedA
t\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\
",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.653223 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.667376 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.686171 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.703445 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.720209 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.720245 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.720255 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.720269 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.720279 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:33Z","lastTransitionTime":"2025-12-03T06:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.721075 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.735756 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.747719 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.765766 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.781644 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bd
a4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.794611 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.806905 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa
5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.820735 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.820848 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.821991 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.822079 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.822112 4947 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.822253 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:49.822220785 +0000 UTC m=+51.083175211 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.825851 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.826266 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.826295 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.826319 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.826342 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.826356 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:33Z","lastTransitionTime":"2025-12-03T06:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.830516 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.830568 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.830592 4947 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.830688 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:49.830660663 +0000 UTC m=+51.091615089 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.848555 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.921835 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.922222 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:49:49.922165081 +0000 UTC m=+51.183119547 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.922329 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.922455 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.922522 4947 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.922619 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:49.922596892 +0000 UTC m=+51.183551328 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.922694 4947 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:33 crc kubenswrapper[4947]: E1203 06:49:33.923094 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:49:49.923049974 +0000 UTC m=+51.184004600 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.934081 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.934127 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.934156 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:33 crc kubenswrapper[4947]: I1203 06:49:33.934181 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:33 crc 
kubenswrapper[4947]: I1203 06:49:33.934194 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:33Z","lastTransitionTime":"2025-12-03T06:49:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.039665 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.039716 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.039731 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.039754 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.039771 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.082760 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.082817 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:34 crc kubenswrapper[4947]: E1203 06:49:34.082975 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:34 crc kubenswrapper[4947]: E1203 06:49:34.083166 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.143207 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.143264 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.143280 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.143304 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.143320 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.247098 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.247163 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.247181 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.247212 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.247227 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.350899 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.350967 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.350981 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.351004 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.351038 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.455033 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.455113 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.455125 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.455152 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.455170 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.559178 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.559243 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.559261 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.559288 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.559306 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.624031 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/0.log" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.629146 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77" exitCode=1 Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.629218 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.630371 4947 scope.go:117] "RemoveContainer" containerID="7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.657030 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.666460 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.666543 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.666561 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 
06:49:34.666584 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.666600 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.683657 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.709162 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae
3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.730876 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.752749 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.770820 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.770993 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.771043 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.771065 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.771444 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.771468 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.790233 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.814683 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.832903 4947 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.852743 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.871669 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.875614 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc 
kubenswrapper[4947]: I1203 06:49:34.875657 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.875676 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.875701 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.875719 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.886257 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.903894 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.944877 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:34Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 06:49:34.099232 6240 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:34.099302 6240 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI1203 06:49:34.099324 6240 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:49:34.099386 6240 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:49:34.099409 6240 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:49:34.099407 6240 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:34.099424 6240 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:34.099432 6240 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:34.099445 6240 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:49:34.099448 6240 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:34.099458 6240 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:49:34.099534 6240 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:49:34.099576 6240 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:34.099677 6240 factory.go:656] Stopping watch factory\\\\nI1203 06:49:34.099707 6240 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.982050 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.982126 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.982142 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.982164 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:34 crc kubenswrapper[4947]: I1203 06:49:34.982185 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:34Z","lastTransitionTime":"2025-12-03T06:49:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.082982 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:35 crc kubenswrapper[4947]: E1203 06:49:35.083126 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.084812 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.084857 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.084869 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.084885 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.084897 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:35Z","lastTransitionTime":"2025-12-03T06:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.188277 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.188333 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.188346 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.188370 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.188389 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:35Z","lastTransitionTime":"2025-12-03T06:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.290534 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.290568 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.290579 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.290592 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.290600 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:35Z","lastTransitionTime":"2025-12-03T06:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.393106 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.393156 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.393167 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.393185 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.393198 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:35Z","lastTransitionTime":"2025-12-03T06:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.496516 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.496571 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.496587 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.496613 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.496631 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:35Z","lastTransitionTime":"2025-12-03T06:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.599690 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.599730 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.599741 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.599759 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.599772 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:35Z","lastTransitionTime":"2025-12-03T06:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.635697 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/0.log" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.639116 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.639296 4947 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.656669 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.678301 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv"] Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.678863 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.680758 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.680852 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.685343 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:34Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 06:49:34.099232 6240 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:34.099302 6240 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:34.099324 6240 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 06:49:34.099386 6240 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:49:34.099409 6240 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:49:34.099407 6240 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:34.099424 6240 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:34.099432 6240 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:34.099445 6240 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:49:34.099448 6240 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:34.099458 6240 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:49:34.099534 6240 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:49:34.099576 6240 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:34.099677 6240 factory.go:656] Stopping watch factory\\\\nI1203 06:49:34.099707 6240 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.702348 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.702781 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.702822 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:35 crc 
kubenswrapper[4947]: I1203 06:49:35.702832 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.702852 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.703213 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:35Z","lastTransitionTime":"2025-12-03T06:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.717321 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.733178 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.749964 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.764061 4947 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.779086 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.797866 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.805841 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.805871 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.805883 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.805901 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.805915 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:35Z","lastTransitionTime":"2025-12-03T06:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.819859 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.840421 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.845253 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80d0c717-13a1-4c19-9af2-0dd9805ad606-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.845394 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgzvb\" (UniqueName: \"kubernetes.io/projected/80d0c717-13a1-4c19-9af2-0dd9805ad606-kube-api-access-zgzvb\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.845536 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80d0c717-13a1-4c19-9af2-0dd9805ad606-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.845642 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80d0c717-13a1-4c19-9af2-0dd9805ad606-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.859325 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 
06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.881326 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.897738 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.910284 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.910324 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.910335 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.910354 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.910366 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:35Z","lastTransitionTime":"2025-12-03T06:49:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.920440 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.940126 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bd
a4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.947064 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80d0c717-13a1-4c19-9af2-0dd9805ad606-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.947311 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgzvb\" (UniqueName: \"kubernetes.io/projected/80d0c717-13a1-4c19-9af2-0dd9805ad606-kube-api-access-zgzvb\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.947391 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80d0c717-13a1-4c19-9af2-0dd9805ad606-ovnkube-config\") pod 
\"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.947457 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80d0c717-13a1-4c19-9af2-0dd9805ad606-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.948356 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/80d0c717-13a1-4c19-9af2-0dd9805ad606-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.949016 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/80d0c717-13a1-4c19-9af2-0dd9805ad606-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.957911 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/80d0c717-13a1-4c19-9af2-0dd9805ad606-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.958807 4947 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.971805 4947 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zgzvb\" (UniqueName: \"kubernetes.io/projected/80d0c717-13a1-4c19-9af2-0dd9805ad606-kube-api-access-zgzvb\") pod \"ovnkube-control-plane-749d76644c-sx6lv\" (UID: \"80d0c717-13a1-4c19-9af2-0dd9805ad606\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.975233 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:35 crc kubenswrapper[4947]: I1203 06:49:35.992605 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.000028 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:34Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 06:49:34.099232 6240 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:34.099302 6240 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:34.099324 6240 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 06:49:34.099386 6240 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:49:34.099409 6240 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:49:34.099407 6240 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:34.099424 6240 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:34.099432 6240 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:34.099445 6240 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:49:34.099448 6240 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:34.099458 6240 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:49:34.099534 6240 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:49:34.099576 6240 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:34.099677 6240 factory.go:656] Stopping watch factory\\\\nI1203 06:49:34.099707 6240 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: W1203 06:49:36.013814 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80d0c717_13a1_4c19_9af2_0dd9805ad606.slice/crio-67591d7d40679635fc0aadb922a7d41a7102ddcd06667fc2df6c431e502e6e92 WatchSource:0}: Error finding container 67591d7d40679635fc0aadb922a7d41a7102ddcd06667fc2df6c431e502e6e92: Status 404 returned error can't find the container with id 67591d7d40679635fc0aadb922a7d41a7102ddcd06667fc2df6c431e502e6e92 Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.015549 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.015592 4947 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.015607 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.015630 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.015646 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.016337 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.038051 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.055964 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.084901 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:36 crc kubenswrapper[4947]: E1203 06:49:36.085108 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.084918 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:36 crc kubenswrapper[4947]: E1203 06:49:36.085666 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.088263 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.122923 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.160142 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.160207 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.160221 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.160250 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.160263 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.169140 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.183385 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.195941 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.210150 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.224832 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.262179 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.262218 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.262227 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.262243 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.262253 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.365275 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.365318 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.365328 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.365346 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.365356 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.474793 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.474853 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.474864 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.474886 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.474902 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.491213 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-cz948"] Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.492018 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:36 crc kubenswrapper[4947]: E1203 06:49:36.492120 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.511436 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 
06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.534282 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.547956 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa
5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.561971 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.577940 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.577988 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.578014 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.578038 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.578051 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.582791 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:34Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 06:49:34.099232 6240 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:34.099302 6240 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:34.099324 6240 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 06:49:34.099386 6240 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:49:34.099409 6240 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:49:34.099407 6240 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:34.099424 6240 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:34.099432 6240 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:34.099445 6240 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:49:34.099448 6240 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:34.099458 6240 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:49:34.099534 6240 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:49:34.099576 6240 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:34.099677 6240 factory.go:656] Stopping watch factory\\\\nI1203 06:49:34.099707 6240 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.604681 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae
3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.617545 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.629715 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.644586 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.644769 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/1.log" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.645721 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/0.log" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.648554 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45" exitCode=1 Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.648643 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.648695 4947 scope.go:117] "RemoveContainer" containerID="7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.649467 4947 scope.go:117] "RemoveContainer" containerID="2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45" Dec 03 06:49:36 crc kubenswrapper[4947]: E1203 06:49:36.649761 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 
06:49:36.650218 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" event={"ID":"80d0c717-13a1-4c19-9af2-0dd9805ad606","Type":"ContainerStarted","Data":"709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.650238 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" event={"ID":"80d0c717-13a1-4c19-9af2-0dd9805ad606","Type":"ContainerStarted","Data":"12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.650247 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" event={"ID":"80d0c717-13a1-4c19-9af2-0dd9805ad606","Type":"ContainerStarted","Data":"67591d7d40679635fc0aadb922a7d41a7102ddcd06667fc2df6c431e502e6e92"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.658821 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.664936 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtjjg\" (UniqueName: \"kubernetes.io/projected/8dd41826-cef5-42f7-8730-abc792b9337c-kube-api-access-jtjjg\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.664975 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.674610 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.680401 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.680436 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.680445 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc 
kubenswrapper[4947]: I1203 06:49:36.680462 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.680472 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.685781 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 
06:49:36.696014 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.706520 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a
69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.717187 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.727252 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.739134 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.754185 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.765722 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.765977 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtjjg\" (UniqueName: \"kubernetes.io/projected/8dd41826-cef5-42f7-8730-abc792b9337c-kube-api-access-jtjjg\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.766027 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:36 crc kubenswrapper[4947]: E1203 06:49:36.766339 4947 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:36 crc kubenswrapper[4947]: E1203 
06:49:36.766407 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs podName:8dd41826-cef5-42f7-8730-abc792b9337c nodeName:}" failed. No retries permitted until 2025-12-03 06:49:37.266386845 +0000 UTC m=+38.527341282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs") pod "network-metrics-daemon-cz948" (UID: "8dd41826-cef5-42f7-8730-abc792b9337c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.783537 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.783825 4947 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.783870 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.783892 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.783920 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.783943 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.784011 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtjjg\" (UniqueName: \"kubernetes.io/projected/8dd41826-cef5-42f7-8730-abc792b9337c-kube-api-access-jtjjg\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.793481 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.802170 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.812153 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc 
kubenswrapper[4947]: I1203 06:49:36.824595 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.837481 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.848064 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa
5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.859291 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.877106 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d643cdbcde2fd4fe77ce85149f0ed0747f4c0a592cf80853ca80a2a9f62fa77\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:34Z\\\",\\\"message\\\":\\\"e (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 06:49:34.099232 6240 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:34.099302 6240 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:34.099324 6240 handler.go:190] Sending *v1.Namespace event handler 5 for 
removal\\\\nI1203 06:49:34.099386 6240 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:49:34.099409 6240 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:49:34.099407 6240 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:34.099424 6240 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:34.099432 6240 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:34.099445 6240 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:49:34.099448 6240 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:34.099458 6240 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:49:34.099534 6240 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:49:34.099576 6240 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:34.099677 6240 factory.go:656] Stopping watch factory\\\\nI1203 06:49:34.099707 6240 ovnkube.go:599] Stopped ovnkube\\\\nI1203 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637175 6377 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637627 6377 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:35.637704 6377 handler.go:190] Sending 
*v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:35.637733 6377 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:35.637742 6377 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:49:35.637783 6377 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:35.637788 6377 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:35.637799 6377 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:35.637819 6377 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:35.637866 6377 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:49:35.637883 6377 factory.go:656] Stopping watch factory\\\\nI1203 06:49:35.637903 6377 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:49:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\
"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.886644 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 
06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.886680 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.886688 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.886701 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.886711 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.889792 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.900581 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.912148 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.921301 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.989299 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.989341 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.989349 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.989385 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:36 crc kubenswrapper[4947]: I1203 06:49:36.989395 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:36Z","lastTransitionTime":"2025-12-03T06:49:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.082217 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:37 crc kubenswrapper[4947]: E1203 06:49:37.082386 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.091386 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.091417 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.091431 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.091467 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.091481 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:37Z","lastTransitionTime":"2025-12-03T06:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.194340 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.194708 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.194784 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.194825 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.194853 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:37Z","lastTransitionTime":"2025-12-03T06:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.274639 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:37 crc kubenswrapper[4947]: E1203 06:49:37.274775 4947 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:37 crc kubenswrapper[4947]: E1203 06:49:37.274826 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs podName:8dd41826-cef5-42f7-8730-abc792b9337c nodeName:}" failed. No retries permitted until 2025-12-03 06:49:38.274811741 +0000 UTC m=+39.535766167 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs") pod "network-metrics-daemon-cz948" (UID: "8dd41826-cef5-42f7-8730-abc792b9337c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.296784 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.296855 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.296869 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.296891 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.296905 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:37Z","lastTransitionTime":"2025-12-03T06:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.399327 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.399374 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.399385 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.399402 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.399414 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:37Z","lastTransitionTime":"2025-12-03T06:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.502021 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.502086 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.502105 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.502129 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.502147 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:37Z","lastTransitionTime":"2025-12-03T06:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.604897 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.604970 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.604994 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.605019 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.605035 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:37Z","lastTransitionTime":"2025-12-03T06:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.661955 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/1.log" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.666697 4947 scope.go:117] "RemoveContainer" containerID="2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45" Dec 03 06:49:37 crc kubenswrapper[4947]: E1203 06:49:37.666947 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.686666 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.704761 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bd
a4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.707645 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.707689 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.707706 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.707730 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.707749 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:37Z","lastTransitionTime":"2025-12-03T06:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.724062 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z 
is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.740962 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.769084 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637175 6377 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637627 6377 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:35.637704 6377 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:35.637733 6377 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:35.637742 6377 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:49:35.637783 6377 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:35.637788 6377 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:35.637799 6377 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:35.637819 6377 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:35.637866 6377 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:49:35.637883 6377 factory.go:656] Stopping watch factory\\\\nI1203 06:49:35.637903 6377 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:49:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.782129 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.797892 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.811057 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.811090 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.811101 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.811121 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.811135 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:37Z","lastTransitionTime":"2025-12-03T06:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.811438 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.826731 4947 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.839505 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.855141 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.868693 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.887669 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.899683 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.913707 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.913920 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.913965 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.913979 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.914001 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.914017 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:37Z","lastTransitionTime":"2025-12-03T06:49:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:37 crc kubenswrapper[4947]: I1203 06:49:37.930419 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:38 crc 
kubenswrapper[4947]: I1203 06:49:38.017282 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.017352 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.017372 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.017400 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.017418 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.082842 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.082842 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:38 crc kubenswrapper[4947]: E1203 06:49:38.083022 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:38 crc kubenswrapper[4947]: E1203 06:49:38.083087 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.082867 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:38 crc kubenswrapper[4947]: E1203 06:49:38.083197 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.120615 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.120675 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.120685 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.120699 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.120709 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.223647 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.223713 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.223733 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.223760 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.223783 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.286433 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:38 crc kubenswrapper[4947]: E1203 06:49:38.286687 4947 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:38 crc kubenswrapper[4947]: E1203 06:49:38.286771 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs podName:8dd41826-cef5-42f7-8730-abc792b9337c nodeName:}" failed. No retries permitted until 2025-12-03 06:49:40.286748092 +0000 UTC m=+41.547702548 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs") pod "network-metrics-daemon-cz948" (UID: "8dd41826-cef5-42f7-8730-abc792b9337c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.326574 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.326640 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.326658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.326686 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.326707 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.430035 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.430092 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.430110 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.430132 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.430149 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.532792 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.532862 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.532880 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.532909 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.532926 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.636342 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.636399 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.636417 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.636440 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.636457 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.740108 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.740141 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.740152 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.740174 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.740186 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.842820 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.842885 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.842901 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.842924 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.842941 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.945868 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.945937 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.945962 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.945996 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:38 crc kubenswrapper[4947]: I1203 06:49:38.946022 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:38Z","lastTransitionTime":"2025-12-03T06:49:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.048798 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.048866 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.048888 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.048917 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.048939 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.082891 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:39 crc kubenswrapper[4947]: E1203 06:49:39.083124 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.108000 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.141147 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637175 6377 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637627 6377 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:35.637704 6377 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:35.637733 6377 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:35.637742 6377 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:49:35.637783 6377 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:35.637788 6377 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:35.637799 6377 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:35.637819 6377 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:35.637866 6377 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:49:35.637883 6377 factory.go:656] Stopping watch factory\\\\nI1203 06:49:35.637903 6377 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:49:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.152726 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.152783 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.152804 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.152829 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.152847 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.159283 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.185009 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.211016 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.230785 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.250056 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube
-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.256472 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.256537 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.256554 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.256577 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.256591 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.267612 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.282257 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.299956 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.317438 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.335389 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.348734 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.359054 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.359094 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.359104 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.359124 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.359137 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.367118 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.383974 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.399884 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa
5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.462617 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.462682 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.462704 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.462731 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.462748 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.566267 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.566323 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.566340 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.566361 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.566378 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.669327 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.669408 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.669427 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.669448 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.669466 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.772716 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.772756 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.772771 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.772825 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.772844 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.876121 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.876184 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.876207 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.876230 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.876247 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.979207 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.979246 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.979258 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.979276 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:39 crc kubenswrapper[4947]: I1203 06:49:39.979287 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:39Z","lastTransitionTime":"2025-12-03T06:49:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.082314 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.082451 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.082808 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:40 crc kubenswrapper[4947]: E1203 06:49:40.083103 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:40 crc kubenswrapper[4947]: E1203 06:49:40.083201 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:40 crc kubenswrapper[4947]: E1203 06:49:40.083298 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.083606 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.083662 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.083684 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.083711 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.083733 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:40Z","lastTransitionTime":"2025-12-03T06:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.187057 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.187395 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.187954 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.188424 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.188868 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:40Z","lastTransitionTime":"2025-12-03T06:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.293045 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.293100 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.293111 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.293133 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.293148 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:40Z","lastTransitionTime":"2025-12-03T06:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.308813 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:40 crc kubenswrapper[4947]: E1203 06:49:40.309091 4947 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:40 crc kubenswrapper[4947]: E1203 06:49:40.309275 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs podName:8dd41826-cef5-42f7-8730-abc792b9337c nodeName:}" failed. No retries permitted until 2025-12-03 06:49:44.309235525 +0000 UTC m=+45.570190141 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs") pod "network-metrics-daemon-cz948" (UID: "8dd41826-cef5-42f7-8730-abc792b9337c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.396579 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.396820 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.396912 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.397031 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.397116 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:40Z","lastTransitionTime":"2025-12-03T06:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.500781 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.500843 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.500860 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.500884 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.500902 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:40Z","lastTransitionTime":"2025-12-03T06:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.604415 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.604477 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.604505 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.604528 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.604552 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:40Z","lastTransitionTime":"2025-12-03T06:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.707690 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.707776 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.707790 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.707815 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.707835 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:40Z","lastTransitionTime":"2025-12-03T06:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.811559 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.812561 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.812763 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.812962 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.813148 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:40Z","lastTransitionTime":"2025-12-03T06:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.916325 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.916382 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.916403 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.916430 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:40 crc kubenswrapper[4947]: I1203 06:49:40.916451 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:40Z","lastTransitionTime":"2025-12-03T06:49:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.019041 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.019136 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.019159 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.019210 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.019234 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.082630 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 06:49:41 crc kubenswrapper[4947]: E1203 06:49:41.082825 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.121449 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.121518 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.121534 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.121552 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.121566 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.224463 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.224556 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.224590 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.224625 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.224646 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.327317 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.327386 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.327403 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.327429 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.327446 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.431426 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.431523 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.431541 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.431565 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.431582 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.534861 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.534958 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.534976 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.535037 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.535055 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.637671 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.638080 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.638202 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.638302 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.638431 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.742929 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.743036 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.743117 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.743156 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.743938 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.847738 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.847824 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.847848 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.847882 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.847906 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.951450 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.951543 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.951561 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.951596 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:41 crc kubenswrapper[4947]: I1203 06:49:41.951632 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:41Z","lastTransitionTime":"2025-12-03T06:49:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.054571 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.054632 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.054649 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.054672 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.054690 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.082602 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.082624 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 06:49:42 crc kubenswrapper[4947]: E1203 06:49:42.082892 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.082660 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 06:49:42 crc kubenswrapper[4947]: E1203 06:49:42.083045 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 06:49:42 crc kubenswrapper[4947]: E1203 06:49:42.083218 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.158036 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.158129 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.158148 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.158203 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.158221 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.238717 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.238761 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.238777 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.238794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.238805 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: E1203 06:49:42.253210 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:42Z is after 2025-08-24T17:21:41Z"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.258375 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.258521 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.258604 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.258698 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.258779 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: E1203 06:49:42.274600 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.280264 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.280301 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.280313 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.280328 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.280338 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: E1203 06:49:42.301034 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.306086 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.306108 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.306116 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.306129 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.306138 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: E1203 06:49:42.325233 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[...],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.329738 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.329794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.329804 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.329826 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.329838 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: E1203 06:49:42.347108 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:42Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:42 crc kubenswrapper[4947]: E1203 06:49:42.347287 4947 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.349730 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.349775 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.349792 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.349830 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.349853 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.453724 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.453801 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.453820 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.453849 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.453866 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.556354 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.556402 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.556413 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.556430 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.556441 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.659631 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.659670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.659681 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.659699 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.659711 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.763529 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.763624 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.763650 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.763682 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.763708 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.867144 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.867241 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.867259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.867288 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.867311 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.970445 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.970487 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.970512 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.970526 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:42 crc kubenswrapper[4947]: I1203 06:49:42.970535 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:42Z","lastTransitionTime":"2025-12-03T06:49:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.074007 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.074082 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.074108 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.074135 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.074154 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:43Z","lastTransitionTime":"2025-12-03T06:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.082839 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:43 crc kubenswrapper[4947]: E1203 06:49:43.083046 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.176543 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.176673 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.176688 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.176731 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.176741 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:43Z","lastTransitionTime":"2025-12-03T06:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.280435 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.280597 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.280624 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.280653 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.280680 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:43Z","lastTransitionTime":"2025-12-03T06:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.383210 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.383276 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.383289 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.383309 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.383322 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:43Z","lastTransitionTime":"2025-12-03T06:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.487042 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.487153 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.487181 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.487247 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.487272 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:43Z","lastTransitionTime":"2025-12-03T06:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.590879 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.590958 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.590982 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.591012 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.591035 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:43Z","lastTransitionTime":"2025-12-03T06:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.694948 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.695009 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.695019 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.695041 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.695052 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:43Z","lastTransitionTime":"2025-12-03T06:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.797814 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.797899 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.797919 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.797951 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.797971 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:43Z","lastTransitionTime":"2025-12-03T06:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.901696 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.901766 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.901785 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.901813 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:43 crc kubenswrapper[4947]: I1203 06:49:43.901836 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:43Z","lastTransitionTime":"2025-12-03T06:49:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.004653 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.004721 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.004747 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.004779 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.004805 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.082915 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.082938 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.083009 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:44 crc kubenswrapper[4947]: E1203 06:49:44.083176 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:44 crc kubenswrapper[4947]: E1203 06:49:44.083630 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:44 crc kubenswrapper[4947]: E1203 06:49:44.083880 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.108605 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.108687 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.108712 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.108741 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.108771 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.211281 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.211328 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.211345 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.211368 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.211387 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.313950 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.313996 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.314013 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.314035 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.314053 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.362311 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:44 crc kubenswrapper[4947]: E1203 06:49:44.362586 4947 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:44 crc kubenswrapper[4947]: E1203 06:49:44.362692 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs podName:8dd41826-cef5-42f7-8730-abc792b9337c nodeName:}" failed. No retries permitted until 2025-12-03 06:49:52.362665328 +0000 UTC m=+53.623619784 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs") pod "network-metrics-daemon-cz948" (UID: "8dd41826-cef5-42f7-8730-abc792b9337c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.416911 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.417057 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.417082 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.417113 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.417133 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.462907 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.464297 4947 scope.go:117] "RemoveContainer" containerID="2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45" Dec 03 06:49:44 crc kubenswrapper[4947]: E1203 06:49:44.464641 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.520815 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.520873 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.520895 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.520923 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.520942 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.623478 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.623538 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.623546 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.623559 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.623568 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.726572 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.726611 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.726621 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.726636 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.726647 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.829618 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.829668 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.829679 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.829698 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.829710 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.933096 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.933170 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.933196 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.933226 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:44 crc kubenswrapper[4947]: I1203 06:49:44.933250 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:44Z","lastTransitionTime":"2025-12-03T06:49:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.036860 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.036931 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.036951 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.036982 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.037006 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.082547 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:45 crc kubenswrapper[4947]: E1203 06:49:45.082723 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.139470 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.139543 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.139557 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.139574 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.139587 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.243220 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.243279 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.243296 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.243319 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.243337 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.346733 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.346802 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.346824 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.346850 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.346868 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.450067 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.450524 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.450646 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.450752 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.450900 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.554378 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.554850 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.554953 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.555112 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.555225 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.659064 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.659414 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.659573 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.659734 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.659852 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.763062 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.763123 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.763140 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.763164 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.763181 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.867002 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.867064 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.867081 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.867107 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.867130 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.969963 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.970025 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.970043 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.970070 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:45 crc kubenswrapper[4947]: I1203 06:49:45.970089 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:45Z","lastTransitionTime":"2025-12-03T06:49:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.074001 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.074062 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.074080 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.074102 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.074118 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:46Z","lastTransitionTime":"2025-12-03T06:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.082107 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.082134 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.082222 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:46 crc kubenswrapper[4947]: E1203 06:49:46.082254 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:46 crc kubenswrapper[4947]: E1203 06:49:46.082439 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:46 crc kubenswrapper[4947]: E1203 06:49:46.082622 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.176744 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.176899 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.176924 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.176955 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.176978 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:46Z","lastTransitionTime":"2025-12-03T06:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.280228 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.280313 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.280331 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.280354 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.280372 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:46Z","lastTransitionTime":"2025-12-03T06:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.383837 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.384222 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.384424 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.384668 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.384894 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:46Z","lastTransitionTime":"2025-12-03T06:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.487955 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.488024 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.488043 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.488067 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.488083 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:46Z","lastTransitionTime":"2025-12-03T06:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.591755 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.591830 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.591841 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.591859 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.591878 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:46Z","lastTransitionTime":"2025-12-03T06:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.695727 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.695955 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.695977 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.696001 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.696020 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:46Z","lastTransitionTime":"2025-12-03T06:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.799060 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.799125 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.799147 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.799176 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.799197 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:46Z","lastTransitionTime":"2025-12-03T06:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.901752 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.901797 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.901809 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.901825 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:46 crc kubenswrapper[4947]: I1203 06:49:46.901838 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:46Z","lastTransitionTime":"2025-12-03T06:49:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.004324 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.004383 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.004400 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.004424 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.004437 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.082514 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:47 crc kubenswrapper[4947]: E1203 06:49:47.082674 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.107003 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.107072 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.107089 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.107115 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.107144 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.213439 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.213582 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.213615 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.213658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.213680 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.318163 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.318708 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.318886 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.319026 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.319166 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.423220 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.423278 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.423293 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.423317 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.423333 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.526424 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.526542 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.526572 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.526602 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.526627 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.630961 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.631011 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.631029 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.631056 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.631075 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.735709 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.735781 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.735802 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.735835 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.735858 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.838476 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.838538 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.838549 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.838569 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.838581 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.942166 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.942233 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.942253 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.942282 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:47 crc kubenswrapper[4947]: I1203 06:49:47.942303 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:47Z","lastTransitionTime":"2025-12-03T06:49:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.046075 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.046138 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.046154 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.046180 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.046196 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.082559 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.082619 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.082687 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:48 crc kubenswrapper[4947]: E1203 06:49:48.082855 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:48 crc kubenswrapper[4947]: E1203 06:49:48.083014 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:48 crc kubenswrapper[4947]: E1203 06:49:48.083130 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.150379 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.150438 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.150451 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.150473 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.150507 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.253702 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.253799 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.253827 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.253858 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.253881 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.357722 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.357820 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.357849 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.357896 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.357927 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.461049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.461176 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.461233 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.461268 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.461288 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.565563 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.565620 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.565635 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.565670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.565687 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.669313 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.669410 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.669440 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.669477 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.669546 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.772968 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.773055 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.773067 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.773091 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.773106 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.875798 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.875859 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.875876 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.875897 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.875915 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.978881 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.978943 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.978953 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.978973 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:48 crc kubenswrapper[4947]: I1203 06:49:48.978983 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:48Z","lastTransitionTime":"2025-12-03T06:49:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.081348 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.082032 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.082401 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.082487 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.082581 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.082761 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:49Z","lastTransitionTime":"2025-12-03T06:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.082884 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.104421 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.127132 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.148002 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.165262 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.184536 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.184575 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.184587 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.184605 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.184617 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:49Z","lastTransitionTime":"2025-12-03T06:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.187978 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.204690 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.220692 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc 
kubenswrapper[4947]: I1203 06:49:49.239204 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.263354 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.280313 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa
5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.287215 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.287260 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.287269 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.287286 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.287296 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:49Z","lastTransitionTime":"2025-12-03T06:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.298111 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.321156 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637175 6377 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637627 6377 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:35.637704 6377 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:35.637733 6377 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:35.637742 6377 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:49:35.637783 6377 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:35.637788 6377 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:35.637799 6377 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:35.637819 6377 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:35.637866 6377 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:49:35.637883 6377 factory.go:656] Stopping watch factory\\\\nI1203 06:49:35.637903 6377 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:49:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.336307 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.350743 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.368877 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.383830 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:49Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.390702 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.390847 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.390909 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.390975 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.391032 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:49Z","lastTransitionTime":"2025-12-03T06:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.493890 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.494172 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.494272 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.494353 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.494423 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:49Z","lastTransitionTime":"2025-12-03T06:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.597562 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.597658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.597677 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.597697 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.597713 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:49Z","lastTransitionTime":"2025-12-03T06:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.701671 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.701735 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.701749 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.701770 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.701783 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:49Z","lastTransitionTime":"2025-12-03T06:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.806088 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.806157 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.806182 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.806215 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.806240 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:49Z","lastTransitionTime":"2025-12-03T06:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.822313 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.822652 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.822689 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.822710 4947 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.822783 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:50:21.822762844 +0000 UTC m=+83.083717300 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.909407 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.909531 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.909559 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.909588 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.909611 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:49Z","lastTransitionTime":"2025-12-03T06:49:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.922938 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.923065 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.923096 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:49 crc kubenswrapper[4947]: I1203 06:49:49.923140 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.923246 4947 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:49 crc 
kubenswrapper[4947]: E1203 06:49:49.923264 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.923306 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.923328 4947 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.923262 4947 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.923312 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:50:21.923295064 +0000 UTC m=+83.184249500 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.923446 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:50:21.923417548 +0000 UTC m=+83.184372014 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.923472 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:50:21.923458919 +0000 UTC m=+83.184413385 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:49:49 crc kubenswrapper[4947]: E1203 06:49:49.923661 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:50:21.923635533 +0000 UTC m=+83.184589999 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.012954 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.013031 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.013057 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.013087 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.013112 4947 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.082227 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.082283 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.082291 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:50 crc kubenswrapper[4947]: E1203 06:49:50.082412 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:50 crc kubenswrapper[4947]: E1203 06:49:50.082583 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:50 crc kubenswrapper[4947]: E1203 06:49:50.082707 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.116167 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.116223 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.116241 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.116266 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.116286 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.220213 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.220276 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.220287 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.220307 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.220319 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.324011 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.324097 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.324123 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.324157 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.324181 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.454191 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.454243 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.454259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.454281 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.454299 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.557277 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.557343 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.557362 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.557391 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.557407 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.660460 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.660606 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.660627 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.660658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.660683 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.763593 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.763659 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.763681 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.763710 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.763732 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.866794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.866834 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.866850 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.866870 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.866884 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.969414 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.969773 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.969920 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.970066 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:50 crc kubenswrapper[4947]: I1203 06:49:50.970204 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:50Z","lastTransitionTime":"2025-12-03T06:49:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.073988 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.074066 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.074092 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.074122 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.074144 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:51Z","lastTransitionTime":"2025-12-03T06:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.083062 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:51 crc kubenswrapper[4947]: E1203 06:49:51.083191 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.178173 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.178238 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.178256 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.178280 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.178299 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:51Z","lastTransitionTime":"2025-12-03T06:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.281917 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.282025 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.282048 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.282101 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.282124 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:51Z","lastTransitionTime":"2025-12-03T06:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.385261 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.385330 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.385353 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.385384 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.385406 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:51Z","lastTransitionTime":"2025-12-03T06:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.489953 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.489995 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.490005 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.490022 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.490034 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:51Z","lastTransitionTime":"2025-12-03T06:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.592708 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.593024 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.593275 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.593495 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.593708 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:51Z","lastTransitionTime":"2025-12-03T06:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.697151 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.697219 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.697242 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.697272 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.697292 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:51Z","lastTransitionTime":"2025-12-03T06:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.801128 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.801436 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.801655 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.801807 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.801970 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:51Z","lastTransitionTime":"2025-12-03T06:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.905325 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.905395 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.905411 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.905436 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:51 crc kubenswrapper[4947]: I1203 06:49:51.905454 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:51Z","lastTransitionTime":"2025-12-03T06:49:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.008704 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.008769 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.008786 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.008810 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.008829 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.082075 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.082122 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.082150 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.082287 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.082402 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.082613 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.112212 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.113244 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.113385 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.113579 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.113747 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.216898 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.216978 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.217000 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.217032 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.217056 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.320700 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.320748 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.320766 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.320791 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.320807 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.371945 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.372107 4947 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.372161 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs podName:8dd41826-cef5-42f7-8730-abc792b9337c nodeName:}" failed. No retries permitted until 2025-12-03 06:50:08.372144791 +0000 UTC m=+69.633099227 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs") pod "network-metrics-daemon-cz948" (UID: "8dd41826-cef5-42f7-8730-abc792b9337c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.424429 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.424930 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.424944 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.424983 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.424998 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.458218 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.473062 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.476713 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.496170 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.513226 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.527837 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.527926 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.527944 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.527964 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.527979 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.535760 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.550189 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.565355 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.580857 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc 
kubenswrapper[4947]: I1203 06:49:52.600576 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc 
kubenswrapper[4947]: I1203 06:49:52.614431 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.631145 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.631189 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.631205 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.631229 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.631247 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.637534 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bd
a4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.651872 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.652111 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.652265 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.652394 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.652619 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.665123 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637175 6377 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637627 6377 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:35.637704 6377 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:35.637733 6377 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:35.637742 6377 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:49:35.637783 6377 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:35.637788 6377 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:35.637799 6377 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:35.637819 6377 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:35.637866 6377 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:49:35.637883 6377 factory.go:656] Stopping watch factory\\\\nI1203 06:49:35.637903 6377 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:49:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.673433 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.678139 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.678190 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.678207 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.678232 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.678250 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.682948 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.698012 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.703144 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.703199 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.703216 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.703241 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.703258 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.705424 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.727014 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.735690 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.742145 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.742190 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.742200 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.742216 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.742551 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.763580 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.766464 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742f
d0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mou
ntPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.768262 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.768364 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.768435 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.768514 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.768583 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.790558 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.792518 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:52 crc kubenswrapper[4947]: E1203 06:49:52.792834 4947 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.795422 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.795467 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.795482 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.795518 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.795540 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.898905 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.898963 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.898980 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.899004 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:52 crc kubenswrapper[4947]: I1203 06:49:52.899020 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:52Z","lastTransitionTime":"2025-12-03T06:49:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.002297 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.002380 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.002393 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.002423 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.002439 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.082400 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:53 crc kubenswrapper[4947]: E1203 06:49:53.082976 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.106223 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.106276 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.106293 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.106315 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.106336 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.209795 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.209857 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.209875 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.209899 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.209916 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.313692 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.313766 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.313788 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.313816 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.313836 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.417240 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.417300 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.417317 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.417343 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.417362 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.520863 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.520921 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.520936 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.520962 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.520977 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.624567 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.624637 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.624650 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.624683 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.624701 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.727584 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.727659 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.727672 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.727696 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.727710 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.829998 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.830075 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.830091 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.830117 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.830130 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.932479 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.932539 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.932549 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.932570 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:53 crc kubenswrapper[4947]: I1203 06:49:53.932580 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:53Z","lastTransitionTime":"2025-12-03T06:49:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.035960 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.036031 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.036049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.036076 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.036095 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.082592 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.082615 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.082692 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:54 crc kubenswrapper[4947]: E1203 06:49:54.083007 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:54 crc kubenswrapper[4947]: E1203 06:49:54.083252 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:54 crc kubenswrapper[4947]: E1203 06:49:54.083278 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.138815 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.138848 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.138859 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.138874 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.138885 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.242973 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.243044 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.243062 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.243095 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.243114 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.346028 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.346089 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.346108 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.346132 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.346149 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.448801 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.448864 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.448884 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.448918 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.448940 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.553259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.553326 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.553343 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.553366 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.553383 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.657296 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.657356 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.657377 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.657409 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.657429 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.761835 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.762254 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.762560 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.762789 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.763024 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.867533 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.867902 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.868084 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.868236 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.868409 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.970374 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.970437 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.970458 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.970482 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:54 crc kubenswrapper[4947]: I1203 06:49:54.970524 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:54Z","lastTransitionTime":"2025-12-03T06:49:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.073631 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.073920 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.074043 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.074133 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.074215 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:55Z","lastTransitionTime":"2025-12-03T06:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.082094 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:55 crc kubenswrapper[4947]: E1203 06:49:55.082293 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.177674 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.177724 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.177737 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.177757 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.177770 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:55Z","lastTransitionTime":"2025-12-03T06:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.280646 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.280742 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.280768 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.280805 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.280836 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:55Z","lastTransitionTime":"2025-12-03T06:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.384066 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.384123 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.384135 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.384156 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.384168 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:55Z","lastTransitionTime":"2025-12-03T06:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.487684 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.487752 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.487775 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.487814 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.487843 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:55Z","lastTransitionTime":"2025-12-03T06:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.590776 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.591314 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.591554 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.591845 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.592617 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:55Z","lastTransitionTime":"2025-12-03T06:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.696385 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.696717 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.696890 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.697089 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.697311 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:55Z","lastTransitionTime":"2025-12-03T06:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.801048 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.801258 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.801284 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.801313 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.801337 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:55Z","lastTransitionTime":"2025-12-03T06:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.904373 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.904467 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.904483 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.904546 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:55 crc kubenswrapper[4947]: I1203 06:49:55.904569 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:55Z","lastTransitionTime":"2025-12-03T06:49:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.007783 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.007858 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.007881 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.007909 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.007930 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.082867 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.082967 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:56 crc kubenswrapper[4947]: E1203 06:49:56.083096 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:56 crc kubenswrapper[4947]: E1203 06:49:56.083174 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.083570 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:56 crc kubenswrapper[4947]: E1203 06:49:56.084182 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.110422 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.110481 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.110544 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.110574 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.110595 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.212670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.212711 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.212722 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.212759 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.212769 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.315900 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.315944 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.315960 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.315982 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.316002 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.419962 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.420019 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.420036 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.420059 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.420076 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.522989 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.523056 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.523074 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.523096 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.523113 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.626289 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.626361 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.626378 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.626403 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.626420 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.729852 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.729913 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.729934 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.729960 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.729980 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.832744 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.832845 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.832863 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.832885 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.832901 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.936476 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.936688 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.936710 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.936739 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:56 crc kubenswrapper[4947]: I1203 06:49:56.936761 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:56Z","lastTransitionTime":"2025-12-03T06:49:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.040010 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.040086 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.040108 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.040136 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.040158 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.082790 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:57 crc kubenswrapper[4947]: E1203 06:49:57.083013 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.143719 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.143786 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.143803 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.143829 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.143847 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.247322 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.247394 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.247413 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.247468 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.247488 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.352990 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.354123 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.354310 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.354447 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.354637 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.465340 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.465400 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.465417 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.465442 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.465460 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.568603 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.568667 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.568684 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.568710 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.568728 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.672196 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.672667 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.672827 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.672990 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.673131 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.776885 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.777256 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.777482 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.777744 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.778186 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.882417 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.882830 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.883095 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.883323 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.883591 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.986372 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.986644 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.986746 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.986899 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:57 crc kubenswrapper[4947]: I1203 06:49:57.986995 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:57Z","lastTransitionTime":"2025-12-03T06:49:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.083074 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.083096 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:49:58 crc kubenswrapper[4947]: E1203 06:49:58.083293 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:49:58 crc kubenswrapper[4947]: E1203 06:49:58.083401 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.083114 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:49:58 crc kubenswrapper[4947]: E1203 06:49:58.083913 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.089621 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.089712 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.089733 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.089757 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.089778 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:58Z","lastTransitionTime":"2025-12-03T06:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.193049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.193380 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.193547 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.193704 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.193898 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:58Z","lastTransitionTime":"2025-12-03T06:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.297234 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.297752 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.297812 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.297848 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.297872 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:58Z","lastTransitionTime":"2025-12-03T06:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.400882 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.400932 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.400949 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.400974 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.400993 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:58Z","lastTransitionTime":"2025-12-03T06:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.504000 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.504074 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.504093 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.504115 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.504132 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:58Z","lastTransitionTime":"2025-12-03T06:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.607902 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.607977 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.607995 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.608019 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.608036 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:58Z","lastTransitionTime":"2025-12-03T06:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.712060 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.712120 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.712138 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.712163 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.712180 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:58Z","lastTransitionTime":"2025-12-03T06:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.814957 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.815025 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.815049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.815078 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.815099 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:58Z","lastTransitionTime":"2025-12-03T06:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.918157 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.918310 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.918339 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.918373 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:58 crc kubenswrapper[4947]: I1203 06:49:58.918393 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:58Z","lastTransitionTime":"2025-12-03T06:49:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.028198 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.028272 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.028289 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.028315 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.028333 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.083057 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:49:59 crc kubenswrapper[4947]: E1203 06:49:59.083410 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.084624 4947 scope.go:117] "RemoveContainer" containerID="2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.112662 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.131899 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.132251 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.132428 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.132645 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.132795 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.138247 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.160281 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-
03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.176835 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34
208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.192998 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.211744 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.228962 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.236463 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.236625 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.236703 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.236793 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.236885 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.250312 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.267670 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.284787 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.296577 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.310027 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc 
kubenswrapper[4947]: I1203 06:49:59.329527 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.341158 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.341226 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.341244 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.341274 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.341293 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.346321 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:
49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.362529 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.380592 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.406209 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637175 6377 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637627 6377 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:35.637704 6377 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:35.637733 6377 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:35.637742 6377 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:49:35.637783 6377 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:35.637788 6377 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:35.637799 6377 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:35.637819 6377 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:35.637866 6377 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:49:35.637883 6377 factory.go:656] Stopping watch factory\\\\nI1203 06:49:35.637903 6377 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:49:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.443953 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.443986 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.443994 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.444007 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.444016 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.546203 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.546286 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.546300 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.546323 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.546336 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.649843 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.649921 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.649932 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.649954 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.649966 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.752056 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.752109 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.752123 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.752147 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.752161 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.762780 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/1.log" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.765241 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.765761 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.780025 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.802594 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637175 6377 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637627 6377 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:35.637704 6377 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:35.637733 6377 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:35.637742 6377 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:49:35.637783 6377 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:35.637788 6377 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:35.637799 6377 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:35.637819 6377 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:35.637866 6377 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:49:35.637883 6377 factory.go:656] Stopping watch factory\\\\nI1203 06:49:35.637903 6377 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
06:49:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.818008 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.832720 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.853206 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.855042 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.855152 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.855214 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.855277 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.855348 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.868452 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.880104 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.894739 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd
10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.908665 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.923036 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.938052 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.951225 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.958607 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.958660 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.958670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.958691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.958706 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:49:59Z","lastTransitionTime":"2025-12-03T06:49:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.964308 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc kubenswrapper[4947]: I1203 06:49:59.979963 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:49:59 crc 
kubenswrapper[4947]: I1203 06:49:59.995467 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.014162 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.026655 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa
5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.060879 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.061023 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.061085 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.061181 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.061254 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:00Z","lastTransitionTime":"2025-12-03T06:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.082819 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.082971 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.083314 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:00 crc kubenswrapper[4947]: E1203 06:50:00.083508 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:00 crc kubenswrapper[4947]: E1203 06:50:00.083657 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:00 crc kubenswrapper[4947]: E1203 06:50:00.083786 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.164037 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.164092 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.164109 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.164135 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.164154 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:00Z","lastTransitionTime":"2025-12-03T06:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.267649 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.267711 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.267729 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.267755 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.268032 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:00Z","lastTransitionTime":"2025-12-03T06:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.371537 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.371587 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.371603 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.371628 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.371645 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:00Z","lastTransitionTime":"2025-12-03T06:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.477452 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.478107 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.478221 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.478322 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.478422 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:00Z","lastTransitionTime":"2025-12-03T06:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.583217 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.583272 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.583290 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.583315 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.583334 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:00Z","lastTransitionTime":"2025-12-03T06:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.686931 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.686980 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.686996 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.687023 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.687040 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:00Z","lastTransitionTime":"2025-12-03T06:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.773562 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/2.log" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.774405 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/1.log" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.779004 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5" exitCode=1 Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.779080 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.779137 4947 scope.go:117] "RemoveContainer" containerID="2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.781291 4947 scope.go:117] "RemoveContainer" containerID="23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5" Dec 03 06:50:00 crc kubenswrapper[4947]: E1203 06:50:00.782575 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.805367 4947 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.805435 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.805459 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.805536 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.805562 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:00Z","lastTransitionTime":"2025-12-03T06:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.807793 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.822256 4947 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.835377 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.850647 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.864291 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.881164 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.896659 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.907127 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.908529 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.908556 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.908568 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.908586 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.908597 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:00Z","lastTransitionTime":"2025-12-03T06:50:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.921397 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.935132 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.951623 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.964024 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc 
kubenswrapper[4947]: I1203 06:50:00.982817 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:00 crc kubenswrapper[4947]: I1203 06:50:00.997468 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:00Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.010212 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa
5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.011428 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.011469 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.011480 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.011517 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.011530 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.028101 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.050421 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2322502c6bd099caf34de2d2a47a6916e481abcb49e24de7c9b1d737edbb9a45\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"message\\\":\\\"ector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637175 6377 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:49:35.637627 6377 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:49:35.637704 6377 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:49:35.637733 6377 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:49:35.637742 6377 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:49:35.637783 6377 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:49:35.637788 6377 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:49:35.637799 6377 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 06:49:35.637819 6377 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:49:35.637866 6377 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:49:35.637883 6377 factory.go:656] Stopping watch factory\\\\nI1203 06:49:35.637903 6377 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:49:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:00Z\\\",\\\"message\\\":\\\"3 06:49:59.992821 6636 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-operators for network=default are: map[]\\\\nI1203 06:49:59.992519 6636 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 06:49:59.992571 6636 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-d
ir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: 
I1203 06:50:01.082778 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:01 crc kubenswrapper[4947]: E1203 06:50:01.082935 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.115310 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.115382 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.115450 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.115479 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.115538 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.218907 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.219036 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.219060 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.219089 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.219111 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.323601 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.323652 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.323670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.323693 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.323711 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.427300 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.427357 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.427380 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.427408 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.427430 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.530533 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.530582 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.530602 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.530626 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.530641 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.634264 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.634333 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.634351 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.634379 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.634398 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.737401 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.737475 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.737547 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.737578 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.737600 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.785611 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/2.log" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.795072 4947 scope.go:117] "RemoveContainer" containerID="23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5" Dec 03 06:50:01 crc kubenswrapper[4947]: E1203 06:50:01.795631 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.828511 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae
3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.841215 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.841254 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.841265 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.841289 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.841303 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.847438 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.864142 4947 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.881356 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.900928 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.916256 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.935679 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.944162 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.944209 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.944221 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.944240 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.944254 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:01Z","lastTransitionTime":"2025-12-03T06:50:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.949815 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.964778 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.981964 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:01 crc kubenswrapper[4947]: I1203 06:50:01.995056 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:01Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.009135 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.023641 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.038814 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.046881 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc 
kubenswrapper[4947]: I1203 06:50:02.046939 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.046952 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.046992 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.047006 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.052073 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.070337 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.082655 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.082671 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.082790 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:02 crc kubenswrapper[4947]: E1203 06:50:02.082814 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:02 crc kubenswrapper[4947]: E1203 06:50:02.082921 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:02 crc kubenswrapper[4947]: E1203 06:50:02.083058 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.097974 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:00Z\\\",\\\"message\\\":\\\"3 06:49:59.992821 6636 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-operators for network=default are: map[]\\\\nI1203 06:49:59.992519 6636 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 06:49:59.992571 6636 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:02Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.150198 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.150262 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.150275 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.150297 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.150312 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.253955 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.254028 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.254046 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.254076 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.254094 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.356682 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.356724 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.356732 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.356746 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.356759 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.459885 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.459954 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.459971 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.459997 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.460015 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.562788 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.562833 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.562845 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.562863 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.562876 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.666104 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.666168 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.666182 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.666201 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.666212 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.769278 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.769338 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.769355 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.769380 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.769396 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.872931 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.872978 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.872991 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.873009 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.873020 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.975072 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.975132 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.975149 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.975175 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:02 crc kubenswrapper[4947]: I1203 06:50:02.975192 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:02Z","lastTransitionTime":"2025-12-03T06:50:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.077716 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.077768 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.077783 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.077806 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.077822 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.108962 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:03 crc kubenswrapper[4947]: E1203 06:50:03.109790 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.114625 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.114733 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.114776 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.114813 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.114838 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: E1203 06:50:03.135188 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.141980 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.142057 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.142072 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.142095 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.142109 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: E1203 06:50:03.163086 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.168634 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.168702 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.168724 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.168757 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.168780 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: E1203 06:50:03.189855 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.196010 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.196068 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.196082 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.196101 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.196118 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: E1203 06:50:03.214143 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.218605 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.218646 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.218662 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.218683 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.218698 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: E1203 06:50:03.237620 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:03 crc kubenswrapper[4947]: E1203 06:50:03.237821 4947 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.240109 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.240163 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.240174 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.240196 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.240211 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.342392 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.342455 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.342471 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.342506 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.342520 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.445834 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.445877 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.445888 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.445902 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.445911 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.549582 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.549632 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.549646 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.549664 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.549674 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.652853 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.652933 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.652951 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.652976 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.652991 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.755910 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.755987 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.756011 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.756042 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.756064 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.858809 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.858870 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.858882 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.858902 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.858913 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.961297 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.961343 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.961356 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.961376 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:03 crc kubenswrapper[4947]: I1203 06:50:03.961390 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:03Z","lastTransitionTime":"2025-12-03T06:50:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.064731 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.064782 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.064800 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.064825 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.064843 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.082375 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.082536 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:04 crc kubenswrapper[4947]: E1203 06:50:04.082573 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.082621 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:04 crc kubenswrapper[4947]: E1203 06:50:04.082764 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:04 crc kubenswrapper[4947]: E1203 06:50:04.082858 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.168043 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.168085 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.168100 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.168117 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.168133 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.270750 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.270782 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.270791 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.270803 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.270812 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.374169 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.374218 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.374236 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.374260 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.374277 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.477760 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.478094 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.478228 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.478406 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.478593 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.585007 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.585119 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.585150 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.585184 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.585216 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.688166 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.688205 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.688221 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.688244 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.688260 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.791823 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.791865 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.791877 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.791893 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.791905 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.894281 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.894332 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.894349 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.894374 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.894392 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.996485 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.996547 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.996559 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.996575 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:04 crc kubenswrapper[4947]: I1203 06:50:04.996587 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:04Z","lastTransitionTime":"2025-12-03T06:50:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.082812 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:05 crc kubenswrapper[4947]: E1203 06:50:05.082986 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.098797 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.098855 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.098872 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.098901 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.098918 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:05Z","lastTransitionTime":"2025-12-03T06:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.201763 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.201825 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.201835 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.201850 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.201860 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:05Z","lastTransitionTime":"2025-12-03T06:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.304891 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.304938 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.304946 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.304961 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.304969 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:05Z","lastTransitionTime":"2025-12-03T06:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.407503 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.407551 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.407563 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.407583 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.407597 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:05Z","lastTransitionTime":"2025-12-03T06:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.510796 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.510843 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.510852 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.510867 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.510876 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:05Z","lastTransitionTime":"2025-12-03T06:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.668595 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.668654 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.668670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.668691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.668703 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:05Z","lastTransitionTime":"2025-12-03T06:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.772112 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.772186 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.772200 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.772227 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.772243 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:05Z","lastTransitionTime":"2025-12-03T06:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.875543 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.875590 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.875628 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.875646 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.875659 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:05Z","lastTransitionTime":"2025-12-03T06:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.978589 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.978641 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.978657 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.978676 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:05 crc kubenswrapper[4947]: I1203 06:50:05.978689 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:05Z","lastTransitionTime":"2025-12-03T06:50:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.081757 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.081800 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.081811 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.081827 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.081837 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:06Z","lastTransitionTime":"2025-12-03T06:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.082087 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.082109 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.082109 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:06 crc kubenswrapper[4947]: E1203 06:50:06.082228 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:06 crc kubenswrapper[4947]: E1203 06:50:06.082464 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:06 crc kubenswrapper[4947]: E1203 06:50:06.082595 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.185687 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.185756 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.185772 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.185796 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.185813 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:06Z","lastTransitionTime":"2025-12-03T06:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.288998 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.289040 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.289049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.289067 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.289078 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:06Z","lastTransitionTime":"2025-12-03T06:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.391894 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.391942 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.391952 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.391970 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.391982 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:06Z","lastTransitionTime":"2025-12-03T06:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.493712 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.493745 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.493753 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.493766 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.493774 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:06Z","lastTransitionTime":"2025-12-03T06:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.600697 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.600745 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.600760 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.600775 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.600784 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:06Z","lastTransitionTime":"2025-12-03T06:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.703611 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.703682 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.703705 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.703735 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.703756 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:06Z","lastTransitionTime":"2025-12-03T06:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.806018 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.806054 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.806063 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.806077 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.806085 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:06Z","lastTransitionTime":"2025-12-03T06:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.909076 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.909171 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.909236 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.909271 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:06 crc kubenswrapper[4947]: I1203 06:50:06.909292 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:06Z","lastTransitionTime":"2025-12-03T06:50:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.012436 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.012522 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.012540 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.012563 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.012578 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.082506 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:07 crc kubenswrapper[4947]: E1203 06:50:07.082636 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.115194 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.115255 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.115272 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.115294 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.115307 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.218072 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.218119 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.218130 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.218147 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.218158 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.321461 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.321537 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.321552 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.321571 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.321587 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.423971 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.424020 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.424058 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.424077 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.424088 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.526379 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.526433 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.526452 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.526475 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.526520 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.628912 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.628953 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.628962 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.628976 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.628986 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.732862 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.732922 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.732940 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.732964 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.732983 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.836603 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.836729 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.836803 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.836885 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.836911 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.940358 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.940413 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.940424 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.940447 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:07 crc kubenswrapper[4947]: I1203 06:50:07.940465 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:07Z","lastTransitionTime":"2025-12-03T06:50:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.042822 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.042872 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.042884 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.042905 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.042919 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.082134 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.082216 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:08 crc kubenswrapper[4947]: E1203 06:50:08.082264 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:08 crc kubenswrapper[4947]: E1203 06:50:08.082469 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.082905 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:08 crc kubenswrapper[4947]: E1203 06:50:08.083020 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.145732 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.145789 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.145798 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.145819 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.145834 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.249008 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.249057 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.249074 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.249098 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.249115 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.352914 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.352972 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.352996 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.353032 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.353058 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.399092 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:08 crc kubenswrapper[4947]: E1203 06:50:08.399359 4947 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:50:08 crc kubenswrapper[4947]: E1203 06:50:08.399560 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs podName:8dd41826-cef5-42f7-8730-abc792b9337c nodeName:}" failed. No retries permitted until 2025-12-03 06:50:40.399526083 +0000 UTC m=+101.660480549 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs") pod "network-metrics-daemon-cz948" (UID: "8dd41826-cef5-42f7-8730-abc792b9337c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.456661 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.456737 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.456759 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.456789 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.456812 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.559178 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.559264 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.559278 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.559312 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.559326 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.662171 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.662222 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.662232 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.662245 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.662254 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.765272 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.765363 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.765389 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.765417 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.765439 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.814232 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/0.log" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.814282 4947 generic.go:334] "Generic (PLEG): container finished" podID="1c90ac94-365a-4c82-b72a-41129d95a39e" containerID="b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d" exitCode=1 Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.814312 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tnc" event={"ID":"1c90ac94-365a-4c82-b72a-41129d95a39e","Type":"ContainerDied","Data":"b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.814701 4947 scope.go:117] "RemoveContainer" containerID="b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.830216 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.850981 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:00Z\\\",\\\"message\\\":\\\"3 06:49:59.992821 6636 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-operators for network=default are: map[]\\\\nI1203 06:49:59.992519 6636 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 06:49:59.992571 6636 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.865670 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.868315 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.868377 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.868396 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 
06:50:08.868420 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.868439 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.883544 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.899171 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae
3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.912212 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.927868 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager
-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.943452 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.957898 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.971808 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.971838 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.971846 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.971860 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.971870 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:08Z","lastTransitionTime":"2025-12-03T06:50:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.972765 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.987630 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:08 crc kubenswrapper[4947]: I1203 06:50:08.997421 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:08Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.008303 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.018759 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc 
kubenswrapper[4947]: I1203 06:50:09.031868 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.042474 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"2025-12-03T06:49:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6\\\\n2025-12-03T06:49:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6 to /host/opt/cni/bin/\\\\n2025-12-03T06:49:23Z [verbose] multus-daemon started\\\\n2025-12-03T06:49:23Z [verbose] Readiness Indicator file check\\\\n2025-12-03T06:50:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.054822 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.074638 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.074672 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.074685 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.074704 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.074717 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.083235 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:09 crc kubenswrapper[4947]: E1203 06:50:09.083349 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.109387 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:00Z\\\",\\\"message\\\":\\\"3 06:49:59.992821 6636 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-operators for network=default are: map[]\\\\nI1203 06:49:59.992519 6636 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 06:49:59.992571 6636 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.125306 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.139728 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.155233 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.177957 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.177998 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.178006 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.178020 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.178029 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.178414 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.195735 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.216956 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.234189 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.249462 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.262899 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.276758 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.281132 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.281179 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.281194 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.281212 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.281230 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.288476 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.300875 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.313234 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc 
kubenswrapper[4947]: I1203 06:50:09.327521 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"2025-12-03T06:49:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6\\\\n2025-12-03T06:49:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6 to /host/opt/cni/bin/\\\\n2025-12-03T06:49:23Z [verbose] multus-daemon started\\\\n2025-12-03T06:49:23Z [verbose] Readiness Indicator file check\\\\n2025-12-03T06:50:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.338948 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.357278 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-api
server\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f
36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.383418 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.383469 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.383482 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.383522 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.383537 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.485403 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.485448 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.485458 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.485476 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.485506 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.588807 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.588852 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.588878 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.588898 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.588909 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.691186 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.691274 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.691300 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.691331 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.691352 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.795113 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.795167 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.795179 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.795199 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.795244 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.820484 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/0.log" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.820573 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tnc" event={"ID":"1c90ac94-365a-4c82-b72a-41129d95a39e","Type":"ContainerStarted","Data":"5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.847153 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.863635 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.882542 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.895300 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.897360 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.897410 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.897421 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.897439 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.897451 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.908524 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.922226 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.935323 4947 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.951008 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.968142 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.981016 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"2025-12-03T06:49:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6\\\\n2025-12-03T06:49:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6 to /host/opt/cni/bin/\\\\n2025-12-03T06:49:23Z [verbose] multus-daemon started\\\\n2025-12-03T06:49:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:50:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:50:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.996536 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:09Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.999790 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.999863 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:09 crc kubenswrapper[4947]: I1203 06:50:09.999881 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc 
kubenswrapper[4947]: I1203 06:50:09.999909 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:09.999924 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:09Z","lastTransitionTime":"2025-12-03T06:50:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.016674 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:10Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.036607 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:00Z\\\",\\\"message\\\":\\\"3 06:49:59.992821 6636 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-operators for network=default are: map[]\\\\nI1203 06:49:59.992519 6636 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 06:49:59.992571 6636 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:10Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.051003 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae
3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:10Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.063459 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T06:50:10Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.077763 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:10Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.082315 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.082419 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.082328 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:10 crc kubenswrapper[4947]: E1203 06:50:10.082544 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:10 crc kubenswrapper[4947]: E1203 06:50:10.082662 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:10 crc kubenswrapper[4947]: E1203 06:50:10.082891 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.089541 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:50:10Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.102382 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.102448 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.102467 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.102525 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.102542 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:10Z","lastTransitionTime":"2025-12-03T06:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.205627 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.205673 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.205686 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.205704 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.205715 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:10Z","lastTransitionTime":"2025-12-03T06:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.308656 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.308685 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.308693 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.308708 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.308717 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:10Z","lastTransitionTime":"2025-12-03T06:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.411675 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.411713 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.411720 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.411736 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.411745 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:10Z","lastTransitionTime":"2025-12-03T06:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.514528 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.514585 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.514601 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.514624 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.514642 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:10Z","lastTransitionTime":"2025-12-03T06:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.617782 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.617820 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.617830 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.617844 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.617853 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:10Z","lastTransitionTime":"2025-12-03T06:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.720992 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.721030 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.721039 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.721055 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.721074 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:10Z","lastTransitionTime":"2025-12-03T06:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.823978 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.824017 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.824027 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.824041 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.824052 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:10Z","lastTransitionTime":"2025-12-03T06:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.927422 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.927525 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.927546 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.927571 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:10 crc kubenswrapper[4947]: I1203 06:50:10.927587 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:10Z","lastTransitionTime":"2025-12-03T06:50:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.030216 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.030264 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.030280 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.030303 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.030320 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.082191 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:11 crc kubenswrapper[4947]: E1203 06:50:11.082382 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.133524 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.133590 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.133614 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.133671 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.133694 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.236979 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.237034 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.237049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.237072 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.237087 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.339559 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.339607 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.339624 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.339647 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.339664 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.442842 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.442884 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.442893 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.442906 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.442915 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.546386 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.546477 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.546536 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.546561 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.546578 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.649570 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.649631 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.649650 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.649677 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.649697 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.751986 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.752042 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.752059 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.752086 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.752103 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.855024 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.855097 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.855113 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.855136 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.855152 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.958698 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.958811 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.958835 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.958863 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:11 crc kubenswrapper[4947]: I1203 06:50:11.958883 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:11Z","lastTransitionTime":"2025-12-03T06:50:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.061705 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.061794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.061814 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.061836 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.061853 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.081962 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.082012 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.082063 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:12 crc kubenswrapper[4947]: E1203 06:50:12.082140 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:12 crc kubenswrapper[4947]: E1203 06:50:12.082287 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:12 crc kubenswrapper[4947]: E1203 06:50:12.082377 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.164801 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.164859 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.164872 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.164887 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.164897 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.268107 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.268198 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.268219 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.268242 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.268297 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.371353 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.371395 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.371405 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.371421 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.371434 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.474355 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.474449 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.474468 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.474538 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.474560 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.578206 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.578312 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.578365 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.578395 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.578417 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.681361 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.681415 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.681432 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.681456 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.681476 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.784007 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.784048 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.784205 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.784225 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.784237 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.887574 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.887624 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.887641 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.887664 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.887681 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.989781 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.989846 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.989865 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.989888 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:12 crc kubenswrapper[4947]: I1203 06:50:12.989905 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:12Z","lastTransitionTime":"2025-12-03T06:50:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.082139 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:13 crc kubenswrapper[4947]: E1203 06:50:13.082392 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.092393 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.092454 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.092474 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.092541 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.092562 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.196352 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.196428 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.196456 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.196487 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.196544 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.273150 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.273189 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.273200 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.273216 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.273227 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: E1203 06:50:13.294539 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.299725 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.299794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.299813 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.299837 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.299855 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: E1203 06:50:13.348135 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.353168 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.353210 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.353223 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.353239 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.353251 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: E1203 06:50:13.376674 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.381045 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.381074 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.381083 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.381102 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.381116 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: E1203 06:50:13.399002 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:13 crc kubenswrapper[4947]: E1203 06:50:13.399278 4947 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.401478 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.401585 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.401609 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.401637 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.401658 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.504360 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.504403 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.504412 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.504429 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.504441 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.607803 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.607883 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.607904 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.607928 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.607946 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.711077 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.711144 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.711161 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.711183 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.711201 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.814091 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.814130 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.814141 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.814154 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.814163 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.916471 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.916566 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.916582 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.916604 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:13 crc kubenswrapper[4947]: I1203 06:50:13.916621 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:13Z","lastTransitionTime":"2025-12-03T06:50:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.020058 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.020116 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.020134 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.020157 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.020178 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.082637 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.082775 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:14 crc kubenswrapper[4947]: E1203 06:50:14.082847 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.082637 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:14 crc kubenswrapper[4947]: E1203 06:50:14.082960 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:14 crc kubenswrapper[4947]: E1203 06:50:14.083148 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.123089 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.123165 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.123182 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.123208 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.123225 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.226857 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.226919 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.226936 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.226964 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.226986 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.329927 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.329970 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.329985 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.330001 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.330012 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.433020 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.433067 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.433077 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.433093 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.433104 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.536076 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.536157 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.536174 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.536199 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.536215 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.639263 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.639342 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.639368 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.639398 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.639444 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.741538 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.741601 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.741618 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.741643 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.741661 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.843994 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.844059 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.844080 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.844103 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.844121 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.947038 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.947115 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.947134 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.947158 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:14 crc kubenswrapper[4947]: I1203 06:50:14.947174 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:14Z","lastTransitionTime":"2025-12-03T06:50:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.050687 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.050792 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.050815 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.050854 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.050879 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.082622 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:15 crc kubenswrapper[4947]: E1203 06:50:15.082794 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.154115 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.154199 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.154225 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.154254 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.154275 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.257551 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.257605 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.257624 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.257647 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.257665 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.360552 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.360587 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.360598 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.360616 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.360646 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.463782 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.463837 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.463852 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.463876 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.463898 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.567430 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.567486 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.567546 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.567570 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.567587 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.670131 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.670192 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.670213 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.670237 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.670254 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.773537 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.773794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.773811 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.773837 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.773854 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.876225 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.876261 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.876268 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.876280 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.876289 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.979237 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.979293 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.979311 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.979336 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:15 crc kubenswrapper[4947]: I1203 06:50:15.979354 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:15Z","lastTransitionTime":"2025-12-03T06:50:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.081724 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.081790 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.081807 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.081832 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.081852 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:16Z","lastTransitionTime":"2025-12-03T06:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.081996 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.082017 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:16 crc kubenswrapper[4947]: E1203 06:50:16.082147 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.082183 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:16 crc kubenswrapper[4947]: E1203 06:50:16.082275 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:16 crc kubenswrapper[4947]: E1203 06:50:16.082451 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.185599 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.185667 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.185686 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.185713 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.185732 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:16Z","lastTransitionTime":"2025-12-03T06:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.288565 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.288632 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.288644 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.288667 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.288681 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:16Z","lastTransitionTime":"2025-12-03T06:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.390861 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.390906 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.390921 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.390938 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.390952 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:16Z","lastTransitionTime":"2025-12-03T06:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.493805 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.493837 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.493847 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.493860 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.493868 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:16Z","lastTransitionTime":"2025-12-03T06:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.596860 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.596928 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.596954 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.596983 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.597006 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:16Z","lastTransitionTime":"2025-12-03T06:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.699691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.699719 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.699726 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.699739 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.699748 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:16Z","lastTransitionTime":"2025-12-03T06:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.803542 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.803620 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.803643 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.803902 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.803952 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:16Z","lastTransitionTime":"2025-12-03T06:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.907734 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.907799 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.907819 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.907844 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:16 crc kubenswrapper[4947]: I1203 06:50:16.907864 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:16Z","lastTransitionTime":"2025-12-03T06:50:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.011205 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.011354 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.011377 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.011402 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.011421 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.082446 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:17 crc kubenswrapper[4947]: E1203 06:50:17.082953 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.084428 4947 scope.go:117] "RemoveContainer" containerID="23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5" Dec 03 06:50:17 crc kubenswrapper[4947]: E1203 06:50:17.084814 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.114743 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.114813 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.114838 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.114866 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.114886 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.218399 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.218926 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.219098 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.219255 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.219403 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.322260 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.322310 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.322326 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.322352 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.322369 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.426207 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.426266 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.426287 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.426314 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.426335 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.529813 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.530372 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.530665 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.530902 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.531120 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.635085 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.635217 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.635242 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.635270 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.635291 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.738197 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.738233 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.738245 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.738261 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.738272 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.840949 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.841289 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.841419 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.841582 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.841735 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.944638 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.944671 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.944680 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.944693 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:17 crc kubenswrapper[4947]: I1203 06:50:17.944705 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:17Z","lastTransitionTime":"2025-12-03T06:50:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.047926 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.047971 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.047987 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.048009 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.048026 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.082038 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:18 crc kubenswrapper[4947]: E1203 06:50:18.082213 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.082466 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:18 crc kubenswrapper[4947]: E1203 06:50:18.082604 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.082745 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:18 crc kubenswrapper[4947]: E1203 06:50:18.082888 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.150535 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.150599 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.150619 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.150646 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.150663 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.253478 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.253539 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.253547 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.253563 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.253575 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.357620 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.357662 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.357674 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.357691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.357702 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.461471 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.461584 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.461605 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.461631 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.461650 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.564609 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.565078 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.565190 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.565302 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.565410 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.668933 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.668965 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.668973 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.668986 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.668995 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.771444 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.771576 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.771600 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.771656 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.771674 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.874348 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.874430 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.874456 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.874485 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.874545 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.978301 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.978361 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.978378 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.978405 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:18 crc kubenswrapper[4947]: I1203 06:50:18.978423 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:18Z","lastTransitionTime":"2025-12-03T06:50:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.080900 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.080965 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.080983 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.081008 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.081026 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:19Z","lastTransitionTime":"2025-12-03T06:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.082243 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:19 crc kubenswrapper[4947]: E1203 06:50:19.082445 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.114024 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.144790 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:00Z\\\",\\\"message\\\":\\\"3 06:49:59.992821 6636 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-operators for network=default are: map[]\\\\nI1203 06:49:59.992519 6636 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 06:49:59.992571 6636 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.166636 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.184075 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.184126 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.184143 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 
06:50:19.184170 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.184188 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:19Z","lastTransitionTime":"2025-12-03T06:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.192227 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.218618 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae
3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.238373 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.266096 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6
df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.281834 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.286393 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.286470 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.286526 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.286559 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.286582 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:19Z","lastTransitionTime":"2025-12-03T06:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.297676 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.316095 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-sy
ncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.333833 4947 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.354279 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.372641 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.389190 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.389238 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.389254 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.389273 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.389287 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:19Z","lastTransitionTime":"2025-12-03T06:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.390045 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc 
kubenswrapper[4947]: I1203 06:50:19.410047 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.428789 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"2025-12-03T06:49:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6\\\\n2025-12-03T06:49:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6 to /host/opt/cni/bin/\\\\n2025-12-03T06:49:23Z [verbose] multus-daemon started\\\\n2025-12-03T06:49:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:50:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:50:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.443420 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.493785 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.494078 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.494170 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:19 crc 
kubenswrapper[4947]: I1203 06:50:19.494254 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.494340 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:19Z","lastTransitionTime":"2025-12-03T06:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.597339 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.597710 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.597807 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.597904 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.598004 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:19Z","lastTransitionTime":"2025-12-03T06:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.701515 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.701840 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.701933 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.702042 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.702138 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:19Z","lastTransitionTime":"2025-12-03T06:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.811948 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.812323 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.812434 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.812560 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.812652 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:19Z","lastTransitionTime":"2025-12-03T06:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.915957 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.915996 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.916009 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.916025 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:19 crc kubenswrapper[4947]: I1203 06:50:19.916036 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:19Z","lastTransitionTime":"2025-12-03T06:50:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.019570 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.019635 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.019658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.019690 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.019713 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.082316 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.082535 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.082811 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:20 crc kubenswrapper[4947]: E1203 06:50:20.082928 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:20 crc kubenswrapper[4947]: E1203 06:50:20.082955 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:20 crc kubenswrapper[4947]: E1203 06:50:20.083123 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.097478 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.124949 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.125652 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.125679 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.125707 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.125720 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.228536 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.228633 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.228651 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.228675 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.228693 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.332550 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.332714 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.332729 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.332800 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.332815 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.436462 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.436580 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.436619 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.436655 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.436677 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.539337 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.539408 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.539424 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.539449 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.539467 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.642969 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.643049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.643073 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.643107 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.643130 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.747529 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.747676 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.747697 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.747729 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.747748 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.850389 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.850441 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.850458 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.850484 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.850552 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.954431 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.954516 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.954536 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.954621 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:20 crc kubenswrapper[4947]: I1203 06:50:20.954673 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:20Z","lastTransitionTime":"2025-12-03T06:50:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.057772 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.057822 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.057833 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.057851 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.057865 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.082095 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.082239 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.161601 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.161672 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.161694 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.161720 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.161739 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.265560 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.265628 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.265646 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.265671 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.265689 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.368794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.368872 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.368892 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.368985 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.369003 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.473163 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.473227 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.473244 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.473271 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.473294 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.576217 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.576277 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.576289 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.576306 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.576319 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.682600 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.682670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.682690 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.682714 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.682732 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.785262 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.785310 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.785328 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.785351 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.785369 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.851242 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.851623 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.851679 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.851705 4947 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.851782 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:51:25.851759678 +0000 UTC m=+147.112714134 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.887952 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.888007 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.888031 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.888054 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.888071 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.954300 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.954502 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:25.954463412 +0000 UTC m=+147.215417878 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.954600 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.954705 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.954723 4947 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.954752 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.954777 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:51:25.9547641 +0000 UTC m=+147.215718526 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.954908 4947 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.954970 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:51:25.954952286 +0000 UTC m=+147.215906752 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.955049 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.955068 4947 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.955082 4947 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:50:21 crc kubenswrapper[4947]: E1203 06:50:21.955114 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:51:25.9551041 +0000 UTC m=+147.216058536 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.991755 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.992057 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.992220 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.992364 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:21 crc kubenswrapper[4947]: I1203 06:50:21.992484 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:21Z","lastTransitionTime":"2025-12-03T06:50:21Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.082692 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.082692 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:22 crc kubenswrapper[4947]: E1203 06:50:22.083375 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.082711 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:22 crc kubenswrapper[4947]: E1203 06:50:22.083644 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:22 crc kubenswrapper[4947]: E1203 06:50:22.083771 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.096287 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.096362 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.096391 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.096418 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.096436 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:22Z","lastTransitionTime":"2025-12-03T06:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.198594 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.198645 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.198657 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.198730 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.198748 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:22Z","lastTransitionTime":"2025-12-03T06:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.302134 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.302197 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.302209 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.302233 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.302249 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:22Z","lastTransitionTime":"2025-12-03T06:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.405878 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.406282 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.406444 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.406683 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.406853 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:22Z","lastTransitionTime":"2025-12-03T06:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.510154 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.510658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.510926 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.511131 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.511332 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:22Z","lastTransitionTime":"2025-12-03T06:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.615060 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.615117 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.615133 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.615155 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.615172 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:22Z","lastTransitionTime":"2025-12-03T06:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.718131 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.718190 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.718208 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.718234 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.718252 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:22Z","lastTransitionTime":"2025-12-03T06:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.821701 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.821771 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.821779 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.821801 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.821812 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:22Z","lastTransitionTime":"2025-12-03T06:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.925330 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.925387 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.925401 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.925421 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:22 crc kubenswrapper[4947]: I1203 06:50:22.925437 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:22Z","lastTransitionTime":"2025-12-03T06:50:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.029735 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.029828 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.029851 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.029877 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.029900 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.082656 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:23 crc kubenswrapper[4947]: E1203 06:50:23.082842 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.134239 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.134287 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.134321 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.134341 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.134355 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.236763 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.236807 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.236815 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.236831 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.236849 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.340726 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.340796 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.340815 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.340838 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.340854 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.444909 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.444992 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.445013 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.445046 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.445070 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.549203 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.549285 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.549305 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.549337 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.549367 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.653027 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.653103 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.653129 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.653157 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.653178 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.655421 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.655476 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.655532 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.655559 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.655581 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: E1203 06:50:23.675363 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.680686 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.680775 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.680796 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.680826 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.680845 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: E1203 06:50:23.696963 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.702049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.702120 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.702145 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.702175 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.702196 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: E1203 06:50:23.723200 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.735313 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.735379 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.735397 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.735422 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.735441 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: E1203 06:50:23.756652 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.763317 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.763374 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.763396 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.763425 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.763449 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: E1203 06:50:23.779226 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:23 crc kubenswrapper[4947]: E1203 06:50:23.779367 4947 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.781104 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.781148 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.781165 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.781186 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.781201 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.884561 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.884615 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.884633 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.884658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.884675 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.987441 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.987615 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.987641 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.987665 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:23 crc kubenswrapper[4947]: I1203 06:50:23.987682 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:23Z","lastTransitionTime":"2025-12-03T06:50:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.082382 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.082534 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:24 crc kubenswrapper[4947]: E1203 06:50:24.082578 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.082382 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:24 crc kubenswrapper[4947]: E1203 06:50:24.082792 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:24 crc kubenswrapper[4947]: E1203 06:50:24.082989 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.089933 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.089970 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.089983 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.090000 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.090013 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:24Z","lastTransitionTime":"2025-12-03T06:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.192683 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.192766 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.192796 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.192824 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.192846 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:24Z","lastTransitionTime":"2025-12-03T06:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.295667 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.295732 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.295754 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.295782 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.295803 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:24Z","lastTransitionTime":"2025-12-03T06:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.398557 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.398616 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.398632 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.398655 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.398672 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:24Z","lastTransitionTime":"2025-12-03T06:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.501190 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.501246 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.501260 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.501278 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.501296 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:24Z","lastTransitionTime":"2025-12-03T06:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.605137 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.605208 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.605227 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.605252 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.605270 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:24Z","lastTransitionTime":"2025-12-03T06:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.708168 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.708226 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.708241 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.708259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.708271 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:24Z","lastTransitionTime":"2025-12-03T06:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.811835 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.811906 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.811923 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.811947 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.811965 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:24Z","lastTransitionTime":"2025-12-03T06:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.914904 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.914967 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.914994 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.915024 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:24 crc kubenswrapper[4947]: I1203 06:50:24.915046 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:24Z","lastTransitionTime":"2025-12-03T06:50:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.017885 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.017942 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.017960 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.017983 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.018000 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.082712 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:25 crc kubenswrapper[4947]: E1203 06:50:25.082916 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.121715 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.121777 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.121794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.121817 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.121835 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.225047 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.225149 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.225169 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.225201 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.225221 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.328004 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.328585 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.328598 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.328620 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.328632 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.430856 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.430923 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.430950 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.430978 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.430999 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.533131 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.533177 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.533188 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.533205 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.533216 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.636428 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.636470 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.636479 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.636512 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.636525 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.739298 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.739342 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.739353 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.739370 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.739380 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.841860 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.841906 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.841918 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.841934 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.841945 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.944783 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.944833 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.944844 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.944863 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:25 crc kubenswrapper[4947]: I1203 06:50:25.944877 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:25Z","lastTransitionTime":"2025-12-03T06:50:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.048321 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.048407 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.048430 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.048457 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.048482 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.082486 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.082658 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.082778 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:26 crc kubenswrapper[4947]: E1203 06:50:26.082895 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:26 crc kubenswrapper[4947]: E1203 06:50:26.082960 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:26 crc kubenswrapper[4947]: E1203 06:50:26.083437 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.150941 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.150971 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.150979 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.150993 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.151002 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.254672 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.254749 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.254772 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.254797 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.254848 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.357455 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.357545 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.357563 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.357588 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.357604 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.460023 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.460095 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.460240 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.460280 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.460301 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.562664 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.562705 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.562721 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.562740 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.562752 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.665607 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.665660 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.665673 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.665693 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.665712 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.767979 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.768025 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.768037 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.768057 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.768069 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.871644 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.871683 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.871691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.871704 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.871714 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.974998 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.975259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.975283 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.975315 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:26 crc kubenswrapper[4947]: I1203 06:50:26.975340 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:26Z","lastTransitionTime":"2025-12-03T06:50:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.077768 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.077804 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.077814 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.077827 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.077835 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:27Z","lastTransitionTime":"2025-12-03T06:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.082518 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:27 crc kubenswrapper[4947]: E1203 06:50:27.082693 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.181694 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.181758 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.181780 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.181805 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.181842 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:27Z","lastTransitionTime":"2025-12-03T06:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.284768 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.284828 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.284845 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.284871 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.284889 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:27Z","lastTransitionTime":"2025-12-03T06:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.387723 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.387774 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.387791 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.387817 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.387834 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:27Z","lastTransitionTime":"2025-12-03T06:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.491051 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.491115 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.491134 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.491166 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.491189 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:27Z","lastTransitionTime":"2025-12-03T06:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.593590 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.593628 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.593638 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.593652 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.593663 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:27Z","lastTransitionTime":"2025-12-03T06:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.696827 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.696904 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.696932 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.696970 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.696995 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:27Z","lastTransitionTime":"2025-12-03T06:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.799689 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.799757 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.799780 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.799810 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.799836 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:27Z","lastTransitionTime":"2025-12-03T06:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.902288 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.902330 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.902342 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.902357 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:27 crc kubenswrapper[4947]: I1203 06:50:27.902368 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:27Z","lastTransitionTime":"2025-12-03T06:50:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.006113 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.006163 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.006173 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.006188 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.006198 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.081964 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.082036 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.082036 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:28 crc kubenswrapper[4947]: E1203 06:50:28.082145 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:28 crc kubenswrapper[4947]: E1203 06:50:28.082293 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:28 crc kubenswrapper[4947]: E1203 06:50:28.082858 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.083135 4947 scope.go:117] "RemoveContainer" containerID="23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.108704 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.108755 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.108775 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.108798 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.108816 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.211925 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.211981 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.211992 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.212013 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.212026 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.315833 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.315877 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.315892 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.315911 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.315926 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.418742 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.419202 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.419214 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.419229 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.419241 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.522952 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.523038 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.523056 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.523082 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.523101 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.626064 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.626128 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.626142 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.626165 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.626178 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.729152 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.729196 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.729205 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.729223 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.729235 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.831995 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.832048 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.832065 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.832086 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.832102 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.888303 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/2.log" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.891222 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.891713 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.906557 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.935972 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.936026 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.936047 4947 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.936071 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.936088 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:28Z","lastTransitionTime":"2025-12-03T06:50:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.938756 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:00Z\\\",\\\"message\\\":\\\"3 06:49:59.992821 6636 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-operators for network=default are: map[]\\\\nI1203 06:49:59.992519 6636 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 06:49:59.992571 6636 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 
2025-08-2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:50:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.953782 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e53146b-6fe1-4fd0-a974-aad89ebb25ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e67e837d991d917faf834bdf4fc01bbec144ff7cd174f5c185e2813904beb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.970303 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:28 crc kubenswrapper[4947]: I1203 06:50:28.984383 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:50:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.002417 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.015828 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.033112 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube
-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.038866 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.038944 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.038970 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.039009 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.039035 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.048671 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.061857 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.079021 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.082090 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:29 crc kubenswrapper[4947]: E1203 06:50:29.082440 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.096254 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\"
:\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2
597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.111276 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.124852 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.139795 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.141665 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.141721 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.141741 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.141763 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.141780 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.162230 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3
4720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.181728 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"2025-12-03T06:49:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6\\\\n2025-12-03T06:49:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6 to /host/opt/cni/bin/\\\\n2025-12-03T06:49:23Z [verbose] multus-daemon started\\\\n2025-12-03T06:49:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:50:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:50:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.200898 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.217719 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.240336 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:00Z\\\",\\\"message\\\":\\\"3 06:49:59.992821 6636 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-operators for network=default are: map[]\\\\nI1203 06:49:59.992519 6636 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 06:49:59.992571 6636 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 
2025-08-2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:50:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.245999 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.246071 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.246095 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.246129 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.246154 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.257591 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e53146b-6fe1-4fd0-a974-aad89ebb25ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e67e837d991d917faf834bdf4fc01bbec144ff7cd174f5c185e2813904beb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.278257 4947 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.291317 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.307107 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.320568 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.335811 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.351899 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.351962 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.351985 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.352014 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.352037 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.361338 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.382011 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.397839 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.415827 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.434178 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.449850 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.456060 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.456138 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.456164 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.456195 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.456217 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.465590 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc 
kubenswrapper[4947]: I1203 06:50:29.481965 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.498963 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"2025-12-03T06:49:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6\\\\n2025-12-03T06:49:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6 to /host/opt/cni/bin/\\\\n2025-12-03T06:49:23Z [verbose] multus-daemon started\\\\n2025-12-03T06:49:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:50:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:50:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.515011 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.558543 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.558585 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.558596 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc 
kubenswrapper[4947]: I1203 06:50:29.558612 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.558624 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.661936 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.662001 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.662019 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.662045 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.662065 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.764907 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.765012 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.765030 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.765113 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.765132 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.867835 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.867881 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.867895 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.867914 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.867926 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.897833 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/3.log" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.898758 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/2.log" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.901966 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" exitCode=1 Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.902020 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.902082 4947 scope.go:117] "RemoveContainer" containerID="23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.903060 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 06:50:29 crc kubenswrapper[4947]: E1203 06:50:29.903374 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.928251 4947 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12
-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d
85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.946351 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-r
bac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.986091 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.986142 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.986157 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.986179 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.986194 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:29Z","lastTransitionTime":"2025-12-03T06:50:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:29 crc kubenswrapper[4947]: I1203 06:50:29.991763 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e53146b-6fe1-4fd0-a974-aad89ebb25ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e67e837d991d917faf834bdf4fc01bbec144ff7cd174f5c185e2813904beb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.010822 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.028624 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.048164 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.064758 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.082215 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.082279 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.082378 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:30 crc kubenswrapper[4947]: E1203 06:50:30.082536 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:30 crc kubenswrapper[4947]: E1203 06:50:30.082831 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:30 crc kubenswrapper[4947]: E1203 06:50:30.083057 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.087105 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.088691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.088735 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.088752 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.088771 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.088789 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:30Z","lastTransitionTime":"2025-12-03T06:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.100071 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.111289 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.126846 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.138549 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.149175 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.165122 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.178465 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"2025-12-03T06:49:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6\\\\n2025-12-03T06:49:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6 to /host/opt/cni/bin/\\\\n2025-12-03T06:49:23Z [verbose] multus-daemon started\\\\n2025-12-03T06:49:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:50:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:50:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.197561 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.197645 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.197668 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.197697 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.197717 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:30Z","lastTransitionTime":"2025-12-03T06:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.199574 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.219922 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.249400 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://23d57a511f5c213100e7583442af42780b696ad462969ce528166b565b0f98a5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:00Z\\\",\\\"message\\\":\\\"3 06:49:59.992821 6636 lb_config.go:1031] Cluster endpoints for openshift-marketplace/redhat-operators for network=default are: map[]\\\\nI1203 06:49:59.992519 6636 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert 
Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF1203 06:49:59.992571 6636 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:49:59Z is after 2025-08-2\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:29Z\\\",\\\"message\\\":\\\"services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"6ea1fd71-2b40-4361-92ee-3f1ab4ec7414\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.150\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:50:29.042295 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:50:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e
9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.301110 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.301173 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.301194 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.301221 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.301240 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:30Z","lastTransitionTime":"2025-12-03T06:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.403156 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.403225 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.403243 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.403267 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.403286 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:30Z","lastTransitionTime":"2025-12-03T06:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.506367 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.506587 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.506624 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.506654 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.506676 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:30Z","lastTransitionTime":"2025-12-03T06:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.609311 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.609393 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.609413 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.609443 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.609462 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:30Z","lastTransitionTime":"2025-12-03T06:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.713094 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.713158 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.713180 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.713206 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.713227 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:30Z","lastTransitionTime":"2025-12-03T06:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.816693 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.816774 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.816794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.816826 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.816844 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:30Z","lastTransitionTime":"2025-12-03T06:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.908694 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/3.log" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.913774 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 06:50:30 crc kubenswrapper[4947]: E1203 06:50:30.914026 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.919598 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.919647 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.919662 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.919681 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.919695 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:30Z","lastTransitionTime":"2025-12-03T06:50:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.932800 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.950948 4947 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e53146b-6fe1-4fd0-a974-aad89ebb25ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e67e837d991d917faf834bdf4fc01bbec144ff7cd174f5c185e2813904beb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.968909 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:30 crc kubenswrapper[4947]: I1203 06:50:30.984637 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:50:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.006669 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.022573 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.022686 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.022747 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.022779 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.022799 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.025468 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.047661 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.065292 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.081655 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.082271 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:31 crc kubenswrapper[4947]: E1203 06:50:31.082574 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.104381 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.124177 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.125886 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.126294 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.126331 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.126365 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.126389 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.147033 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.165212 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc 
kubenswrapper[4947]: I1203 06:50:31.188022 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.210655 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"2025-12-03T06:49:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6\\\\n2025-12-03T06:49:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6 to /host/opt/cni/bin/\\\\n2025-12-03T06:49:23Z [verbose] multus-daemon started\\\\n2025-12-03T06:49:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:50:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:50:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.228736 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.229038 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.229076 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.229091 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc 
kubenswrapper[4947]: I1203 06:50:31.229110 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.229122 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.246709 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.278122 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:29Z\\\",\\\"message\\\":\\\"services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"6ea1fd71-2b40-4361-92ee-3f1ab4ec7414\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.150\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:50:29.042295 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:50:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.331450 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.331566 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.331587 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.331610 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.331627 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.434757 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.434818 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.434838 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.434864 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.434887 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.538672 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.538761 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.538789 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.538818 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.538837 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.641954 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.641998 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.642023 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.642047 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.642061 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.744709 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.744776 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.744816 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.744851 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.744878 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.848330 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.848392 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.848415 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.848444 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.848465 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.952121 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.952678 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.952784 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.952910 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:31 crc kubenswrapper[4947]: I1203 06:50:31.953011 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:31Z","lastTransitionTime":"2025-12-03T06:50:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.056105 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.056533 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.056633 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.056777 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.056870 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.082895 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.082936 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.082915 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:32 crc kubenswrapper[4947]: E1203 06:50:32.083113 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:32 crc kubenswrapper[4947]: E1203 06:50:32.083200 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:32 crc kubenswrapper[4947]: E1203 06:50:32.083308 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.160553 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.160604 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.160615 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.160633 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.160701 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.264055 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.264404 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.264483 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.264658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.264766 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.367844 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.368612 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.368633 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.368651 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.368662 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.471699 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.471779 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.471800 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.471825 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.471844 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.575451 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.575565 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.575585 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.575618 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.575639 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.679695 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.679775 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.679794 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.679820 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.679846 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.783692 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.783771 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.783790 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.783822 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.783841 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.887811 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.887892 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.887914 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.887960 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.887998 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.991762 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.991857 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.991885 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.991922 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:32 crc kubenswrapper[4947]: I1203 06:50:32.991949 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:32Z","lastTransitionTime":"2025-12-03T06:50:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.082417 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:33 crc kubenswrapper[4947]: E1203 06:50:33.082755 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.094748 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.094811 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.094830 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.094854 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.094872 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.198862 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.198945 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.198968 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.199011 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.199036 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.301727 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.301772 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.301780 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.301798 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.301810 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.405018 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.405065 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.405074 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.405090 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.405100 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.508168 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.508241 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.508253 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.508276 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.508291 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.611029 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.611081 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.611095 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.611114 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.611125 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.713848 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.713913 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.713931 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.713962 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.713980 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.816941 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.817005 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.817020 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.817043 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.817055 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.847181 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.847241 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.847259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.847285 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.847302 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: E1203 06:50:33.866519 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.872339 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.872389 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.872430 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.872456 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.872472 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: E1203 06:50:33.890568 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.899432 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.899648 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.899678 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.899715 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.899814 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: E1203 06:50:33.919835 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.924869 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.924998 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.925079 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.925162 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.925236 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: E1203 06:50:33.939122 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.943629 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.943764 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.943869 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.943977 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.944063 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:33 crc kubenswrapper[4947]: E1203 06:50:33.959093 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:33 crc kubenswrapper[4947]: E1203 06:50:33.959249 4947 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.961243 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.961282 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.961299 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.961319 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:33 crc kubenswrapper[4947]: I1203 06:50:33.961332 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:33Z","lastTransitionTime":"2025-12-03T06:50:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.064894 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.064957 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.064975 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.065050 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.065070 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.125296 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:34 crc kubenswrapper[4947]: E1203 06:50:34.125536 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.126137 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:34 crc kubenswrapper[4947]: E1203 06:50:34.126376 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.126531 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:34 crc kubenswrapper[4947]: E1203 06:50:34.126672 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.167988 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.168029 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.168038 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.168053 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.168063 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.271365 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.271432 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.271457 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.271486 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.271541 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.374338 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.374379 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.374390 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.374405 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.374414 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.477265 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.477322 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.477345 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.477368 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.477386 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.580710 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.580764 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.580779 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.580802 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.580817 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.684096 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.684679 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.684815 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.684917 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.684984 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.788191 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.788653 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.788730 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.788817 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.788893 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.892043 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.892366 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.892435 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.892537 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.892626 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.996327 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.996400 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.996422 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.996453 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:34 crc kubenswrapper[4947]: I1203 06:50:34.996476 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:34Z","lastTransitionTime":"2025-12-03T06:50:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.082147 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:35 crc kubenswrapper[4947]: E1203 06:50:35.082924 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.100778 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.100853 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.100880 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.100914 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.100941 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:35Z","lastTransitionTime":"2025-12-03T06:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.103938 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.204656 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.204715 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.204734 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.204760 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.204798 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:35Z","lastTransitionTime":"2025-12-03T06:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.307168 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.307196 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.307204 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.307217 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.307225 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:35Z","lastTransitionTime":"2025-12-03T06:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.412884 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.412977 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.413006 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.413035 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.413065 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:35Z","lastTransitionTime":"2025-12-03T06:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.515935 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.516311 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.516453 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.516740 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.516987 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:35Z","lastTransitionTime":"2025-12-03T06:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.620682 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.620752 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.620775 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.620802 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.620822 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:35Z","lastTransitionTime":"2025-12-03T06:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.723384 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.723456 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.723477 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.723557 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.723615 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:35Z","lastTransitionTime":"2025-12-03T06:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.826206 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.826274 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.826297 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.826328 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.826349 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:35Z","lastTransitionTime":"2025-12-03T06:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.929993 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.930028 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.930036 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.930051 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:35 crc kubenswrapper[4947]: I1203 06:50:35.930060 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:35Z","lastTransitionTime":"2025-12-03T06:50:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.033360 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.033419 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.033427 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.033442 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.033451 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.082405 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.082480 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:36 crc kubenswrapper[4947]: E1203 06:50:36.082603 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:36 crc kubenswrapper[4947]: E1203 06:50:36.082687 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.082812 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:36 crc kubenswrapper[4947]: E1203 06:50:36.082924 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.137046 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.137123 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.137145 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.137172 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.137191 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.239583 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.239626 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.239635 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.239649 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.239657 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.342402 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.342449 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.342459 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.342476 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.342490 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.446359 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.446439 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.446456 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.446561 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.446592 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.550081 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.550146 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.550164 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.550189 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.550206 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.654799 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.654858 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.654878 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.654905 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.654924 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.757605 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.757663 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.757681 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.757704 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.757719 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.859300 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.859343 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.859351 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.859364 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.859373 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.962837 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.962878 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.962887 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.962903 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:36 crc kubenswrapper[4947]: I1203 06:50:36.962913 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:36Z","lastTransitionTime":"2025-12-03T06:50:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.066664 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.067274 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.067446 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.067556 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.067580 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.082281 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:37 crc kubenswrapper[4947]: E1203 06:50:37.082761 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.170690 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.170759 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.170777 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.170804 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.170824 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.273829 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.273881 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.273892 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.273910 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.273921 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.377030 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.377097 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.377121 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.377147 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.377165 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.479830 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.479896 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.479914 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.479938 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.479958 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.583619 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.583679 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.583696 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.583719 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.583737 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.686480 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.686602 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.686622 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.686809 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.686834 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.790634 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.790748 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.790777 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.790802 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.790821 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.894055 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.894123 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.894142 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.894167 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.894186 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.997800 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.997845 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.997858 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.997877 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:37 crc kubenswrapper[4947]: I1203 06:50:37.997888 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:37Z","lastTransitionTime":"2025-12-03T06:50:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.083013 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.083115 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:38 crc kubenswrapper[4947]: E1203 06:50:38.083192 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.083115 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:38 crc kubenswrapper[4947]: E1203 06:50:38.083357 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:38 crc kubenswrapper[4947]: E1203 06:50:38.083614 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.100419 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.100478 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.100514 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.100537 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.100553 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:38Z","lastTransitionTime":"2025-12-03T06:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.203995 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.204071 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.204091 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.204114 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.204132 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:38Z","lastTransitionTime":"2025-12-03T06:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.306971 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.307014 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.307030 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.307051 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.307070 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:38Z","lastTransitionTime":"2025-12-03T06:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.410200 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.410297 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.410315 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.410353 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.410372 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:38Z","lastTransitionTime":"2025-12-03T06:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.513889 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.513938 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.513953 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.513973 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.513986 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:38Z","lastTransitionTime":"2025-12-03T06:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.617302 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.617364 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.617383 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.617410 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.617427 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:38Z","lastTransitionTime":"2025-12-03T06:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.720463 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.720572 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.720595 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.720629 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.720651 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:38Z","lastTransitionTime":"2025-12-03T06:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.823401 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.823461 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.823480 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.823547 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.823564 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:38Z","lastTransitionTime":"2025-12-03T06:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.926054 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.926125 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.926144 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.926168 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:38 crc kubenswrapper[4947]: I1203 06:50:38.926187 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:38Z","lastTransitionTime":"2025-12-03T06:50:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.029530 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.029616 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.029642 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.029677 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.029702 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.082625 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:39 crc kubenswrapper[4947]: E1203 06:50:39.082817 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.105477 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.135732 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.135787 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.135803 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.135826 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.135845 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.139919 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19542618-7a4e-44bc-9297-9931dcc41eea\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:29Z\\\",\\\"message\\\":\\\"services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"6ea1fd71-2b40-4361-92ee-3f1ab4ec7414\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-authentication-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-authentication-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.150\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 06:50:29.042295 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:50:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d45b3e55b5826b695
b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fqnqr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-pt9n6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.158815 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e53146b-6fe1-4fd0-a974-aad89ebb25ae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e67e837d991d917faf834bdf4fc01bbec144ff7cd174f5c185e2813904beb2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a25512c24daabdca2a064cdb9f11c53a400d51b5493bb43cb7af721eef6820ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.178279 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.198185 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20dfc9a65fb7449c2e62fa9b475f5fc98fc1f7411594fd009ba8f1754e60284c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.219059 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"12b4de70-50e1-40ba-836d-41c953077b50\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec41756c776112934948b714b6461f2b103e064b7dd93ab7a6afe35b6eed9b94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f05af5dfdca1fc75c47907a919f97fd67ec6bf858a64c88f4794226a6870d812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32c4f0e1deada86d941c10768adc2a15a84e076bd0e728e0d114264cf1f29b07\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5288f2e709d959204d3b8c2fc96a6f09f78d74def05f00724dbd2fca9b4cb9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5d7ae3ad39d33d39ed8123fb7eba237003535ccee0c9e16bcf36d64836ad41c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f53fd4e7b4b02cc13738aec306c57bdc46edc244972614c070b2d85ec7a19ec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be3930c7353b23cdac42c77919d5e7bc18950572706c2c3116825cfc91dc3744\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lgflr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mc5l9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.235136 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d0c717-13a1-4c19-9af2-0dd9805ad606\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12751cf58cf28bbb1c80e0f812afa88932e066c4266d386e60e3853b5bb8c060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://709ce1febf81d6950f1da93243c30412e5c34208680a47a141e7908e7fa4b9bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zgzvb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:35Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sx6lv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.239372 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.239565 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.239591 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.239613 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.239633 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.258984 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-g8hh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0cbf0ab-15d8-4ec1-b889-c31a347923f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://84d4179fe60710952262f716f16718bc597021d0dc7c6e8113af9accf90378d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf2kv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-g8hh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.274172 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w866n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"150157a8-5a65-4142-9088-0ab46998fc9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4515cdae1ae81ce642892ec61f2395c43948bba4d84954e0a6175a6b99d6c1d\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d2mdn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:24Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w866n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.299796 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cd79691-d6e6-40e2-8ebf-2d39e64688e7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7762a0aed7586e18fe4e69e884ee4fe9348761b7845c774ff90bfb8095c16819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8ed45e16ee5d22d7bd0cb336c843935b31f8f86ecddb4a57b83b150aec42f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf76d19cc298f94b47e42dc99e4d48bb8955348586e3c39ed1fa7be9f02d3c6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://367ed1505ea1722051c7d96a913df20f97b4bce170b1c4b35372c8af6a24f9bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9214e3b77d8eec03a770d8944f9822ead347e4b6be2cc8cf298344dfd3be15f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d474285ad8de913716408fee87d9f1b033e7675b8ad9f6eabc8c4e240db800e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d474285ad8de913716408fee87d9f1b033e7675b8ad9f6eabc8c4e240db800e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad721644e4e7ccf3bd711c5934ad7d964a9f39c3125c6762ec4319a1a6a31ab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad721644e4e7ccf3bd711c5934ad7d964a9f39c3125c6762ec4319a1a6a31ab3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cf86847a1fef617f426feb864faaceb5e48e76addd7dc5d6f3045bbb8b710b3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cf86847a1fef617f426feb864faaceb5e48e76addd7dc5d6f3045bbb8b710b3f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.318342 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01e38a92-da7c-42f2-9fd0-97c3229888a9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6243b39c8f479a33b4156623c18d9044680b4409a517298ec1f344438e78a562\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90814adef3b98c329411a6ace8ea5aa4263963f9eb512eecb26b4110f3e35014\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:
00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98ba0ac3ebedc3f71a17f4271100363f051e0276b450d20bd6b2deed42e16b53\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.339759 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5f88e233-6c52-4f7c-84ef-76b9304d0d4b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374d2c4f269c4e0196cca01c715fbf3d1c1be24b9a7f70471b79122aa9464fc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b953b6cbceff7ac8069d03ad77bd564b1661930712c29814a4605af39435cda8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3085cc1aca7b221925ab104f657c06486df91695b0c4a0ab6c019dce49be5646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://173e7d2c398c8af7e1adf76e5f6b11b6690b8e79092887529dfe7f21749e5c85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.344675 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.344751 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.344780 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.344813 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.344839 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.361226 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c7ec99498ee26276a5e309a70c3645b62072d3d50049326db3987855b311b04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.373772 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.388456 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eea3bc2b7417bd6387c45105f12453bd77a51f0b48737f9fdcfe767cb152dcda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8487065314ae353d3460efdfb04e46f473359cc599e394ebfdd6df478503b2ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.400512 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-cz948" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8dd41826-cef5-42f7-8730-abc792b9337c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:36Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jtjjg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:36Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-cz948\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc 
kubenswrapper[4947]: I1203 06:50:39.412829 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eefa93f4-4ae4-4da7-ba41-c19097fb2352\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6ae242c6948f5
54a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:49:17Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:49:11.553397 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:49:11.555259 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3814766516/tls.crt::/tmp/serving-cert-3814766516/tls.key\\\\\\\"\\\\nI1203 06:49:17.512007 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:49:17.516835 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:49:17.516881 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:49:17.516952 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:49:17.516972 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:49:17.526329 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 06:49:17.526451 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526503 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:49:17.526550 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:49:17.526626 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nI1203 06:49:17.526376 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW1203 06:49:17.526658 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:49:17.526741 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:49:17.529865 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:01Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:00Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:48:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:48:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:48:59Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.426502 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-97tnc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1c90ac94-365a-4c82-b72a-41129d95a39e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:50:08Z\\\",\\\"message\\\":\\\"2025-12-03T06:49:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6\\\\n2025-12-03T06:49:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_1eceb3a6-2476-4ad6-8dc2-3aff0be327c6 to /host/opt/cni/bin/\\\\n2025-12-03T06:49:23Z [verbose] multus-daemon started\\\\n2025-12-03T06:49:23Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:50:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:50:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lsb2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-97tnc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.441758 4947 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8384efba-0256-458d-8aab-627ad76e643e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:49:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://061e278378788d198faf86cb4475c48e957b552e2066a970df8b7f7da7eafbe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54
060a242f9f2d3ad7d8e66c7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:49:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h6s4r\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:49:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qv8tj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:39Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.448165 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.448216 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.448226 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc 
kubenswrapper[4947]: I1203 06:50:39.448245 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.448256 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.550933 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.551014 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.551030 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.551052 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.551067 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.654163 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.654230 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.654254 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.654284 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.654311 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.757711 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.757839 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.757853 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.757877 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.757895 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.860571 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.860628 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.860645 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.860670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.860692 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.963439 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.963550 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.963574 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.963603 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:39 crc kubenswrapper[4947]: I1203 06:50:39.963625 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:39Z","lastTransitionTime":"2025-12-03T06:50:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.067011 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.067091 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.067112 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.067145 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.067165 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.082480 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.082563 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.082480 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:40 crc kubenswrapper[4947]: E1203 06:50:40.082671 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:40 crc kubenswrapper[4947]: E1203 06:50:40.082810 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:40 crc kubenswrapper[4947]: E1203 06:50:40.082956 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.169828 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.169876 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.169892 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.169915 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.169933 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.273124 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.273183 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.273197 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.273216 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.273232 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.376924 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.377043 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.377070 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.377101 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.377123 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.480414 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.480478 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.480535 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.480563 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.480582 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.499036 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:40 crc kubenswrapper[4947]: E1203 06:50:40.499219 4947 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:50:40 crc kubenswrapper[4947]: E1203 06:50:40.499335 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs podName:8dd41826-cef5-42f7-8730-abc792b9337c nodeName:}" failed. No retries permitted until 2025-12-03 06:51:44.499308776 +0000 UTC m=+165.760263232 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs") pod "network-metrics-daemon-cz948" (UID: "8dd41826-cef5-42f7-8730-abc792b9337c") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.583894 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.583974 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.584000 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.584029 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.584050 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.687997 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.688052 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.688061 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.688081 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.688093 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.791358 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.791417 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.791434 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.791458 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.791475 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.894709 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.894790 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.894814 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.894844 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.894866 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.998091 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.998161 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.998172 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.998194 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:40 crc kubenswrapper[4947]: I1203 06:50:40.998207 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:40Z","lastTransitionTime":"2025-12-03T06:50:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.082631 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:41 crc kubenswrapper[4947]: E1203 06:50:41.082832 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.101095 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.101168 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.101193 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.101222 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.101243 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:41Z","lastTransitionTime":"2025-12-03T06:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.204422 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.204485 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.204545 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.204574 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.204598 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:41Z","lastTransitionTime":"2025-12-03T06:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.309282 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.309347 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.309364 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.309389 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.309406 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:41Z","lastTransitionTime":"2025-12-03T06:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.412893 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.412978 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.413001 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.413147 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.413177 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:41Z","lastTransitionTime":"2025-12-03T06:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.516811 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.517096 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.517136 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.517167 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.517189 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:41Z","lastTransitionTime":"2025-12-03T06:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.620330 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.620420 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.620433 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.620476 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.620533 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:41Z","lastTransitionTime":"2025-12-03T06:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.723859 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.723920 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.723945 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.723975 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.723998 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:41Z","lastTransitionTime":"2025-12-03T06:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.827151 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.827205 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.827234 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.827259 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.827276 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:41Z","lastTransitionTime":"2025-12-03T06:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.929869 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.929923 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.929944 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.929966 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:41 crc kubenswrapper[4947]: I1203 06:50:41.929982 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:41Z","lastTransitionTime":"2025-12-03T06:50:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.032710 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.032783 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.032815 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.032843 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.032863 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.082541 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.082637 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:42 crc kubenswrapper[4947]: E1203 06:50:42.082665 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:42 crc kubenswrapper[4947]: E1203 06:50:42.082808 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.082950 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:42 crc kubenswrapper[4947]: E1203 06:50:42.083081 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.140034 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.140100 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.140125 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.140154 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.140177 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.243081 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.243147 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.243173 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.243207 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.243226 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.346643 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.346752 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.346771 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.346835 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.346854 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.449722 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.449790 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.449807 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.449831 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.449852 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.552637 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.552715 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.552726 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.552744 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.552756 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.656209 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.656257 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.656268 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.656288 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.656298 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.759549 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.759609 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.759625 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.759650 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.759666 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.863384 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.863465 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.863530 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.863568 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.863591 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.967647 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.967713 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.967736 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.967766 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:42 crc kubenswrapper[4947]: I1203 06:50:42.967789 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:42Z","lastTransitionTime":"2025-12-03T06:50:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.076395 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.077571 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.077640 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.077672 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.077690 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:43Z","lastTransitionTime":"2025-12-03T06:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.082292 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:43 crc kubenswrapper[4947]: E1203 06:50:43.082560 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.181123 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.181177 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.181220 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.181271 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.181288 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:43Z","lastTransitionTime":"2025-12-03T06:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.283545 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.283607 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.283627 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.283703 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.283723 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:43Z","lastTransitionTime":"2025-12-03T06:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.386480 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.386541 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.386550 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.386563 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.386575 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:43Z","lastTransitionTime":"2025-12-03T06:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.490429 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.490511 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.490527 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.490551 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.490603 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:43Z","lastTransitionTime":"2025-12-03T06:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.594163 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.594226 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.594243 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.594268 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.594284 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:43Z","lastTransitionTime":"2025-12-03T06:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.698691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.698761 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.698778 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.698803 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.698822 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:43Z","lastTransitionTime":"2025-12-03T06:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.802723 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.802792 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.802808 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.802835 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.802855 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:43Z","lastTransitionTime":"2025-12-03T06:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.905757 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.905829 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.905847 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.905874 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:43 crc kubenswrapper[4947]: I1203 06:50:43.905894 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:43Z","lastTransitionTime":"2025-12-03T06:50:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.009945 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.010011 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.010029 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.010056 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.010074 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.082642 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.082697 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.082697 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:44 crc kubenswrapper[4947]: E1203 06:50:44.082829 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:44 crc kubenswrapper[4947]: E1203 06:50:44.083016 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:44 crc kubenswrapper[4947]: E1203 06:50:44.083155 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.113148 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.113212 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.113235 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.113270 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.113291 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.163682 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.163814 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.163834 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.163860 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.163878 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: E1203 06:50:44.181918 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.187583 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.187677 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.187702 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.187733 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.187755 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: E1203 06:50:44.206299 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.211415 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.211471 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.211522 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.211548 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.211566 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: E1203 06:50:44.232250 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.238010 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.238058 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.238077 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.238100 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.238116 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: E1203 06:50:44.258823 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.263995 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.264317 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.264486 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.264664 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.264831 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: E1203 06:50:44.282550 4947 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:50:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c3feb53c-ad63-4e2a-a506-87d0a873eb31\\\",\\\"systemUUID\\\":\\\"0621c3b3-388e-465f-be28-bb0fb0b39611\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:50:44Z is after 2025-08-24T17:21:41Z" Dec 03 06:50:44 crc kubenswrapper[4947]: E1203 06:50:44.282810 4947 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.285061 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.285120 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.285140 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.285165 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.285184 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.388376 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.388439 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.388458 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.388484 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.388532 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.491282 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.491320 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.491329 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.491344 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.491356 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.594274 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.594318 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.594329 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.594345 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.594356 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.698018 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.698080 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.698098 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.698122 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.698169 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.808139 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.808228 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.808252 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.808321 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.808347 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.911224 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.911269 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.911280 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.911296 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:44 crc kubenswrapper[4947]: I1203 06:50:44.911306 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:44Z","lastTransitionTime":"2025-12-03T06:50:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.015191 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.015287 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.015312 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.015339 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.015362 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.082433 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:45 crc kubenswrapper[4947]: E1203 06:50:45.082736 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.119431 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.119550 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.119617 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.119654 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.119679 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.223107 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.223187 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.223220 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.223253 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.223274 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.326133 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.326297 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.326327 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.326392 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.326426 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.429997 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.430081 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.430100 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.430129 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.430150 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.533619 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.533693 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.533711 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.533735 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.533752 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.636670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.636721 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.636736 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.636760 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.636779 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.739471 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.739610 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.739649 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.739693 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.739720 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.842348 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.842411 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.842428 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.842455 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.842473 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.946227 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.946284 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.946299 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.946317 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:45 crc kubenswrapper[4947]: I1203 06:50:45.946329 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:45Z","lastTransitionTime":"2025-12-03T06:50:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.049079 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.049116 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.049125 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.049138 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.049147 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.082537 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.082562 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.082720 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:46 crc kubenswrapper[4947]: E1203 06:50:46.082895 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:46 crc kubenswrapper[4947]: E1203 06:50:46.083029 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:46 crc kubenswrapper[4947]: E1203 06:50:46.083662 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.084019 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 06:50:46 crc kubenswrapper[4947]: E1203 06:50:46.084365 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.151977 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.152044 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.152066 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.152096 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.152121 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.255268 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.255329 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.255343 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.255367 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.255382 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.357658 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.357727 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.357754 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.357785 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.357810 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.461565 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.461627 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.461645 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.461668 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.461685 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.565149 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.565215 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.565233 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.565258 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.565275 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.667925 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.667975 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.667984 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.668005 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.668018 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.771539 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.771598 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.771616 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.771646 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.771664 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.875097 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.875199 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.875220 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.875246 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.875265 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.977439 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.977539 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.977557 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.977584 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:46 crc kubenswrapper[4947]: I1203 06:50:46.977601 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:46Z","lastTransitionTime":"2025-12-03T06:50:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.080851 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.081637 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.081677 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.081707 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.081724 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:47Z","lastTransitionTime":"2025-12-03T06:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.081957 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:47 crc kubenswrapper[4947]: E1203 06:50:47.082440 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.185217 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.185280 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.185297 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.185321 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.185340 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:47Z","lastTransitionTime":"2025-12-03T06:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.288623 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.288686 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.288705 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.288735 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.288753 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:47Z","lastTransitionTime":"2025-12-03T06:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.391877 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.391932 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.391947 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.391962 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.391975 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:47Z","lastTransitionTime":"2025-12-03T06:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.493948 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.494007 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.494026 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.494051 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.494118 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:47Z","lastTransitionTime":"2025-12-03T06:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.598038 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.598121 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.598144 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.598168 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.598185 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:47Z","lastTransitionTime":"2025-12-03T06:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.701453 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.701552 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.701582 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.701610 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.701627 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:47Z","lastTransitionTime":"2025-12-03T06:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.804060 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.804117 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.804133 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.804157 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.804173 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:47Z","lastTransitionTime":"2025-12-03T06:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.906774 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.906807 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.906817 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.906832 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:47 crc kubenswrapper[4947]: I1203 06:50:47.906842 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:47Z","lastTransitionTime":"2025-12-03T06:50:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.009654 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.009709 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.009723 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.009740 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.009752 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.082376 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.082400 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:48 crc kubenswrapper[4947]: E1203 06:50:48.082529 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.082431 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:48 crc kubenswrapper[4947]: E1203 06:50:48.082683 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:48 crc kubenswrapper[4947]: E1203 06:50:48.082846 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.112149 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.112222 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.112245 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.112280 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.112303 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.214717 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.214762 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.214774 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.214790 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.214803 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.317968 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.318049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.318070 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.318096 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.318116 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.424073 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.424167 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.424180 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.424219 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.424231 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.526734 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.526797 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.526814 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.526838 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.526856 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.630295 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.630373 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.630390 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.630412 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.630430 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.735278 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.735353 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.735375 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.735398 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.735415 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.838402 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.838459 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.838477 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.838539 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.838559 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.941884 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.941982 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.941995 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.942012 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:48 crc kubenswrapper[4947]: I1203 06:50:48.942022 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:48Z","lastTransitionTime":"2025-12-03T06:50:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.045680 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.045792 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.045820 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.045850 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.045875 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.083363 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:49 crc kubenswrapper[4947]: E1203 06:50:49.083561 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.136104 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=91.13608154 podStartE2EDuration="1m31.13608154s" podCreationTimestamp="2025-12-03 06:49:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.113015439 +0000 UTC m=+110.373969905" watchObservedRunningTime="2025-12-03 06:50:49.13608154 +0000 UTC m=+110.397035976" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.153796 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-97tnc" podStartSLOduration=88.153767325 podStartE2EDuration="1m28.153767325s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.136277466 +0000 UTC m=+110.397231892" watchObservedRunningTime="2025-12-03 06:50:49.153767325 +0000 UTC m=+110.414721792" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.155209 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.155264 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.155283 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.155309 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 
06:50:49.155328 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.184641 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podStartSLOduration=88.184611181 podStartE2EDuration="1m28.184611181s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.155172194 +0000 UTC m=+110.416126640" watchObservedRunningTime="2025-12-03 06:50:49.184611181 +0000 UTC m=+110.445565607" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.259001 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.259049 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.259063 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.259085 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.259097 4947 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.259714 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sx6lv" podStartSLOduration=87.259694968 podStartE2EDuration="1m27.259694968s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.246309111 +0000 UTC m=+110.507263537" watchObservedRunningTime="2025-12-03 06:50:49.259694968 +0000 UTC m=+110.520649394" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.279729 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=29.279709687 podStartE2EDuration="29.279709687s" podCreationTimestamp="2025-12-03 06:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.260694975 +0000 UTC m=+110.521649391" watchObservedRunningTime="2025-12-03 06:50:49.279709687 +0000 UTC m=+110.540664113" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.330744 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mc5l9" podStartSLOduration=88.330723614 podStartE2EDuration="1m28.330723614s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 06:50:49.314139599 +0000 UTC m=+110.575094025" watchObservedRunningTime="2025-12-03 06:50:49.330723614 +0000 UTC m=+110.591678040" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.359254 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g8hh7" podStartSLOduration=88.359236156 podStartE2EDuration="1m28.359236156s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.358514106 +0000 UTC m=+110.619468532" watchObservedRunningTime="2025-12-03 06:50:49.359236156 +0000 UTC m=+110.620190582" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.363246 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.363287 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.363299 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.363315 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.363328 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.371835 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-w866n" podStartSLOduration=88.371815361 podStartE2EDuration="1m28.371815361s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.371142592 +0000 UTC m=+110.632097018" watchObservedRunningTime="2025-12-03 06:50:49.371815361 +0000 UTC m=+110.632769787" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.397977 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=14.397961596 podStartE2EDuration="14.397961596s" podCreationTimestamp="2025-12-03 06:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.396169897 +0000 UTC m=+110.657124333" watchObservedRunningTime="2025-12-03 06:50:49.397961596 +0000 UTC m=+110.658916022" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.416679 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=88.416662599 podStartE2EDuration="1m28.416662599s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.415791245 +0000 UTC m=+110.676745671" watchObservedRunningTime="2025-12-03 06:50:49.416662599 +0000 UTC m=+110.677617025" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.448224 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=57.448203613 
podStartE2EDuration="57.448203613s" podCreationTimestamp="2025-12-03 06:49:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.431447674 +0000 UTC m=+110.692402100" watchObservedRunningTime="2025-12-03 06:50:49.448203613 +0000 UTC m=+110.709158049" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.465772 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.465818 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.465829 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.465846 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.465859 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.569323 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.569381 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.569398 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.569422 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.569441 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.672718 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.672753 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.672765 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.672782 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.672798 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.774775 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.774843 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.774868 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.774899 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.774920 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.882722 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.882792 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.882817 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.882897 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.882923 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.985961 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.986020 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.986034 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.986053 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:49 crc kubenswrapper[4947]: I1203 06:50:49.986065 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:49Z","lastTransitionTime":"2025-12-03T06:50:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.082909 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.082996 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.083053 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:50 crc kubenswrapper[4947]: E1203 06:50:50.083124 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:50 crc kubenswrapper[4947]: E1203 06:50:50.083259 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:50 crc kubenswrapper[4947]: E1203 06:50:50.083433 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.089636 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.089718 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.089741 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.089762 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.089778 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:50Z","lastTransitionTime":"2025-12-03T06:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.192392 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.192449 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.192469 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.192521 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.192540 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:50Z","lastTransitionTime":"2025-12-03T06:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.295953 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.296037 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.296067 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.296101 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.296123 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:50Z","lastTransitionTime":"2025-12-03T06:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.399600 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.399674 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.399697 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.399724 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.399744 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:50Z","lastTransitionTime":"2025-12-03T06:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.505155 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.505352 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.505387 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.505435 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.505458 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:50Z","lastTransitionTime":"2025-12-03T06:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.608630 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.608670 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.608681 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.608699 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.608713 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:50Z","lastTransitionTime":"2025-12-03T06:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.711476 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.711554 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.711573 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.711595 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.711611 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:50Z","lastTransitionTime":"2025-12-03T06:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.815080 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.815623 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.815649 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.815684 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.815703 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:50Z","lastTransitionTime":"2025-12-03T06:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.919241 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.919324 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.919345 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.919368 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:50 crc kubenswrapper[4947]: I1203 06:50:50.919409 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:50Z","lastTransitionTime":"2025-12-03T06:50:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.022534 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.022572 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.022586 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.022602 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.022614 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.082906 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:51 crc kubenswrapper[4947]: E1203 06:50:51.083191 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.125467 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.125584 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.125608 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.125639 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.125686 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.230152 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.230221 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.230239 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.230270 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.230294 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.333881 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.333942 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.333954 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.333974 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.333988 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.437166 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.437208 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.437226 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.437242 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.437251 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.540367 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.540531 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.540562 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.540585 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.540605 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.643957 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.644026 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.644048 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.644077 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.644098 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.748106 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.748190 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.748214 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.748245 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.748278 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.852384 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.852480 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.852563 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.852588 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.852607 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.957238 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.957326 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.957350 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.957380 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:51 crc kubenswrapper[4947]: I1203 06:50:51.957403 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:51Z","lastTransitionTime":"2025-12-03T06:50:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.061028 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.061102 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.061124 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.061156 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.061177 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.083021 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.083100 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.083100 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:52 crc kubenswrapper[4947]: E1203 06:50:52.083285 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:52 crc kubenswrapper[4947]: E1203 06:50:52.083408 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:52 crc kubenswrapper[4947]: E1203 06:50:52.083593 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.164611 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.164669 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.164686 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.164709 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.164725 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.267852 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.267958 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.267976 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.268046 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.268063 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.370959 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.371038 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.371059 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.371089 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.371140 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.473536 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.473617 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.473637 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.473665 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.473684 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.577017 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.577295 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.577412 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.577526 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.577615 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.681075 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.681145 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.681165 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.681209 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.681230 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.784613 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.784659 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.784672 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.784691 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.784703 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.887823 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.887874 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.887895 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.887920 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.887938 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.990895 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.990950 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.990967 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.990989 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:52 crc kubenswrapper[4947]: I1203 06:50:52.991008 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:52Z","lastTransitionTime":"2025-12-03T06:50:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.082662 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:53 crc kubenswrapper[4947]: E1203 06:50:53.083025 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.093436 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.093578 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.093659 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.093765 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.093887 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:53Z","lastTransitionTime":"2025-12-03T06:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.196818 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.197678 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.197728 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.197758 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.197777 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:53Z","lastTransitionTime":"2025-12-03T06:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.300959 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.301032 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.301056 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.301084 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.301104 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:53Z","lastTransitionTime":"2025-12-03T06:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.404069 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.404151 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.404169 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.404197 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.404217 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:53Z","lastTransitionTime":"2025-12-03T06:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.507558 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.507659 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.507678 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.507747 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.507777 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:53Z","lastTransitionTime":"2025-12-03T06:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.611002 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.611075 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.611092 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.611119 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.611137 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:53Z","lastTransitionTime":"2025-12-03T06:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.713101 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.713144 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.713155 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.713173 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.713185 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:53Z","lastTransitionTime":"2025-12-03T06:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.815902 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.815959 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.815976 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.815996 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.816010 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:53Z","lastTransitionTime":"2025-12-03T06:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.919179 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.919235 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.919252 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.919275 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:53 crc kubenswrapper[4947]: I1203 06:50:53.919289 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:53Z","lastTransitionTime":"2025-12-03T06:50:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.022829 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.022902 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.022926 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.022957 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.022978 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:54Z","lastTransitionTime":"2025-12-03T06:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.082275 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.082323 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:54 crc kubenswrapper[4947]: E1203 06:50:54.082455 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.082282 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:54 crc kubenswrapper[4947]: E1203 06:50:54.082706 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:54 crc kubenswrapper[4947]: E1203 06:50:54.082864 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.126820 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.126909 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.126940 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.126968 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.126988 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:54Z","lastTransitionTime":"2025-12-03T06:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.231736 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.231818 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.231844 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.231877 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.231900 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:54Z","lastTransitionTime":"2025-12-03T06:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.335926 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.336016 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.336064 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.336091 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.336109 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:54Z","lastTransitionTime":"2025-12-03T06:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.440002 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.440144 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.440306 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.440344 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.442642 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:54Z","lastTransitionTime":"2025-12-03T06:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.524349 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.524407 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.524427 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.524451 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.524468 4947 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:50:54Z","lastTransitionTime":"2025-12-03T06:50:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.598732 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd"] Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.599272 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.602291 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.603183 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.603483 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.603639 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.662102 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.662218 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.662727 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.662795 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.663074 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.764410 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.764688 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.764859 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.764960 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.765084 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.765175 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.765260 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.766332 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.774747 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.798590 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5eef4f87-b571-4b69-8e03-bdc0fd9c42dc-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ff7zd\" (UID: \"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:54 crc kubenswrapper[4947]: I1203 06:50:54.922486 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" Dec 03 06:50:55 crc kubenswrapper[4947]: I1203 06:50:55.002678 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" event={"ID":"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc","Type":"ContainerStarted","Data":"2e4811a90453534e9ef9b35e7423d56392db17129e499a59fb609712436d8bc3"} Dec 03 06:50:55 crc kubenswrapper[4947]: I1203 06:50:55.004408 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/1.log" Dec 03 06:50:55 crc kubenswrapper[4947]: I1203 06:50:55.005819 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/0.log" Dec 03 06:50:55 crc kubenswrapper[4947]: I1203 06:50:55.005891 4947 generic.go:334] "Generic (PLEG): container finished" podID="1c90ac94-365a-4c82-b72a-41129d95a39e" containerID="5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441" exitCode=1 Dec 03 06:50:55 crc kubenswrapper[4947]: I1203 06:50:55.005931 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tnc" event={"ID":"1c90ac94-365a-4c82-b72a-41129d95a39e","Type":"ContainerDied","Data":"5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441"} Dec 03 06:50:55 crc kubenswrapper[4947]: I1203 06:50:55.005981 4947 scope.go:117] "RemoveContainer" containerID="b3f1c097ab25ddbc2f04de34e2b9c12e1f7af55bb4a171d772a06d2cba5e726d" Dec 03 06:50:55 crc kubenswrapper[4947]: I1203 06:50:55.006531 4947 scope.go:117] "RemoveContainer" containerID="5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441" Dec 03 06:50:55 crc kubenswrapper[4947]: E1203 06:50:55.006720 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=kube-multus pod=multus-97tnc_openshift-multus(1c90ac94-365a-4c82-b72a-41129d95a39e)\"" pod="openshift-multus/multus-97tnc" podUID="1c90ac94-365a-4c82-b72a-41129d95a39e" Dec 03 06:50:55 crc kubenswrapper[4947]: I1203 06:50:55.082097 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:55 crc kubenswrapper[4947]: E1203 06:50:55.082829 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:56 crc kubenswrapper[4947]: I1203 06:50:56.011767 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/1.log" Dec 03 06:50:56 crc kubenswrapper[4947]: I1203 06:50:56.013437 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" event={"ID":"5eef4f87-b571-4b69-8e03-bdc0fd9c42dc","Type":"ContainerStarted","Data":"26b67aec397cd75ba236fe419b12e494be48029bc117765b66c16b3a0fc1626f"} Dec 03 06:50:56 crc kubenswrapper[4947]: I1203 06:50:56.037585 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ff7zd" podStartSLOduration=95.037566565 podStartE2EDuration="1m35.037566565s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:56.036812934 +0000 UTC m=+117.297767350" 
watchObservedRunningTime="2025-12-03 06:50:56.037566565 +0000 UTC m=+117.298520981" Dec 03 06:50:56 crc kubenswrapper[4947]: I1203 06:50:56.083004 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:56 crc kubenswrapper[4947]: I1203 06:50:56.083056 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:56 crc kubenswrapper[4947]: I1203 06:50:56.083127 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:56 crc kubenswrapper[4947]: E1203 06:50:56.083208 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:56 crc kubenswrapper[4947]: E1203 06:50:56.083307 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:56 crc kubenswrapper[4947]: E1203 06:50:56.083439 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:57 crc kubenswrapper[4947]: I1203 06:50:57.083032 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:57 crc kubenswrapper[4947]: E1203 06:50:57.083279 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:58 crc kubenswrapper[4947]: I1203 06:50:58.082021 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:50:58 crc kubenswrapper[4947]: E1203 06:50:58.082209 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:50:58 crc kubenswrapper[4947]: I1203 06:50:58.082267 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:50:58 crc kubenswrapper[4947]: I1203 06:50:58.082270 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:50:58 crc kubenswrapper[4947]: E1203 06:50:58.082759 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:50:58 crc kubenswrapper[4947]: E1203 06:50:58.082877 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:50:59 crc kubenswrapper[4947]: I1203 06:50:59.082117 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:50:59 crc kubenswrapper[4947]: E1203 06:50:59.083234 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:50:59 crc kubenswrapper[4947]: I1203 06:50:59.083370 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 06:50:59 crc kubenswrapper[4947]: E1203 06:50:59.083876 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-pt9n6_openshift-ovn-kubernetes(19542618-7a4e-44bc-9297-9931dcc41eea)\"" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" Dec 03 06:50:59 crc kubenswrapper[4947]: E1203 06:50:59.094251 4947 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 06:50:59 crc kubenswrapper[4947]: E1203 06:50:59.199909 4947 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 06:51:00 crc kubenswrapper[4947]: I1203 06:51:00.082969 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:00 crc kubenswrapper[4947]: E1203 06:51:00.083140 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:51:00 crc kubenswrapper[4947]: I1203 06:51:00.083427 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:00 crc kubenswrapper[4947]: E1203 06:51:00.083577 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:51:00 crc kubenswrapper[4947]: I1203 06:51:00.083714 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:00 crc kubenswrapper[4947]: E1203 06:51:00.083849 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:51:01 crc kubenswrapper[4947]: I1203 06:51:01.082283 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:01 crc kubenswrapper[4947]: E1203 06:51:01.082596 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:51:02 crc kubenswrapper[4947]: I1203 06:51:02.082954 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:02 crc kubenswrapper[4947]: I1203 06:51:02.083028 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:02 crc kubenswrapper[4947]: I1203 06:51:02.083033 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:02 crc kubenswrapper[4947]: E1203 06:51:02.083158 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:51:02 crc kubenswrapper[4947]: E1203 06:51:02.083398 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:51:02 crc kubenswrapper[4947]: E1203 06:51:02.083550 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:51:03 crc kubenswrapper[4947]: I1203 06:51:03.082951 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:03 crc kubenswrapper[4947]: E1203 06:51:03.083172 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:51:04 crc kubenswrapper[4947]: I1203 06:51:04.082169 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:04 crc kubenswrapper[4947]: I1203 06:51:04.082209 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:04 crc kubenswrapper[4947]: I1203 06:51:04.082206 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:04 crc kubenswrapper[4947]: E1203 06:51:04.082405 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:51:04 crc kubenswrapper[4947]: E1203 06:51:04.082878 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:51:04 crc kubenswrapper[4947]: E1203 06:51:04.082976 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:51:04 crc kubenswrapper[4947]: E1203 06:51:04.201411 4947 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 06:51:05 crc kubenswrapper[4947]: I1203 06:51:05.082672 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:05 crc kubenswrapper[4947]: E1203 06:51:05.083226 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:51:06 crc kubenswrapper[4947]: I1203 06:51:06.082348 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:06 crc kubenswrapper[4947]: I1203 06:51:06.082394 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:06 crc kubenswrapper[4947]: I1203 06:51:06.082367 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:06 crc kubenswrapper[4947]: E1203 06:51:06.082570 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:51:06 crc kubenswrapper[4947]: E1203 06:51:06.082752 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:51:06 crc kubenswrapper[4947]: E1203 06:51:06.082913 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:51:07 crc kubenswrapper[4947]: I1203 06:51:07.082132 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:07 crc kubenswrapper[4947]: E1203 06:51:07.082358 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:51:08 crc kubenswrapper[4947]: I1203 06:51:08.082361 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:08 crc kubenswrapper[4947]: I1203 06:51:08.082440 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:08 crc kubenswrapper[4947]: E1203 06:51:08.082641 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:51:08 crc kubenswrapper[4947]: I1203 06:51:08.082447 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:08 crc kubenswrapper[4947]: E1203 06:51:08.082687 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:51:08 crc kubenswrapper[4947]: E1203 06:51:08.082781 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:51:09 crc kubenswrapper[4947]: I1203 06:51:09.084111 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:09 crc kubenswrapper[4947]: E1203 06:51:09.087603 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:51:09 crc kubenswrapper[4947]: E1203 06:51:09.202872 4947 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 06:51:10 crc kubenswrapper[4947]: I1203 06:51:10.082616 4947 scope.go:117] "RemoveContainer" containerID="5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441" Dec 03 06:51:10 crc kubenswrapper[4947]: I1203 06:51:10.083687 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:10 crc kubenswrapper[4947]: E1203 06:51:10.083913 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:51:10 crc kubenswrapper[4947]: I1203 06:51:10.084395 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:10 crc kubenswrapper[4947]: E1203 06:51:10.084643 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:51:10 crc kubenswrapper[4947]: I1203 06:51:10.085047 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:10 crc kubenswrapper[4947]: E1203 06:51:10.085227 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:51:10 crc kubenswrapper[4947]: I1203 06:51:10.086968 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 06:51:11 crc kubenswrapper[4947]: I1203 06:51:11.081935 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cz948"] Dec 03 06:51:11 crc kubenswrapper[4947]: I1203 06:51:11.089920 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:11 crc kubenswrapper[4947]: E1203 06:51:11.090086 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:51:11 crc kubenswrapper[4947]: I1203 06:51:11.093994 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/1.log" Dec 03 06:51:11 crc kubenswrapper[4947]: I1203 06:51:11.094105 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tnc" event={"ID":"1c90ac94-365a-4c82-b72a-41129d95a39e","Type":"ContainerStarted","Data":"f383d35aa20681f32ce2b9b1f63026fc2f6fc5da5a6cf993c059b3e42727cb59"} Dec 03 06:51:11 crc kubenswrapper[4947]: I1203 06:51:11.097880 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/3.log" Dec 03 06:51:11 crc kubenswrapper[4947]: I1203 06:51:11.104383 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerStarted","Data":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} Dec 03 06:51:11 crc kubenswrapper[4947]: I1203 06:51:11.104421 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:11 crc kubenswrapper[4947]: E1203 06:51:11.104579 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:51:11 crc kubenswrapper[4947]: I1203 06:51:11.105084 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:51:12 crc kubenswrapper[4947]: I1203 06:51:12.082288 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:12 crc kubenswrapper[4947]: I1203 06:51:12.082341 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:12 crc kubenswrapper[4947]: E1203 06:51:12.082575 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:51:12 crc kubenswrapper[4947]: E1203 06:51:12.082810 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:51:13 crc kubenswrapper[4947]: I1203 06:51:13.082685 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:13 crc kubenswrapper[4947]: I1203 06:51:13.082792 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:13 crc kubenswrapper[4947]: E1203 06:51:13.082869 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cz948" podUID="8dd41826-cef5-42f7-8730-abc792b9337c" Dec 03 06:51:13 crc kubenswrapper[4947]: E1203 06:51:13.083161 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.082607 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.082668 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:14 crc kubenswrapper[4947]: E1203 06:51:14.082782 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:51:14 crc kubenswrapper[4947]: E1203 06:51:14.083018 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.866548 4947 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.918035 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podStartSLOduration=113.918008587 podStartE2EDuration="1m53.918008587s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:11.157697196 +0000 UTC m=+132.418651692" watchObservedRunningTime="2025-12-03 06:51:14.918008587 +0000 UTC m=+136.178963053" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.920029 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-flxrp"] Dec 03 06:51:14 crc kubenswrapper[4947]: 
I1203 06:51:14.920749 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-flxrp" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.922193 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.923213 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.924614 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-t2gnl"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.924715 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.924904 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.925026 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.925334 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.932145 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.933103 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.936208 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wpnnz"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.936852 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.938835 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.938849 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.938978 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.940064 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.941779 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bdlx5"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.943051 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.961523 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.961793 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.961959 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.962835 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.962989 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.963110 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.963224 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.963404 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.963680 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.963725 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.963878 4947 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.963902 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.964104 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.964279 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.964609 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.970712 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.972144 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.972154 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.972960 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.976426 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.976654 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.976951 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.980331 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jx4vv"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.980942 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.981977 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.984108 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.984317 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.985609 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.985840 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.986080 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.986593 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.986691 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.987006 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.987182 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.988376 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.990277 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.990683 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.991129 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.991280 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.994129 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.994362 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.994457 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.994860 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vtmrf"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.995265 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.995916 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.997793 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx"] Dec 03 06:51:14 crc kubenswrapper[4947]: I1203 06:51:14.998577 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000022 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fj5r4"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000170 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000398 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000435 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000528 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000534 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mh92n"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000532 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000422 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.001161 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000840 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000854 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.001543 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000932 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.000993 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.001065 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.001419 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.002135 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 
06:51:15.003519 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.005318 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.009543 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.005710 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.009764 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.009809 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fncmn"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006021 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.009926 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006058 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006106 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010320 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eb688659-982c-44a4-8e59-b896de7a5e14-audit-dir\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010363 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-config\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010385 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-oauth-serving-cert\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010406 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afeb9125-522c-4dea-93f1-d3ee2c58da87-config\") pod 
\"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010424 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdg2b\" (UniqueName: \"kubernetes.io/projected/a6afe02e-cf5b-41e2-ac72-d89e06576d76-kube-api-access-pdg2b\") pod \"downloads-7954f5f757-flxrp\" (UID: \"a6afe02e-cf5b-41e2-ac72-d89e06576d76\") " pod="openshift-console/downloads-7954f5f757-flxrp" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010442 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006179 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010482 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b695283-9aef-4604-b070-43d775a56972-serving-cert\") pod \"openshift-config-operator-7777fb866f-r8pc8\" (UID: \"5b695283-9aef-4604-b070-43d775a56972\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010522 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5b695283-9aef-4604-b070-43d775a56972-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r8pc8\" (UID: 
\"5b695283-9aef-4604-b070-43d775a56972\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010540 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-oauth-config\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010558 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-service-ca\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010575 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mfnd\" (UniqueName: \"kubernetes.io/projected/a274d4f7-f741-48cc-9c34-8e2805ad66e3-kube-api-access-6mfnd\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010585 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010592 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afeb9125-522c-4dea-93f1-d3ee2c58da87-trusted-ca\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010609 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb688659-982c-44a4-8e59-b896de7a5e14-serving-cert\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006220 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010627 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-trusted-ca-bundle\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010644 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcmzj\" (UniqueName: \"kubernetes.io/projected/afeb9125-522c-4dea-93f1-d3ee2c58da87-kube-api-access-tcmzj\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc 
kubenswrapper[4947]: I1203 06:51:15.010661 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s4vl\" (UniqueName: \"kubernetes.io/projected/5b695283-9aef-4604-b070-43d775a56972-kube-api-access-9s4vl\") pod \"openshift-config-operator-7777fb866f-r8pc8\" (UID: \"5b695283-9aef-4604-b070-43d775a56972\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010680 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-audit\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010698 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-config\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010824 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eb688659-982c-44a4-8e59-b896de7a5e14-encryption-config\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010845 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrwj4\" (UniqueName: \"kubernetes.io/projected/eb688659-982c-44a4-8e59-b896de7a5e14-kube-api-access-jrwj4\") pod \"apiserver-76f77b778f-bdlx5\" (UID: 
\"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010874 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-etcd-serving-ca\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010892 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-serving-cert\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010910 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb688659-982c-44a4-8e59-b896de7a5e14-node-pullsecrets\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010926 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afeb9125-522c-4dea-93f1-d3ee2c58da87-serving-cert\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010943 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/eb688659-982c-44a4-8e59-b896de7a5e14-etcd-client\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006266 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006308 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006354 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006349 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006388 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006401 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006430 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006439 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006475 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006472 4947 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.006851 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.007261 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.007276 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.007328 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.007828 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.007930 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.008012 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.008267 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.008757 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.008964 4947 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.009198 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.010415 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.029190 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.029721 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.030272 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-image-import-ca\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.030440 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.031763 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.038926 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.041052 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" 
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.041456 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.041575 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.043242 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.046054 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.048736 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.050181 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.059275 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.059873 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.060045 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.062073 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.063474 4947 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.072049 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xrkt7"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.072656 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v9zdx"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.072943 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-klrbv"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.073419 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.073661 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.073811 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.073980 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.078737 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.079672 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.080935 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.081563 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.081931 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.082080 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.082385 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.085610 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.091568 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.091733 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.096022 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.098893 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.100100 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.101163 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.101346 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.102571 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.103319 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.104662 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.110047 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.118864 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wmgx"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.119508 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.119795 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.119957 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.127093 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.135923 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.136222 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8896266b-64eb-434e-b0cf-6a57510bf439-default-certificate\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.136312 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.136418 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9008fed-2c05-4a53-a6f8-9a407213b236-serving-cert\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.136508 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.136607 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b695283-9aef-4604-b070-43d775a56972-serving-cert\") pod \"openshift-config-operator-7777fb866f-r8pc8\" (UID: \"5b695283-9aef-4604-b070-43d775a56972\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.136687 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.136886 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.137885 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/5b695283-9aef-4604-b070-43d775a56972-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r8pc8\" (UID: \"5b695283-9aef-4604-b070-43d775a56972\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.137984 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww62k\" (UniqueName: \"kubernetes.io/projected/795afb6d-557a-4f06-9772-0af88a287913-kube-api-access-ww62k\") pod \"machine-config-controller-84d6567774-nm4hw\" (UID: \"795afb6d-557a-4f06-9772-0af88a287913\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.138033 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-proxy-tls\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.138089 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-oauth-config\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.138258 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.138450 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/5b695283-9aef-4604-b070-43d775a56972-available-featuregates\") pod \"openshift-config-operator-7777fb866f-r8pc8\" (UID: \"5b695283-9aef-4604-b070-43d775a56972\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.138690 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.139594 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-service-ca\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.139679 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mfnd\" (UniqueName: \"kubernetes.io/projected/a274d4f7-f741-48cc-9c34-8e2805ad66e3-kube-api-access-6mfnd\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.139724 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afeb9125-522c-4dea-93f1-d3ee2c58da87-trusted-ca\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.139985 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb688659-982c-44a4-8e59-b896de7a5e14-serving-cert\") pod \"apiserver-76f77b778f-bdlx5\" (UID: 
\"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140056 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8466264-98ee-4866-bce5-0cdc13613446-serving-cert\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140095 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-etcd-client\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140133 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-policies\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140177 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-audit-policies\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140220 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1fecc097-e805-479d-9d1a-c03d13013851-auth-proxy-config\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140258 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2feb0620-965a-4f0e-98fd-bccefd87053b-serving-cert\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140291 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140344 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140389 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2963ae1e-2e90-492a-a52a-4ca04bf481cc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jl2zz\" (UID: 
\"2963ae1e-2e90-492a-a52a-4ca04bf481cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140437 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-trusted-ca-bundle\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140465 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcmzj\" (UniqueName: \"kubernetes.io/projected/afeb9125-522c-4dea-93f1-d3ee2c58da87-kube-api-access-tcmzj\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140519 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-serving-cert\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140566 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmkwn\" (UniqueName: \"kubernetes.io/projected/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-kube-api-access-bmkwn\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140610 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9s4vl\" (UniqueName: \"kubernetes.io/projected/5b695283-9aef-4604-b070-43d775a56972-kube-api-access-9s4vl\") pod \"openshift-config-operator-7777fb866f-r8pc8\" (UID: \"5b695283-9aef-4604-b070-43d775a56972\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140650 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-audit\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140689 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h6bsr\" (UID: \"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140704 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140723 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-config\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140760 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-config\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140804 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1fecc097-e805-479d-9d1a-c03d13013851-machine-approver-tls\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140843 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8896266b-64eb-434e-b0cf-6a57510bf439-service-ca-bundle\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140876 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h6bsr\" (UID: \"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140912 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2963ae1e-2e90-492a-a52a-4ca04bf481cc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jl2zz\" (UID: \"2963ae1e-2e90-492a-a52a-4ca04bf481cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140949 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eb688659-982c-44a4-8e59-b896de7a5e14-encryption-config\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.140981 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8466264-98ee-4866-bce5-0cdc13613446-config\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141019 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141062 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrwj4\" (UniqueName: \"kubernetes.io/projected/eb688659-982c-44a4-8e59-b896de7a5e14-kube-api-access-jrwj4\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141101 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141154 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141189 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8896266b-64eb-434e-b0cf-6a57510bf439-metrics-certs\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141229 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-serving-cert\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141255 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2963ae1e-2e90-492a-a52a-4ca04bf481cc-config\") pod \"kube-controller-manager-operator-78b949d7b-jl2zz\" (UID: \"2963ae1e-2e90-492a-a52a-4ca04bf481cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141678 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-etcd-serving-ca\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141730 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-config\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141767 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzjtz\" (UniqueName: \"kubernetes.io/projected/a9008fed-2c05-4a53-a6f8-9a407213b236-kube-api-access-xzjtz\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 
03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141801 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-767c6\" (UniqueName: \"kubernetes.io/projected/1fecc097-e805-479d-9d1a-c03d13013851-kube-api-access-767c6\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141823 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-dir\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141854 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d8466264-98ee-4866-bce5-0cdc13613446-etcd-service-ca\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141888 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bs75\" (UniqueName: \"kubernetes.io/projected/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-kube-api-access-9bs75\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.141923 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a9008fed-2c05-4a53-a6f8-9a407213b236-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.142916 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-serving-cert\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.143083 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb688659-982c-44a4-8e59-b896de7a5e14-node-pullsecrets\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.143197 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5nj4\" (UniqueName: \"kubernetes.io/projected/28961044-3c25-4750-9703-1f19edee14cd-kube-api-access-m5nj4\") pod \"cluster-samples-operator-665b6dd947-jtch8\" (UID: \"28961044-3c25-4750-9703-1f19edee14cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.143423 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz9dj\" (UniqueName: \"kubernetes.io/projected/8fa054f2-39f2-479b-915f-b2ceeab282d6-kube-api-access-kz9dj\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc 
kubenswrapper[4947]: I1203 06:51:15.143540 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-config\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.143546 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afeb9125-522c-4dea-93f1-d3ee2c58da87-serving-cert\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.144288 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb688659-982c-44a4-8e59-b896de7a5e14-etcd-client\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.144371 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9008fed-2c05-4a53-a6f8-9a407213b236-config\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.144451 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-image-import-ca\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 
crc kubenswrapper[4947]: I1203 06:51:15.144663 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.144748 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dbf214b-d90a-41ff-8e46-694452ef42b9-metrics-tls\") pod \"dns-operator-744455d44c-vtmrf\" (UID: \"7dbf214b-d90a-41ff-8e46-694452ef42b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.144832 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2bsj\" (UniqueName: \"kubernetes.io/projected/d8466264-98ee-4866-bce5-0cdc13613446-kube-api-access-r2bsj\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.143241 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/afeb9125-522c-4dea-93f1-d3ee2c58da87-trusted-ca\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.144383 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-audit\") pod \"apiserver-76f77b778f-bdlx5\" (UID: 
\"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.144523 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-service-ca\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.145224 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-client-ca\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.145299 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8aef45b-e02c-4a5e-a154-f4814776a263-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xnrcs\" (UID: \"d8aef45b-e02c-4a5e-a154-f4814776a263\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.145405 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-config\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.145482 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eb688659-982c-44a4-8e59-b896de7a5e14-audit-dir\") pod 
\"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.146252 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.146373 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aef45b-e02c-4a5e-a154-f4814776a263-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xnrcs\" (UID: \"d8aef45b-e02c-4a5e-a154-f4814776a263\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.146546 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-encryption-config\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.146645 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.146770 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdj4z\" (UniqueName: \"kubernetes.io/projected/d8aef45b-e02c-4a5e-a154-f4814776a263-kube-api-access-jdj4z\") pod \"openshift-apiserver-operator-796bbdcf4f-xnrcs\" (UID: \"d8aef45b-e02c-4a5e-a154-f4814776a263\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.146872 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/795afb6d-557a-4f06-9772-0af88a287913-proxy-tls\") pod \"machine-config-controller-84d6567774-nm4hw\" (UID: \"795afb6d-557a-4f06-9772-0af88a287913\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.146958 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.146994 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d8466264-98ee-4866-bce5-0cdc13613446-etcd-client\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.147169 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh8wq\" (UniqueName: \"kubernetes.io/projected/8896266b-64eb-434e-b0cf-6a57510bf439-kube-api-access-fh8wq\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.147266 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cggp5\" (UniqueName: \"kubernetes.io/projected/7dbf214b-d90a-41ff-8e46-694452ef42b9-kube-api-access-cggp5\") pod \"dns-operator-744455d44c-vtmrf\" (UID: \"7dbf214b-d90a-41ff-8e46-694452ef42b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.147364 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2wht\" (UniqueName: \"kubernetes.io/projected/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-kube-api-access-w2wht\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.147482 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/28961044-3c25-4750-9703-1f19edee14cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jtch8\" (UID: \"28961044-3c25-4750-9703-1f19edee14cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.147613 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.147749 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-images\") pod 
\"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.148184 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149021 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/795afb6d-557a-4f06-9772-0af88a287913-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nm4hw\" (UID: \"795afb6d-557a-4f06-9772-0af88a287913\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149245 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb688659-982c-44a4-8e59-b896de7a5e14-serving-cert\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149257 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149339 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvbdc\" (UniqueName: \"kubernetes.io/projected/5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0-kube-api-access-kvbdc\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-h6bsr\" (UID: \"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149415 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d8466264-98ee-4866-bce5-0cdc13613446-etcd-ca\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149537 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-oauth-serving-cert\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149608 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afeb9125-522c-4dea-93f1-d3ee2c58da87-config\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149662 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdg2b\" (UniqueName: \"kubernetes.io/projected/a6afe02e-cf5b-41e2-ac72-d89e06576d76-kube-api-access-pdg2b\") pod \"downloads-7954f5f757-flxrp\" (UID: \"a6afe02e-cf5b-41e2-ac72-d89e06576d76\") " pod="openshift-console/downloads-7954f5f757-flxrp" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149730 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/1fecc097-e805-479d-9d1a-c03d13013851-config\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149817 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-client-ca\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149899 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-audit-dir\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.149974 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.150029 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlrx2\" (UniqueName: \"kubernetes.io/projected/2feb0620-965a-4f0e-98fd-bccefd87053b-kube-api-access-mlrx2\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 
06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.150076 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.150127 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9008fed-2c05-4a53-a6f8-9a407213b236-service-ca-bundle\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.150278 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8896266b-64eb-434e-b0cf-6a57510bf439-stats-auth\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.150319 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-image-import-ca\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.151328 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-config\") pod \"console-f9d7485db-t2gnl\" 
(UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.151686 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.153886 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-oauth-config\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.154012 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-oauth-serving-cert\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.155478 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afeb9125-522c-4dea-93f1-d3ee2c58da87-config\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.156339 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rzd6v"] Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.157101 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/eb688659-982c-44a4-8e59-b896de7a5e14-encryption-config\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " 
pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.157720 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b695283-9aef-4604-b070-43d775a56972-serving-cert\") pod \"openshift-config-operator-7777fb866f-r8pc8\" (UID: \"5b695283-9aef-4604-b070-43d775a56972\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.157775 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/eb688659-982c-44a4-8e59-b896de7a5e14-node-pullsecrets\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.162766 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eb688659-982c-44a4-8e59-b896de7a5e14-audit-dir\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.164688 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-etcd-serving-ca\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.165390 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-trusted-ca-bundle\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " 
pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.169502 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.171176 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.172240 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb688659-982c-44a4-8e59-b896de7a5e14-trusted-ca-bundle\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.173977 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-serving-cert\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.174293 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/eb688659-982c-44a4-8e59-b896de7a5e14-etcd-client\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.174575 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afeb9125-522c-4dea-93f1-d3ee2c58da87-serving-cert\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" 
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.176194 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.176995 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.177443 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.177953 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.179095 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.180515 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.180649 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.182861 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.183220 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d44cn"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.183377 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.186789 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wpnnz"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.186826 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.186838 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.186936 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d44cn"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.187709 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.189218 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vbqcq"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.189942 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vbqcq"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.191375 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.193464 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.194479 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fncmn"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.195605 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-t2gnl"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.196598 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-flxrp"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.197547 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.198679 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.199795 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jx4vv"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.200934 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.202011 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vtmrf"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.202718 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.204658 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.205862 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.206777 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mh92n"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.207741 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.209034 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v9zdx"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.210969 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fj5r4"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.212646 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.213849 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wmgx"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.214841 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.216220 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d44cn"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.217293 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.218328 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p2j8p"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.219072 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p2j8p"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.219372 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7kdr7"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.219896 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7kdr7"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.220669 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.222017 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-klrbv"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.223123 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.223258 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rzd6v"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.224328 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.225647 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bdlx5"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.226478 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.228153 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.230405 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.232270 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.233847 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.235470 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p2j8p"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.237376 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vbqcq"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.245777 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.245854 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cf2xq"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.249562 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.251011 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cf2xq"]
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.251168 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cf2xq"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.252709 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.252738 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8896266b-64eb-434e-b0cf-6a57510bf439-default-certificate\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.252779 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.252801 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9008fed-2c05-4a53-a6f8-9a407213b236-serving-cert\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.252818 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.252855 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww62k\" (UniqueName: \"kubernetes.io/projected/795afb6d-557a-4f06-9772-0af88a287913-kube-api-access-ww62k\") pod \"machine-config-controller-84d6567774-nm4hw\" (UID: \"795afb6d-557a-4f06-9772-0af88a287913\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.252890 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.252951 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-proxy-tls\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.252978 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8466264-98ee-4866-bce5-0cdc13613446-serving-cert\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253012 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-etcd-client\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253031 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-policies\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253047 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-audit-policies\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253065 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fecc097-e805-479d-9d1a-c03d13013851-auth-proxy-config\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253105 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2feb0620-965a-4f0e-98fd-bccefd87053b-serving-cert\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253125 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253143 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253181 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2963ae1e-2e90-492a-a52a-4ca04bf481cc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jl2zz\" (UID: \"2963ae1e-2e90-492a-a52a-4ca04bf481cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253205 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-serving-cert\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253945 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmkwn\" (UniqueName: \"kubernetes.io/projected/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-kube-api-access-bmkwn\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.253996 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h6bsr\" (UID: \"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254055 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-config\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254089 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2963ae1e-2e90-492a-a52a-4ca04bf481cc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jl2zz\" (UID: \"2963ae1e-2e90-492a-a52a-4ca04bf481cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254107 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1fecc097-e805-479d-9d1a-c03d13013851-machine-approver-tls\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254145 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8896266b-64eb-434e-b0cf-6a57510bf439-service-ca-bundle\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254164 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h6bsr\" (UID: \"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254182 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8466264-98ee-4866-bce5-0cdc13613446-config\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254218 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254268 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254316 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254332 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8896266b-64eb-434e-b0cf-6a57510bf439-metrics-certs\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254347 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-serving-cert\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254364 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2963ae1e-2e90-492a-a52a-4ca04bf481cc-config\") pod \"kube-controller-manager-operator-78b949d7b-jl2zz\" (UID: \"2963ae1e-2e90-492a-a52a-4ca04bf481cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254382 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-config\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254401 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzjtz\" (UniqueName: \"kubernetes.io/projected/a9008fed-2c05-4a53-a6f8-9a407213b236-kube-api-access-xzjtz\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254417 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bs75\" (UniqueName: \"kubernetes.io/projected/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-kube-api-access-9bs75\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254435 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-767c6\" (UniqueName: \"kubernetes.io/projected/1fecc097-e805-479d-9d1a-c03d13013851-kube-api-access-767c6\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254450 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-dir\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254466 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d8466264-98ee-4866-bce5-0cdc13613446-etcd-service-ca\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254483 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz9dj\" (UniqueName: \"kubernetes.io/projected/8fa054f2-39f2-479b-915f-b2ceeab282d6-kube-api-access-kz9dj\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254510 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9008fed-2c05-4a53-a6f8-9a407213b236-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254528 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5nj4\" (UniqueName: \"kubernetes.io/projected/28961044-3c25-4750-9703-1f19edee14cd-kube-api-access-m5nj4\") pod \"cluster-samples-operator-665b6dd947-jtch8\" (UID: \"28961044-3c25-4750-9703-1f19edee14cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254544 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9008fed-2c05-4a53-a6f8-9a407213b236-config\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254563 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254582 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dbf214b-d90a-41ff-8e46-694452ef42b9-metrics-tls\") pod \"dns-operator-744455d44c-vtmrf\" (UID: \"7dbf214b-d90a-41ff-8e46-694452ef42b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254600 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2bsj\" (UniqueName: \"kubernetes.io/projected/d8466264-98ee-4866-bce5-0cdc13613446-kube-api-access-r2bsj\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254616 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-client-ca\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254631 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8aef45b-e02c-4a5e-a154-f4814776a263-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xnrcs\" (UID: \"d8aef45b-e02c-4a5e-a154-f4814776a263\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254656 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254674 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aef45b-e02c-4a5e-a154-f4814776a263-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xnrcs\" (UID: \"d8aef45b-e02c-4a5e-a154-f4814776a263\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254690 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-encryption-config\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254705 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cggp5\" (UniqueName: \"kubernetes.io/projected/7dbf214b-d90a-41ff-8e46-694452ef42b9-kube-api-access-cggp5\") pod \"dns-operator-744455d44c-vtmrf\" (UID: \"7dbf214b-d90a-41ff-8e46-694452ef42b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254720 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254738 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdj4z\" (UniqueName: \"kubernetes.io/projected/d8aef45b-e02c-4a5e-a154-f4814776a263-kube-api-access-jdj4z\") pod \"openshift-apiserver-operator-796bbdcf4f-xnrcs\" (UID: \"d8aef45b-e02c-4a5e-a154-f4814776a263\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254754 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/795afb6d-557a-4f06-9772-0af88a287913-proxy-tls\") pod \"machine-config-controller-84d6567774-nm4hw\" (UID: \"795afb6d-557a-4f06-9772-0af88a287913\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254768 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d8466264-98ee-4866-bce5-0cdc13613446-etcd-client\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254783 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8wq\" (UniqueName: \"kubernetes.io/projected/8896266b-64eb-434e-b0cf-6a57510bf439-kube-api-access-fh8wq\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254822 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2wht\" (UniqueName: \"kubernetes.io/projected/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-kube-api-access-w2wht\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254863 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-images\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254881 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/28961044-3c25-4750-9703-1f19edee14cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jtch8\" (UID: \"28961044-3c25-4750-9703-1f19edee14cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254898 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254914 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvbdc\" (UniqueName: \"kubernetes.io/projected/5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0-kube-api-access-kvbdc\") pod \"openshift-controller-manager-operator-756b6f6bc6-h6bsr\" (UID: \"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254932 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/795afb6d-557a-4f06-9772-0af88a287913-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nm4hw\" (UID: \"795afb6d-557a-4f06-9772-0af88a287913\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254952 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254968 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-client-ca\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q"
Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254984 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName:
\"kubernetes.io/configmap/d8466264-98ee-4866-bce5-0cdc13613446-etcd-ca\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.255005 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fecc097-e805-479d-9d1a-c03d13013851-config\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.255022 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-audit-dir\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.255041 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9008fed-2c05-4a53-a6f8-9a407213b236-service-ca-bundle\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.255060 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlrx2\" (UniqueName: \"kubernetes.io/projected/2feb0620-965a-4f0e-98fd-bccefd87053b-kube-api-access-mlrx2\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.255079 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.255109 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8896266b-64eb-434e-b0cf-6a57510bf439-stats-auth\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.255368 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-dir\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.255465 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.256001 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d8466264-98ee-4866-bce5-0cdc13613446-etcd-service-ca\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: 
I1203 06:51:15.256114 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-config\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.256372 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8466264-98ee-4866-bce5-0cdc13613446-serving-cert\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.256538 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-policies\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.257401 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-images\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.257863 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-auth-proxy-config\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.258647 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aef45b-e02c-4a5e-a154-f4814776a263-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xnrcs\" (UID: \"d8aef45b-e02c-4a5e-a154-f4814776a263\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.258653 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9008fed-2c05-4a53-a6f8-9a407213b236-serving-cert\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.259220 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8aef45b-e02c-4a5e-a154-f4814776a263-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xnrcs\" (UID: \"d8aef45b-e02c-4a5e-a154-f4814776a263\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.259289 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-audit-policies\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.259624 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-etcd-client\") pod 
\"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.259694 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.260115 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d8466264-98ee-4866-bce5-0cdc13613446-etcd-client\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.260203 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2feb0620-965a-4f0e-98fd-bccefd87053b-serving-cert\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.260295 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.260453 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/d8466264-98ee-4866-bce5-0cdc13613446-etcd-ca\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.260634 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.260735 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9008fed-2c05-4a53-a6f8-9a407213b236-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.261037 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-h6bsr\" (UID: \"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.261113 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fecc097-e805-479d-9d1a-c03d13013851-config\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.261226 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.261302 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9008fed-2c05-4a53-a6f8-9a407213b236-service-ca-bundle\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.261371 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-audit-dir\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.261581 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8466264-98ee-4866-bce5-0cdc13613446-config\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.261820 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9008fed-2c05-4a53-a6f8-9a407213b236-config\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 
06:51:15.261986 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/795afb6d-557a-4f06-9772-0af88a287913-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nm4hw\" (UID: \"795afb6d-557a-4f06-9772-0af88a287913\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.262245 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.262330 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.262861 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.263180 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.254344 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1fecc097-e805-479d-9d1a-c03d13013851-auth-proxy-config\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.263636 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-encryption-config\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.263810 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-client-ca\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.264442 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-proxy-tls\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.264486 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/28961044-3c25-4750-9703-1f19edee14cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-jtch8\" (UID: \"28961044-3c25-4750-9703-1f19edee14cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.264729 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-serving-cert\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.264913 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.265591 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.265672 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1fecc097-e805-479d-9d1a-c03d13013851-machine-approver-tls\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.266093 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.266371 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/795afb6d-557a-4f06-9772-0af88a287913-proxy-tls\") pod \"machine-config-controller-84d6567774-nm4hw\" (UID: \"795afb6d-557a-4f06-9772-0af88a287913\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.267564 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-h6bsr\" (UID: \"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.267745 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.268014 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7dbf214b-d90a-41ff-8e46-694452ef42b9-metrics-tls\") pod \"dns-operator-744455d44c-vtmrf\" (UID: \"7dbf214b-d90a-41ff-8e46-694452ef42b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" Dec 03 06:51:15 
crc kubenswrapper[4947]: I1203 06:51:15.268641 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.302931 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.324409 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.343128 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.363614 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.383472 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.402913 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.423612 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.443531 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.463383 4947 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.469296 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8896266b-64eb-434e-b0cf-6a57510bf439-stats-auth\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.484446 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.497165 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8896266b-64eb-434e-b0cf-6a57510bf439-metrics-certs\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.503794 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.524035 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.529728 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8896266b-64eb-434e-b0cf-6a57510bf439-default-certificate\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.543327 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.563749 
4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.568884 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8896266b-64eb-434e-b0cf-6a57510bf439-service-ca-bundle\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.583440 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.603912 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.615806 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-serving-cert\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.624161 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.633670 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-config\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.644553 4947 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"client-ca" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.654917 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-client-ca\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.672337 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.677024 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.683145 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.703862 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.704841 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2963ae1e-2e90-492a-a52a-4ca04bf481cc-config\") pod \"kube-controller-manager-operator-78b949d7b-jl2zz\" (UID: \"2963ae1e-2e90-492a-a52a-4ca04bf481cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.723067 4947 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.743660 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.753878 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2963ae1e-2e90-492a-a52a-4ca04bf481cc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-jl2zz\" (UID: \"2963ae1e-2e90-492a-a52a-4ca04bf481cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.764936 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.803103 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.823273 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.843379 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.863921 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.883616 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 06:51:15 
crc kubenswrapper[4947]: I1203 06:51:15.903995 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.922942 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.952419 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.964083 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 06:51:15 crc kubenswrapper[4947]: I1203 06:51:15.984308 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.003974 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.024026 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.042404 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.063209 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.081984 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.082034 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.083540 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.101315 4947 request.go:700] Waited for 1.000897624s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.103868 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.124068 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.143114 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.163171 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.183142 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.204566 4947 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.224781 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.243999 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.264327 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.293060 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.304181 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.323720 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.344362 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.370613 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.383472 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.422340 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mfnd\" (UniqueName: 
\"kubernetes.io/projected/a274d4f7-f741-48cc-9c34-8e2805ad66e3-kube-api-access-6mfnd\") pod \"console-f9d7485db-t2gnl\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.426552 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.444378 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.464699 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.484279 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.497480 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.504339 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.551587 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcmzj\" (UniqueName: \"kubernetes.io/projected/afeb9125-522c-4dea-93f1-d3ee2c58da87-kube-api-access-tcmzj\") pod \"console-operator-58897d9998-wpnnz\" (UID: \"afeb9125-522c-4dea-93f1-d3ee2c58da87\") " pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.560429 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s4vl\" (UniqueName: \"kubernetes.io/projected/5b695283-9aef-4604-b070-43d775a56972-kube-api-access-9s4vl\") pod \"openshift-config-operator-7777fb866f-r8pc8\" (UID: \"5b695283-9aef-4604-b070-43d775a56972\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.582634 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrwj4\" (UniqueName: \"kubernetes.io/projected/eb688659-982c-44a4-8e59-b896de7a5e14-kube-api-access-jrwj4\") pod \"apiserver-76f77b778f-bdlx5\" (UID: \"eb688659-982c-44a4-8e59-b896de7a5e14\") " pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.584346 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.621713 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdg2b\" (UniqueName: \"kubernetes.io/projected/a6afe02e-cf5b-41e2-ac72-d89e06576d76-kube-api-access-pdg2b\") pod 
\"downloads-7954f5f757-flxrp\" (UID: \"a6afe02e-cf5b-41e2-ac72-d89e06576d76\") " pod="openshift-console/downloads-7954f5f757-flxrp" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.623797 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.643993 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.664247 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.683873 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.704019 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.709331 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-t2gnl"] Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.723654 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.743897 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.763155 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.764690 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-flxrp" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.783103 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.785731 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.807014 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.813338 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.823956 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.845322 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.847685 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.905830 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.908174 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.908566 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.927077 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.946616 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.969158 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 06:51:16 crc kubenswrapper[4947]: I1203 06:51:16.986008 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.004976 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.020032 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8"] Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.023208 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 06:51:17 crc kubenswrapper[4947]: W1203 06:51:17.029005 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b695283_9aef_4604_b070_43d775a56972.slice/crio-4953154abb81e1b139243c4fb6a53be73d16b6454940a1b310ee0c65f5960320 WatchSource:0}: Error finding container 4953154abb81e1b139243c4fb6a53be73d16b6454940a1b310ee0c65f5960320: Status 404 returned error can't find the container with id 4953154abb81e1b139243c4fb6a53be73d16b6454940a1b310ee0c65f5960320 Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.043059 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.064294 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.064307 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-flxrp"] Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.086347 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.091152 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-wpnnz"] Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.102960 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.116242 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-bdlx5"] Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.121559 4947 request.go:700] Waited for 1.901503548s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0 Dec 03 06:51:17 crc kubenswrapper[4947]: W1203 06:51:17.133338 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb688659_982c_44a4_8e59_b896de7a5e14.slice/crio-092ab2a49722ff29b066ef855d7fdada673b072b45327e4f2480cbad91cb670a WatchSource:0}: Error finding container 092ab2a49722ff29b066ef855d7fdada673b072b45327e4f2480cbad91cb670a: Status 404 returned error can't find the container with id 092ab2a49722ff29b066ef855d7fdada673b072b45327e4f2480cbad91cb670a Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.133868 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.136622 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wpnnz" event={"ID":"afeb9125-522c-4dea-93f1-d3ee2c58da87","Type":"ContainerStarted","Data":"1108cfb4642298fe672e3bce5344730b1502a346c7e92dbcc11084f2ea4828b7"} Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.137995 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" event={"ID":"eb688659-982c-44a4-8e59-b896de7a5e14","Type":"ContainerStarted","Data":"092ab2a49722ff29b066ef855d7fdada673b072b45327e4f2480cbad91cb670a"} Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.139726 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" event={"ID":"5b695283-9aef-4604-b070-43d775a56972","Type":"ContainerStarted","Data":"143b290e5abf6181c92546273d302da03a0b7a6e2502d045b52a452e7a4f3f51"} Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.139767 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" event={"ID":"5b695283-9aef-4604-b070-43d775a56972","Type":"ContainerStarted","Data":"4953154abb81e1b139243c4fb6a53be73d16b6454940a1b310ee0c65f5960320"} Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.140889 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-flxrp" event={"ID":"a6afe02e-cf5b-41e2-ac72-d89e06576d76","Type":"ContainerStarted","Data":"efbc849d3cd16334e734e6fbd93373a3ece27ed040188b8787ca534be2a1db86"} Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.141676 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t2gnl" event={"ID":"a274d4f7-f741-48cc-9c34-8e2805ad66e3","Type":"ContainerStarted","Data":"45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603"} Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.141697 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t2gnl" event={"ID":"a274d4f7-f741-48cc-9c34-8e2805ad66e3","Type":"ContainerStarted","Data":"d0201c7480def2ead3d7aeea472f0d748595675d29208d5792b7d4c2c194ac19"} Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.143628 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.163738 4947 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.184590 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.223505 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzjtz\" (UniqueName: 
\"kubernetes.io/projected/a9008fed-2c05-4a53-a6f8-9a407213b236-kube-api-access-xzjtz\") pod \"authentication-operator-69f744f599-jx4vv\" (UID: \"a9008fed-2c05-4a53-a6f8-9a407213b236\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.240253 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bs75\" (UniqueName: \"kubernetes.io/projected/a0c9c8ba-d665-479d-8b23-5f6150a8ccdf-kube-api-access-9bs75\") pod \"machine-config-operator-74547568cd-mpffx\" (UID: \"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.260694 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-767c6\" (UniqueName: \"kubernetes.io/projected/1fecc097-e805-479d-9d1a-c03d13013851-kube-api-access-767c6\") pod \"machine-approver-56656f9798-grrm2\" (UID: \"1fecc097-e805-479d-9d1a-c03d13013851\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.278469 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww62k\" (UniqueName: \"kubernetes.io/projected/795afb6d-557a-4f06-9772-0af88a287913-kube-api-access-ww62k\") pod \"machine-config-controller-84d6567774-nm4hw\" (UID: \"795afb6d-557a-4f06-9772-0af88a287913\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.291264 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.300421 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz9dj\" (UniqueName: \"kubernetes.io/projected/8fa054f2-39f2-479b-915f-b2ceeab282d6-kube-api-access-kz9dj\") pod \"oauth-openshift-558db77b4-fj5r4\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.304787 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.311625 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.321273 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh8wq\" (UniqueName: \"kubernetes.io/projected/8896266b-64eb-434e-b0cf-6a57510bf439-kube-api-access-fh8wq\") pod \"router-default-5444994796-xrkt7\" (UID: \"8896266b-64eb-434e-b0cf-6a57510bf439\") " pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.338628 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2wht\" (UniqueName: \"kubernetes.io/projected/dabcbb38-cce7-404a-ab4b-aa81c4ceebb7-kube-api-access-w2wht\") pod \"apiserver-7bbb656c7d-t2pzb\" (UID: \"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.349338 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.363853 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cggp5\" (UniqueName: \"kubernetes.io/projected/7dbf214b-d90a-41ff-8e46-694452ef42b9-kube-api-access-cggp5\") pod \"dns-operator-744455d44c-vtmrf\" (UID: \"7dbf214b-d90a-41ff-8e46-694452ef42b9\") " pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.380418 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmkwn\" (UniqueName: \"kubernetes.io/projected/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-kube-api-access-bmkwn\") pod \"controller-manager-879f6c89f-v9zdx\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.384392 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.393115 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.404345 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdj4z\" (UniqueName: \"kubernetes.io/projected/d8aef45b-e02c-4a5e-a154-f4814776a263-kube-api-access-jdj4z\") pod \"openshift-apiserver-operator-796bbdcf4f-xnrcs\" (UID: \"d8aef45b-e02c-4a5e-a154-f4814776a263\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.427212 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5nj4\" (UniqueName: \"kubernetes.io/projected/28961044-3c25-4750-9703-1f19edee14cd-kube-api-access-m5nj4\") pod \"cluster-samples-operator-665b6dd947-jtch8\" (UID: \"28961044-3c25-4750-9703-1f19edee14cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.436682 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvbdc\" (UniqueName: \"kubernetes.io/projected/5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0-kube-api-access-kvbdc\") pod \"openshift-controller-manager-operator-756b6f6bc6-h6bsr\" (UID: \"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.464745 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.466638 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlrx2\" (UniqueName: \"kubernetes.io/projected/2feb0620-965a-4f0e-98fd-bccefd87053b-kube-api-access-mlrx2\") pod \"route-controller-manager-6576b87f9c-lgg8q\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.478815 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.480771 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2963ae1e-2e90-492a-a52a-4ca04bf481cc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-jl2zz\" (UID: \"2963ae1e-2e90-492a-a52a-4ca04bf481cc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.498267 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.504121 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2bsj\" (UniqueName: \"kubernetes.io/projected/d8466264-98ee-4866-bce5-0cdc13613446-kube-api-access-r2bsj\") pod \"etcd-operator-b45778765-fncmn\" (UID: \"d8466264-98ee-4866-bce5-0cdc13613446\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.518541 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.541179 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.544552 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.551628 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fj5r4"] Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.564514 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.578689 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.580366 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx"] Dec 03 06:51:17 crc kubenswrapper[4947]: W1203 06:51:17.613289 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c9c8ba_d665_479d_8b23_5f6150a8ccdf.slice/crio-f22bed079c374093bcbe1860bac0b4d165bdb0ab69ca133d43f81ac969d44469 WatchSource:0}: Error finding container f22bed079c374093bcbe1860bac0b4d165bdb0ab69ca133d43f81ac969d44469: Status 404 returned error can't find the container with id f22bed079c374093bcbe1860bac0b4d165bdb0ab69ca133d43f81ac969d44469 Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625025 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0806f4cc-cda3-4c9e-a112-29d36cf3b596-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qlvpr\" (UID: \"0806f4cc-cda3-4c9e-a112-29d36cf3b596\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625081 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625105 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b82a6de-d75a-462f-9b68-105a28a52e28-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625131 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-trusted-ca\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625157 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-bound-sa-token\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625186 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxww8\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-kube-api-access-xxww8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625204 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b82a6de-d75a-462f-9b68-105a28a52e28-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625224 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272dc0cc-9856-4375-b9f3-0ce1b543f30f-config\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625241 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0806f4cc-cda3-4c9e-a112-29d36cf3b596-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qlvpr\" (UID: \"0806f4cc-cda3-4c9e-a112-29d36cf3b596\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625354 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-certificates\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625419 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-tls\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625478 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/272dc0cc-9856-4375-b9f3-0ce1b543f30f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: E1203 06:51:17.625510 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.125482633 +0000 UTC m=+139.386437059 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625536 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0806f4cc-cda3-4c9e-a112-29d36cf3b596-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qlvpr\" (UID: \"0806f4cc-cda3-4c9e-a112-29d36cf3b596\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625554 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k29v7\" (UniqueName: \"kubernetes.io/projected/272dc0cc-9856-4375-b9f3-0ce1b543f30f-kube-api-access-k29v7\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625574 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/272dc0cc-9856-4375-b9f3-0ce1b543f30f-images\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.625850 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.632275 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.699939 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.726430 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:17 crc kubenswrapper[4947]: E1203 06:51:17.726661 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.226625265 +0000 UTC m=+139.487579721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.726708 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5b257084-363a-41ed-9bc8-838592867c51-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6wmgx\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.726752 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-registration-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.726786 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-certificates\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.726861 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-tls\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.726889 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a84abb48-f81f-4b6d-b7f0-9b4b1467add5-profile-collector-cert\") pod \"catalog-operator-68c6474976-5d7cs\" (UID: \"a84abb48-f81f-4b6d-b7f0-9b4b1467add5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.726913 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-plugins-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.726938 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwfkp\" (UniqueName: \"kubernetes.io/projected/a887f13f-644f-497d-9105-6afdb38753b4-kube-api-access-fwfkp\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.726994 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e34af75-3f73-476f-ab59-40eff1841e44-metrics-tls\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 
06:51:17.727051 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26ec81fa-20d5-4bc6-bd9f-51da98daec36-config-volume\") pod \"dns-default-vbqcq\" (UID: \"26ec81fa-20d5-4bc6-bd9f-51da98daec36\") " pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727144 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl2g2\" (UniqueName: \"kubernetes.io/projected/2e34af75-3f73-476f-ab59-40eff1841e44-kube-api-access-cl2g2\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727226 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/611d9fbd-329b-4743-9e87-25c99feaf35d-certs\") pod \"machine-config-server-7kdr7\" (UID: \"611d9fbd-329b-4743-9e87-25c99feaf35d\") " pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727261 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/749fbe4c-7c59-46bb-bdc0-f230bf72795c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727374 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnmk8\" (UniqueName: \"kubernetes.io/projected/59137370-0efe-42ca-82dc-f42ed6c69c37-kube-api-access-mnmk8\") pod 
\"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727418 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b82a6de-d75a-462f-9b68-105a28a52e28-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727466 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-trusted-ca\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727483 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/59137370-0efe-42ca-82dc-f42ed6c69c37-tmpfs\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727536 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5jr\" (UniqueName: \"kubernetes.io/projected/22f2942c-3081-4a6e-931d-47a98367d2ac-kube-api-access-lj5jr\") pod \"ingress-canary-p2j8p\" (UID: \"22f2942c-3081-4a6e-931d-47a98367d2ac\") " pod="openshift-ingress-canary/ingress-canary-p2j8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727555 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w4bhg\" (UniqueName: \"kubernetes.io/projected/81985cd6-f5a4-46f3-9c71-448dc4f3bee6-kube-api-access-w4bhg\") pod \"control-plane-machine-set-operator-78cbb6b69f-ljd4p\" (UID: \"81985cd6-f5a4-46f3-9c71-448dc4f3bee6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727573 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad418cca-2ab3-4e04-89f0-b547ceae6aeb-serving-cert\") pod \"service-ca-operator-777779d784-vpwl6\" (UID: \"ad418cca-2ab3-4e04-89f0-b547ceae6aeb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727594 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9c0cc40-7389-4703-bf34-d42b6bf32710-config-volume\") pod \"collect-profiles-29412405-w524p\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727670 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e34af75-3f73-476f-ab59-40eff1841e44-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727701 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-socket-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " 
pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727721 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6lkj\" (UniqueName: \"kubernetes.io/projected/e85f57d3-31a1-41a1-a6dc-28c9e6803d5b-kube-api-access-x6lkj\") pod \"package-server-manager-789f6589d5-8wwl8\" (UID: \"e85f57d3-31a1-41a1-a6dc-28c9e6803d5b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727738 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e34af75-3f73-476f-ab59-40eff1841e44-trusted-ca\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727759 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-bound-sa-token\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727790 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ab5990-c7ce-4016-bbad-88d7357e4bb5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qgl4w\" (UID: \"77ab5990-c7ce-4016-bbad-88d7357e4bb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727820 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" 
(UniqueName: \"kubernetes.io/secret/bcc537bc-9041-4e21-b05b-1d681df333d0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rzd6v\" (UID: \"bcc537bc-9041-4e21-b05b-1d681df333d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727840 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17696519-84ea-49b0-8a0a-ba85c071f427-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vcl8p\" (UID: \"17696519-84ea-49b0-8a0a-ba85c071f427\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727870 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdst\" (UniqueName: \"kubernetes.io/projected/bcc537bc-9041-4e21-b05b-1d681df333d0-kube-api-access-6tdst\") pod \"multus-admission-controller-857f4d67dd-rzd6v\" (UID: \"bcc537bc-9041-4e21-b05b-1d681df333d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727933 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxww8\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-kube-api-access-xxww8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727949 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2ce74b93-fe18-4974-ae87-f26e76ed39f8-srv-cert\") pod \"olm-operator-6b444d44fb-56tl4\" (UID: \"2ce74b93-fe18-4974-ae87-f26e76ed39f8\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.727966 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/749fbe4c-7c59-46bb-bdc0-f230bf72795c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.728016 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/59137370-0efe-42ca-82dc-f42ed6c69c37-apiservice-cert\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.728596 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77ab5990-c7ce-4016-bbad-88d7357e4bb5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qgl4w\" (UID: \"77ab5990-c7ce-4016-bbad-88d7357e4bb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.728650 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b82a6de-d75a-462f-9b68-105a28a52e28-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.728685 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-mountpoint-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.728712 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgwlx\" (UniqueName: \"kubernetes.io/projected/749fbe4c-7c59-46bb-bdc0-f230bf72795c-kube-api-access-dgwlx\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.728742 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22f2942c-3081-4a6e-931d-47a98367d2ac-cert\") pod \"ingress-canary-p2j8p\" (UID: \"22f2942c-3081-4a6e-931d-47a98367d2ac\") " pod="openshift-ingress-canary/ingress-canary-p2j8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.728942 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/272dc0cc-9856-4375-b9f3-0ce1b543f30f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.728968 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6v9x\" (UniqueName: \"kubernetes.io/projected/5b257084-363a-41ed-9bc8-838592867c51-kube-api-access-p6v9x\") pod \"marketplace-operator-79b997595-6wmgx\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729003 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-csi-data-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729027 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwczf\" (UniqueName: \"kubernetes.io/projected/ad418cca-2ab3-4e04-89f0-b547ceae6aeb-kube-api-access-fwczf\") pod \"service-ca-operator-777779d784-vpwl6\" (UID: \"ad418cca-2ab3-4e04-89f0-b547ceae6aeb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729050 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmwcn\" (UniqueName: \"kubernetes.io/projected/8bfce152-31c1-4d42-b91e-acf33d4dee46-kube-api-access-jmwcn\") pod \"migrator-59844c95c7-vm492\" (UID: \"8bfce152-31c1-4d42-b91e-acf33d4dee46\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729097 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0806f4cc-cda3-4c9e-a112-29d36cf3b596-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qlvpr\" (UID: \"0806f4cc-cda3-4c9e-a112-29d36cf3b596\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729122 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k29v7\" (UniqueName: 
\"kubernetes.io/projected/272dc0cc-9856-4375-b9f3-0ce1b543f30f-kube-api-access-k29v7\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729164 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b257084-363a-41ed-9bc8-838592867c51-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6wmgx\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729190 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cnh\" (UniqueName: \"kubernetes.io/projected/17696519-84ea-49b0-8a0a-ba85c071f427-kube-api-access-85cnh\") pod \"kube-storage-version-migrator-operator-b67b599dd-vcl8p\" (UID: \"17696519-84ea-49b0-8a0a-ba85c071f427\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729227 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/272dc0cc-9856-4375-b9f3-0ce1b543f30f-images\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729260 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr92h\" (UniqueName: \"kubernetes.io/projected/2ce74b93-fe18-4974-ae87-f26e76ed39f8-kube-api-access-jr92h\") pod \"olm-operator-6b444d44fb-56tl4\" (UID: \"2ce74b93-fe18-4974-ae87-f26e76ed39f8\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729284 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad418cca-2ab3-4e04-89f0-b547ceae6aeb-config\") pod \"service-ca-operator-777779d784-vpwl6\" (UID: \"ad418cca-2ab3-4e04-89f0-b547ceae6aeb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729287 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-certificates\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729321 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/81985cd6-f5a4-46f3-9c71-448dc4f3bee6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ljd4p\" (UID: \"81985cd6-f5a4-46f3-9c71-448dc4f3bee6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729347 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/87cd9080-a444-42fa-aa94-7c2912690882-signing-key\") pod \"service-ca-9c57cc56f-d44cn\" (UID: \"87cd9080-a444-42fa-aa94-7c2912690882\") " pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729398 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/59137370-0efe-42ca-82dc-f42ed6c69c37-webhook-cert\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729585 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9c0cc40-7389-4703-bf34-d42b6bf32710-secret-volume\") pod \"collect-profiles-29412405-w524p\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729646 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0806f4cc-cda3-4c9e-a112-29d36cf3b596-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qlvpr\" (UID: \"0806f4cc-cda3-4c9e-a112-29d36cf3b596\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729665 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17696519-84ea-49b0-8a0a-ba85c071f427-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vcl8p\" (UID: \"17696519-84ea-49b0-8a0a-ba85c071f427\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729683 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ab5990-c7ce-4016-bbad-88d7357e4bb5-config\") pod \"kube-apiserver-operator-766d6c64bb-qgl4w\" (UID: 
\"77ab5990-c7ce-4016-bbad-88d7357e4bb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729706 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729726 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s89t6\" (UniqueName: \"kubernetes.io/projected/a84abb48-f81f-4b6d-b7f0-9b4b1467add5-kube-api-access-s89t6\") pod \"catalog-operator-68c6474976-5d7cs\" (UID: \"a84abb48-f81f-4b6d-b7f0-9b4b1467add5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729771 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/87cd9080-a444-42fa-aa94-7c2912690882-signing-cabundle\") pod \"service-ca-9c57cc56f-d44cn\" (UID: \"87cd9080-a444-42fa-aa94-7c2912690882\") " pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729894 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26ec81fa-20d5-4bc6-bd9f-51da98daec36-metrics-tls\") pod \"dns-default-vbqcq\" (UID: \"26ec81fa-20d5-4bc6-bd9f-51da98daec36\") " pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729930 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6c4lq\" (UniqueName: \"kubernetes.io/projected/611d9fbd-329b-4743-9e87-25c99feaf35d-kube-api-access-6c4lq\") pod \"machine-config-server-7kdr7\" (UID: \"611d9fbd-329b-4743-9e87-25c99feaf35d\") " pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729968 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2ce74b93-fe18-4974-ae87-f26e76ed39f8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-56tl4\" (UID: \"2ce74b93-fe18-4974-ae87-f26e76ed39f8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.729987 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnh9s\" (UniqueName: \"kubernetes.io/projected/26ec81fa-20d5-4bc6-bd9f-51da98daec36-kube-api-access-jnh9s\") pod \"dns-default-vbqcq\" (UID: \"26ec81fa-20d5-4bc6-bd9f-51da98daec36\") " pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.730039 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/749fbe4c-7c59-46bb-bdc0-f230bf72795c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.730216 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a84abb48-f81f-4b6d-b7f0-9b4b1467add5-srv-cert\") pod \"catalog-operator-68c6474976-5d7cs\" (UID: \"a84abb48-f81f-4b6d-b7f0-9b4b1467add5\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.730242 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/611d9fbd-329b-4743-9e87-25c99feaf35d-node-bootstrap-token\") pod \"machine-config-server-7kdr7\" (UID: \"611d9fbd-329b-4743-9e87-25c99feaf35d\") " pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.730321 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272dc0cc-9856-4375-b9f3-0ce1b543f30f-config\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.730351 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8nk6\" (UniqueName: \"kubernetes.io/projected/b9c0cc40-7389-4703-bf34-d42b6bf32710-kube-api-access-j8nk6\") pod \"collect-profiles-29412405-w524p\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.730404 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0806f4cc-cda3-4c9e-a112-29d36cf3b596-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qlvpr\" (UID: \"0806f4cc-cda3-4c9e-a112-29d36cf3b596\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.730424 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e85f57d3-31a1-41a1-a6dc-28c9e6803d5b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8wwl8\" (UID: \"e85f57d3-31a1-41a1-a6dc-28c9e6803d5b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.730444 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxknc\" (UniqueName: \"kubernetes.io/projected/87cd9080-a444-42fa-aa94-7c2912690882-kube-api-access-hxknc\") pod \"service-ca-9c57cc56f-d44cn\" (UID: \"87cd9080-a444-42fa-aa94-7c2912690882\") " pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.730482 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0806f4cc-cda3-4c9e-a112-29d36cf3b596-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qlvpr\" (UID: \"0806f4cc-cda3-4c9e-a112-29d36cf3b596\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.734421 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-trusted-ca\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.731561 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/272dc0cc-9856-4375-b9f3-0ce1b543f30f-images\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 
crc kubenswrapper[4947]: I1203 06:51:17.736037 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b82a6de-d75a-462f-9b68-105a28a52e28-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.736132 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v9zdx"] Dec 03 06:51:17 crc kubenswrapper[4947]: E1203 06:51:17.737033 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.237018158 +0000 UTC m=+139.497972584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.742381 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/272dc0cc-9856-4375-b9f3-0ce1b543f30f-config\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.744902 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/272dc0cc-9856-4375-b9f3-0ce1b543f30f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.745068 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-tls\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.746238 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0806f4cc-cda3-4c9e-a112-29d36cf3b596-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qlvpr\" (UID: \"0806f4cc-cda3-4c9e-a112-29d36cf3b596\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.746256 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b82a6de-d75a-462f-9b68-105a28a52e28-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.760929 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-bound-sa-token\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.779253 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k29v7\" (UniqueName: \"kubernetes.io/projected/272dc0cc-9856-4375-b9f3-0ce1b543f30f-kube-api-access-k29v7\") pod \"machine-api-operator-5694c8668f-klrbv\" (UID: \"272dc0cc-9856-4375-b9f3-0ce1b543f30f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.818139 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0806f4cc-cda3-4c9e-a112-29d36cf3b596-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-qlvpr\" (UID: \"0806f4cc-cda3-4c9e-a112-29d36cf3b596\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832276 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832464 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-plugins-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832530 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwfkp\" (UniqueName: \"kubernetes.io/projected/a887f13f-644f-497d-9105-6afdb38753b4-kube-api-access-fwfkp\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 
03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832554 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a84abb48-f81f-4b6d-b7f0-9b4b1467add5-profile-collector-cert\") pod \"catalog-operator-68c6474976-5d7cs\" (UID: \"a84abb48-f81f-4b6d-b7f0-9b4b1467add5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832578 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e34af75-3f73-476f-ab59-40eff1841e44-metrics-tls\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: E1203 06:51:17.832599 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.33258039 +0000 UTC m=+139.593534816 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832642 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26ec81fa-20d5-4bc6-bd9f-51da98daec36-config-volume\") pod \"dns-default-vbqcq\" (UID: \"26ec81fa-20d5-4bc6-bd9f-51da98daec36\") " pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832670 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl2g2\" (UniqueName: \"kubernetes.io/projected/2e34af75-3f73-476f-ab59-40eff1841e44-kube-api-access-cl2g2\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832694 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/611d9fbd-329b-4743-9e87-25c99feaf35d-certs\") pod \"machine-config-server-7kdr7\" (UID: \"611d9fbd-329b-4743-9e87-25c99feaf35d\") " pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832710 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/749fbe4c-7c59-46bb-bdc0-f230bf72795c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: 
\"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832730 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnmk8\" (UniqueName: \"kubernetes.io/projected/59137370-0efe-42ca-82dc-f42ed6c69c37-kube-api-access-mnmk8\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832752 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/59137370-0efe-42ca-82dc-f42ed6c69c37-tmpfs\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832767 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5jr\" (UniqueName: \"kubernetes.io/projected/22f2942c-3081-4a6e-931d-47a98367d2ac-kube-api-access-lj5jr\") pod \"ingress-canary-p2j8p\" (UID: \"22f2942c-3081-4a6e-931d-47a98367d2ac\") " pod="openshift-ingress-canary/ingress-canary-p2j8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832783 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad418cca-2ab3-4e04-89f0-b547ceae6aeb-serving-cert\") pod \"service-ca-operator-777779d784-vpwl6\" (UID: \"ad418cca-2ab3-4e04-89f0-b547ceae6aeb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832798 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b9c0cc40-7389-4703-bf34-d42b6bf32710-config-volume\") pod \"collect-profiles-29412405-w524p\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832813 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4bhg\" (UniqueName: \"kubernetes.io/projected/81985cd6-f5a4-46f3-9c71-448dc4f3bee6-kube-api-access-w4bhg\") pod \"control-plane-machine-set-operator-78cbb6b69f-ljd4p\" (UID: \"81985cd6-f5a4-46f3-9c71-448dc4f3bee6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832831 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e34af75-3f73-476f-ab59-40eff1841e44-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832846 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-socket-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832862 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e34af75-3f73-476f-ab59-40eff1841e44-trusted-ca\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832877 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6lkj\" (UniqueName: \"kubernetes.io/projected/e85f57d3-31a1-41a1-a6dc-28c9e6803d5b-kube-api-access-x6lkj\") pod \"package-server-manager-789f6589d5-8wwl8\" (UID: \"e85f57d3-31a1-41a1-a6dc-28c9e6803d5b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832895 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ab5990-c7ce-4016-bbad-88d7357e4bb5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qgl4w\" (UID: \"77ab5990-c7ce-4016-bbad-88d7357e4bb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832911 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcc537bc-9041-4e21-b05b-1d681df333d0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rzd6v\" (UID: \"bcc537bc-9041-4e21-b05b-1d681df333d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832926 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17696519-84ea-49b0-8a0a-ba85c071f427-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vcl8p\" (UID: \"17696519-84ea-49b0-8a0a-ba85c071f427\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832949 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdst\" (UniqueName: \"kubernetes.io/projected/bcc537bc-9041-4e21-b05b-1d681df333d0-kube-api-access-6tdst\") pod 
\"multus-admission-controller-857f4d67dd-rzd6v\" (UID: \"bcc537bc-9041-4e21-b05b-1d681df333d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832969 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2ce74b93-fe18-4974-ae87-f26e76ed39f8-srv-cert\") pod \"olm-operator-6b444d44fb-56tl4\" (UID: \"2ce74b93-fe18-4974-ae87-f26e76ed39f8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.832987 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/749fbe4c-7c59-46bb-bdc0-f230bf72795c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833001 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/59137370-0efe-42ca-82dc-f42ed6c69c37-apiservice-cert\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833017 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77ab5990-c7ce-4016-bbad-88d7357e4bb5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qgl4w\" (UID: \"77ab5990-c7ce-4016-bbad-88d7357e4bb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833034 4947 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-mountpoint-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833049 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgwlx\" (UniqueName: \"kubernetes.io/projected/749fbe4c-7c59-46bb-bdc0-f230bf72795c-kube-api-access-dgwlx\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833065 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22f2942c-3081-4a6e-931d-47a98367d2ac-cert\") pod \"ingress-canary-p2j8p\" (UID: \"22f2942c-3081-4a6e-931d-47a98367d2ac\") " pod="openshift-ingress-canary/ingress-canary-p2j8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833093 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6v9x\" (UniqueName: \"kubernetes.io/projected/5b257084-363a-41ed-9bc8-838592867c51-kube-api-access-p6v9x\") pod \"marketplace-operator-79b997595-6wmgx\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833116 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-csi-data-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833133 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwczf\" (UniqueName: \"kubernetes.io/projected/ad418cca-2ab3-4e04-89f0-b547ceae6aeb-kube-api-access-fwczf\") pod \"service-ca-operator-777779d784-vpwl6\" (UID: \"ad418cca-2ab3-4e04-89f0-b547ceae6aeb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833147 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmwcn\" (UniqueName: \"kubernetes.io/projected/8bfce152-31c1-4d42-b91e-acf33d4dee46-kube-api-access-jmwcn\") pod \"migrator-59844c95c7-vm492\" (UID: \"8bfce152-31c1-4d42-b91e-acf33d4dee46\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833163 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b257084-363a-41ed-9bc8-838592867c51-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6wmgx\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833179 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85cnh\" (UniqueName: \"kubernetes.io/projected/17696519-84ea-49b0-8a0a-ba85c071f427-kube-api-access-85cnh\") pod \"kube-storage-version-migrator-operator-b67b599dd-vcl8p\" (UID: \"17696519-84ea-49b0-8a0a-ba85c071f427\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833194 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr92h\" (UniqueName: \"kubernetes.io/projected/2ce74b93-fe18-4974-ae87-f26e76ed39f8-kube-api-access-jr92h\") pod 
\"olm-operator-6b444d44fb-56tl4\" (UID: \"2ce74b93-fe18-4974-ae87-f26e76ed39f8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833208 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad418cca-2ab3-4e04-89f0-b547ceae6aeb-config\") pod \"service-ca-operator-777779d784-vpwl6\" (UID: \"ad418cca-2ab3-4e04-89f0-b547ceae6aeb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833224 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/87cd9080-a444-42fa-aa94-7c2912690882-signing-key\") pod \"service-ca-9c57cc56f-d44cn\" (UID: \"87cd9080-a444-42fa-aa94-7c2912690882\") " pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833240 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/81985cd6-f5a4-46f3-9c71-448dc4f3bee6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ljd4p\" (UID: \"81985cd6-f5a4-46f3-9c71-448dc4f3bee6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833258 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/59137370-0efe-42ca-82dc-f42ed6c69c37-webhook-cert\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833274 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9c0cc40-7389-4703-bf34-d42b6bf32710-secret-volume\") pod \"collect-profiles-29412405-w524p\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833292 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17696519-84ea-49b0-8a0a-ba85c071f427-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vcl8p\" (UID: \"17696519-84ea-49b0-8a0a-ba85c071f427\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833282 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-plugins-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833310 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ab5990-c7ce-4016-bbad-88d7357e4bb5-config\") pod \"kube-apiserver-operator-766d6c64bb-qgl4w\" (UID: \"77ab5990-c7ce-4016-bbad-88d7357e4bb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833426 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s89t6\" (UniqueName: \"kubernetes.io/projected/a84abb48-f81f-4b6d-b7f0-9b4b1467add5-kube-api-access-s89t6\") pod \"catalog-operator-68c6474976-5d7cs\" (UID: \"a84abb48-f81f-4b6d-b7f0-9b4b1467add5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 
06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833474 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833514 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/87cd9080-a444-42fa-aa94-7c2912690882-signing-cabundle\") pod \"service-ca-9c57cc56f-d44cn\" (UID: \"87cd9080-a444-42fa-aa94-7c2912690882\") " pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833544 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26ec81fa-20d5-4bc6-bd9f-51da98daec36-metrics-tls\") pod \"dns-default-vbqcq\" (UID: \"26ec81fa-20d5-4bc6-bd9f-51da98daec36\") " pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833566 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c4lq\" (UniqueName: \"kubernetes.io/projected/611d9fbd-329b-4743-9e87-25c99feaf35d-kube-api-access-6c4lq\") pod \"machine-config-server-7kdr7\" (UID: \"611d9fbd-329b-4743-9e87-25c99feaf35d\") " pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833596 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2ce74b93-fe18-4974-ae87-f26e76ed39f8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-56tl4\" (UID: \"2ce74b93-fe18-4974-ae87-f26e76ed39f8\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833618 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnh9s\" (UniqueName: \"kubernetes.io/projected/26ec81fa-20d5-4bc6-bd9f-51da98daec36-kube-api-access-jnh9s\") pod \"dns-default-vbqcq\" (UID: \"26ec81fa-20d5-4bc6-bd9f-51da98daec36\") " pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833651 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/749fbe4c-7c59-46bb-bdc0-f230bf72795c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833683 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a84abb48-f81f-4b6d-b7f0-9b4b1467add5-srv-cert\") pod \"catalog-operator-68c6474976-5d7cs\" (UID: \"a84abb48-f81f-4b6d-b7f0-9b4b1467add5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833703 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/611d9fbd-329b-4743-9e87-25c99feaf35d-node-bootstrap-token\") pod \"machine-config-server-7kdr7\" (UID: \"611d9fbd-329b-4743-9e87-25c99feaf35d\") " pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833749 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8nk6\" (UniqueName: \"kubernetes.io/projected/b9c0cc40-7389-4703-bf34-d42b6bf32710-kube-api-access-j8nk6\") pod 
\"collect-profiles-29412405-w524p\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833778 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e85f57d3-31a1-41a1-a6dc-28c9e6803d5b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8wwl8\" (UID: \"e85f57d3-31a1-41a1-a6dc-28c9e6803d5b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833803 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxknc\" (UniqueName: \"kubernetes.io/projected/87cd9080-a444-42fa-aa94-7c2912690882-kube-api-access-hxknc\") pod \"service-ca-9c57cc56f-d44cn\" (UID: \"87cd9080-a444-42fa-aa94-7c2912690882\") " pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833827 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5b257084-363a-41ed-9bc8-838592867c51-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6wmgx\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.833848 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-registration-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.834370 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ab5990-c7ce-4016-bbad-88d7357e4bb5-config\") pod \"kube-apiserver-operator-766d6c64bb-qgl4w\" (UID: \"77ab5990-c7ce-4016-bbad-88d7357e4bb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.835072 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/749fbe4c-7c59-46bb-bdc0-f230bf72795c-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.837945 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a84abb48-f81f-4b6d-b7f0-9b4b1467add5-srv-cert\") pod \"catalog-operator-68c6474976-5d7cs\" (UID: \"a84abb48-f81f-4b6d-b7f0-9b4b1467add5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:17 crc kubenswrapper[4947]: E1203 06:51:17.838029 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.338012647 +0000 UTC m=+139.598967073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.838179 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2e34af75-3f73-476f-ab59-40eff1841e44-metrics-tls\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.838621 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26ec81fa-20d5-4bc6-bd9f-51da98daec36-config-volume\") pod \"dns-default-vbqcq\" (UID: \"26ec81fa-20d5-4bc6-bd9f-51da98daec36\") " pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.838847 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-socket-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.839205 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/87cd9080-a444-42fa-aa94-7c2912690882-signing-cabundle\") pod \"service-ca-9c57cc56f-d44cn\" (UID: \"87cd9080-a444-42fa-aa94-7c2912690882\") " pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:17 
crc kubenswrapper[4947]: I1203 06:51:17.839844 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/59137370-0efe-42ca-82dc-f42ed6c69c37-tmpfs\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.841917 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b257084-363a-41ed-9bc8-838592867c51-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6wmgx\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.844820 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad418cca-2ab3-4e04-89f0-b547ceae6aeb-config\") pod \"service-ca-operator-777779d784-vpwl6\" (UID: \"ad418cca-2ab3-4e04-89f0-b547ceae6aeb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.845330 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e34af75-3f73-476f-ab59-40eff1841e44-trusted-ca\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.845706 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-csi-data-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 
crc kubenswrapper[4947]: I1203 06:51:17.845913 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17696519-84ea-49b0-8a0a-ba85c071f427-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vcl8p\" (UID: \"17696519-84ea-49b0-8a0a-ba85c071f427\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.846202 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/611d9fbd-329b-4743-9e87-25c99feaf35d-certs\") pod \"machine-config-server-7kdr7\" (UID: \"611d9fbd-329b-4743-9e87-25c99feaf35d\") " pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.847041 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-registration-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.847781 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9c0cc40-7389-4703-bf34-d42b6bf32710-config-volume\") pod \"collect-profiles-29412405-w524p\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.851350 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/a887f13f-644f-497d-9105-6afdb38753b4-mountpoint-dir\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " 
pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.859733 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2ce74b93-fe18-4974-ae87-f26e76ed39f8-srv-cert\") pod \"olm-operator-6b444d44fb-56tl4\" (UID: \"2ce74b93-fe18-4974-ae87-f26e76ed39f8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.861016 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a84abb48-f81f-4b6d-b7f0-9b4b1467add5-profile-collector-cert\") pod \"catalog-operator-68c6474976-5d7cs\" (UID: \"a84abb48-f81f-4b6d-b7f0-9b4b1467add5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.866966 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/59137370-0efe-42ca-82dc-f42ed6c69c37-apiservice-cert\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.874124 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw"] Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.875472 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2ce74b93-fe18-4974-ae87-f26e76ed39f8-profile-collector-cert\") pod \"olm-operator-6b444d44fb-56tl4\" (UID: \"2ce74b93-fe18-4974-ae87-f26e76ed39f8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.877223 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/611d9fbd-329b-4743-9e87-25c99feaf35d-node-bootstrap-token\") pod \"machine-config-server-7kdr7\" (UID: \"611d9fbd-329b-4743-9e87-25c99feaf35d\") " pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.878074 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26ec81fa-20d5-4bc6-bd9f-51da98daec36-metrics-tls\") pod \"dns-default-vbqcq\" (UID: \"26ec81fa-20d5-4bc6-bd9f-51da98daec36\") " pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.878715 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22f2942c-3081-4a6e-931d-47a98367d2ac-cert\") pod \"ingress-canary-p2j8p\" (UID: \"22f2942c-3081-4a6e-931d-47a98367d2ac\") " pod="openshift-ingress-canary/ingress-canary-p2j8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.878763 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/81985cd6-f5a4-46f3-9c71-448dc4f3bee6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ljd4p\" (UID: \"81985cd6-f5a4-46f3-9c71-448dc4f3bee6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.878871 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxww8\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-kube-api-access-xxww8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:17 crc kubenswrapper[4947]: 
I1203 06:51:17.879250 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/749fbe4c-7c59-46bb-bdc0-f230bf72795c-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.880893 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ab5990-c7ce-4016-bbad-88d7357e4bb5-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qgl4w\" (UID: \"77ab5990-c7ce-4016-bbad-88d7357e4bb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.881181 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bcc537bc-9041-4e21-b05b-1d681df333d0-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rzd6v\" (UID: \"bcc537bc-9041-4e21-b05b-1d681df333d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.885425 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17696519-84ea-49b0-8a0a-ba85c071f427-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vcl8p\" (UID: \"17696519-84ea-49b0-8a0a-ba85c071f427\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.885550 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/59137370-0efe-42ca-82dc-f42ed6c69c37-webhook-cert\") pod \"packageserver-d55dfcdfc-h5tqs\" 
(UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.886206 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9c0cc40-7389-4703-bf34-d42b6bf32710-secret-volume\") pod \"collect-profiles-29412405-w524p\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.886237 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5b257084-363a-41ed-9bc8-838592867c51-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6wmgx\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.886700 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad418cca-2ab3-4e04-89f0-b547ceae6aeb-serving-cert\") pod \"service-ca-operator-777779d784-vpwl6\" (UID: \"ad418cca-2ab3-4e04-89f0-b547ceae6aeb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.888034 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e85f57d3-31a1-41a1-a6dc-28c9e6803d5b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8wwl8\" (UID: \"e85f57d3-31a1-41a1-a6dc-28c9e6803d5b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.888962 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fwfkp\" (UniqueName: \"kubernetes.io/projected/a887f13f-644f-497d-9105-6afdb38753b4-kube-api-access-fwfkp\") pod \"csi-hostpathplugin-cf2xq\" (UID: \"a887f13f-644f-497d-9105-6afdb38753b4\") " pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.896436 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/87cd9080-a444-42fa-aa94-7c2912690882-signing-key\") pod \"service-ca-9c57cc56f-d44cn\" (UID: \"87cd9080-a444-42fa-aa94-7c2912690882\") " pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.897725 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/749fbe4c-7c59-46bb-bdc0-f230bf72795c-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.920272 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c4lq\" (UniqueName: \"kubernetes.io/projected/611d9fbd-329b-4743-9e87-25c99feaf35d-kube-api-access-6c4lq\") pod \"machine-config-server-7kdr7\" (UID: \"611d9fbd-329b-4743-9e87-25c99feaf35d\") " pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.935307 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:17 crc kubenswrapper[4947]: E1203 06:51:17.935925 4947 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.435901061 +0000 UTC m=+139.696855487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.959923 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.960709 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s89t6\" (UniqueName: \"kubernetes.io/projected/a84abb48-f81f-4b6d-b7f0-9b4b1467add5-kube-api-access-s89t6\") pod \"catalog-operator-68c6474976-5d7cs\" (UID: \"a84abb48-f81f-4b6d-b7f0-9b4b1467add5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.966813 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8"] Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.978522 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.980550 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4bhg\" (UniqueName: \"kubernetes.io/projected/81985cd6-f5a4-46f3-9c71-448dc4f3bee6-kube-api-access-w4bhg\") pod \"control-plane-machine-set-operator-78cbb6b69f-ljd4p\" (UID: \"81985cd6-f5a4-46f3-9c71-448dc4f3bee6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.992927 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl2g2\" (UniqueName: \"kubernetes.io/projected/2e34af75-3f73-476f-ab59-40eff1841e44-kube-api-access-cl2g2\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:17 crc kubenswrapper[4947]: I1203 06:51:17.996761 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e34af75-3f73-476f-ab59-40eff1841e44-bound-sa-token\") pod \"ingress-operator-5b745b69d9-vdtwv\" (UID: \"2e34af75-3f73-476f-ab59-40eff1841e44\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.013873 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.026375 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5jr\" (UniqueName: \"kubernetes.io/projected/22f2942c-3081-4a6e-931d-47a98367d2ac-kube-api-access-lj5jr\") pod \"ingress-canary-p2j8p\" (UID: \"22f2942c-3081-4a6e-931d-47a98367d2ac\") " pod="openshift-ingress-canary/ingress-canary-p2j8p" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.041973 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.042417 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.54240163 +0000 UTC m=+139.803356056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.056637 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnmk8\" (UniqueName: \"kubernetes.io/projected/59137370-0efe-42ca-82dc-f42ed6c69c37-kube-api-access-mnmk8\") pod \"packageserver-d55dfcdfc-h5tqs\" (UID: \"59137370-0efe-42ca-82dc-f42ed6c69c37\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.057378 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.064684 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnh9s\" (UniqueName: \"kubernetes.io/projected/26ec81fa-20d5-4bc6-bd9f-51da98daec36-kube-api-access-jnh9s\") pod \"dns-default-vbqcq\" (UID: \"26ec81fa-20d5-4bc6-bd9f-51da98daec36\") " pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.078856 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.080761 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.084977 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6lkj\" (UniqueName: \"kubernetes.io/projected/e85f57d3-31a1-41a1-a6dc-28c9e6803d5b-kube-api-access-x6lkj\") pod \"package-server-manager-789f6589d5-8wwl8\" (UID: \"e85f57d3-31a1-41a1-a6dc-28c9e6803d5b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.095017 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.100700 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/77ab5990-c7ce-4016-bbad-88d7357e4bb5-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qgl4w\" (UID: \"77ab5990-c7ce-4016-bbad-88d7357e4bb5\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.133922 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cnh\" (UniqueName: \"kubernetes.io/projected/17696519-84ea-49b0-8a0a-ba85c071f427-kube-api-access-85cnh\") pod \"kube-storage-version-migrator-operator-b67b599dd-vcl8p\" (UID: \"17696519-84ea-49b0-8a0a-ba85c071f427\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.134029 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.142431 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7kdr7" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.142690 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.142767 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.642747542 +0000 UTC m=+139.903701968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.143052 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.143364 4947 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.643351569 +0000 UTC m=+139.904305995 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.167395 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p2j8p" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.170445 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr92h\" (UniqueName: \"kubernetes.io/projected/2ce74b93-fe18-4974-ae87-f26e76ed39f8-kube-api-access-jr92h\") pod \"olm-operator-6b444d44fb-56tl4\" (UID: \"2ce74b93-fe18-4974-ae87-f26e76ed39f8\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.174730 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgwlx\" (UniqueName: \"kubernetes.io/projected/749fbe4c-7c59-46bb-bdc0-f230bf72795c-kube-api-access-dgwlx\") pod \"cluster-image-registry-operator-dc59b4c8b-dv6kr\" (UID: \"749fbe4c-7c59-46bb-bdc0-f230bf72795c\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.180367 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jx4vv"] Dec 03 06:51:18 crc 
kubenswrapper[4947]: I1203 06:51:18.182043 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" event={"ID":"8fa054f2-39f2-479b-915f-b2ceeab282d6","Type":"ContainerStarted","Data":"b8de4ef49fe8e8ef411375925bc60729f4a9b267984ef8e39cf34a3d8f2418a4"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.182680 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.205407 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6v9x\" (UniqueName: \"kubernetes.io/projected/5b257084-363a-41ed-9bc8-838592867c51-kube-api-access-p6v9x\") pod \"marketplace-operator-79b997595-6wmgx\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.208256 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb"] Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.208323 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr"] Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.208651 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdst\" (UniqueName: \"kubernetes.io/projected/bcc537bc-9041-4e21-b05b-1d681df333d0-kube-api-access-6tdst\") pod \"multus-admission-controller-857f4d67dd-rzd6v\" (UID: \"bcc537bc-9041-4e21-b05b-1d681df333d0\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.214972 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q"] Dec 03 06:51:18 crc kubenswrapper[4947]: 
I1203 06:51:18.216504 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" event={"ID":"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf","Type":"ContainerStarted","Data":"15beef97d7d62864016c2e9be3fd7834257d6bd65ea3e322512f3ff9f2e7c9b2"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.216541 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" event={"ID":"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf","Type":"ContainerStarted","Data":"f22bed079c374093bcbe1860bac0b4d165bdb0ab69ca133d43f81ac969d44469"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.220274 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxknc\" (UniqueName: \"kubernetes.io/projected/87cd9080-a444-42fa-aa94-7c2912690882-kube-api-access-hxknc\") pod \"service-ca-9c57cc56f-d44cn\" (UID: \"87cd9080-a444-42fa-aa94-7c2912690882\") " pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.237363 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" event={"ID":"eb688659-982c-44a4-8e59-b896de7a5e14","Type":"ContainerDied","Data":"a6d15359f456450a7d2a725b9382ad2a821ade31dddabe9bb8eea789fc6b3f89"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.237402 4947 generic.go:334] "Generic (PLEG): container finished" podID="eb688659-982c-44a4-8e59-b896de7a5e14" containerID="a6d15359f456450a7d2a725b9382ad2a821ade31dddabe9bb8eea789fc6b3f89" exitCode=0 Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.239299 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-wpnnz" event={"ID":"afeb9125-522c-4dea-93f1-d3ee2c58da87","Type":"ContainerStarted","Data":"7d8a74b9f3d40c1a9d322a39ee478d281acb6b6c982b32da215c16bb5eb86f16"} Dec 03 06:51:18 crc 
kubenswrapper[4947]: I1203 06:51:18.240617 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.241613 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwczf\" (UniqueName: \"kubernetes.io/projected/ad418cca-2ab3-4e04-89f0-b547ceae6aeb-kube-api-access-fwczf\") pod \"service-ca-operator-777779d784-vpwl6\" (UID: \"ad418cca-2ab3-4e04-89f0-b547ceae6aeb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.245621 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.260296 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs"] Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.260350 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vtmrf"] Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.264603 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8nk6\" (UniqueName: \"kubernetes.io/projected/b9c0cc40-7389-4703-bf34-d42b6bf32710-kube-api-access-j8nk6\") pod \"collect-profiles-29412405-w524p\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.268602 4947 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.768204827 +0000 UTC m=+140.029159253 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.279726 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-wpnnz" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.282771 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmwcn\" (UniqueName: \"kubernetes.io/projected/8bfce152-31c1-4d42-b91e-acf33d4dee46-kube-api-access-jmwcn\") pod \"migrator-59844c95c7-vm492\" (UID: \"8bfce152-31c1-4d42-b91e-acf33d4dee46\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.298173 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" event={"ID":"1fecc097-e805-479d-9d1a-c03d13013851","Type":"ContainerStarted","Data":"61d5fd73cac5705e1feeed9f97649db303e9049436bb1901a55b666caa1553cd"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.298238 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" 
event={"ID":"1fecc097-e805-479d-9d1a-c03d13013851","Type":"ContainerStarted","Data":"ff4406041e4d77ddd648ef8da2e95e451f83d7e2c55bc2ad8af68350e22a6b13"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.306843 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" event={"ID":"795afb6d-557a-4f06-9772-0af88a287913","Type":"ContainerStarted","Data":"2eaeb8e534c8afac150125b302ad9f1fdce8f8a7ecd05641ef41b20bb1481248"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.307416 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.310393 4947 generic.go:334] "Generic (PLEG): container finished" podID="5b695283-9aef-4604-b070-43d775a56972" containerID="143b290e5abf6181c92546273d302da03a0b7a6e2502d045b52a452e7a4f3f51" exitCode=0 Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.310468 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" event={"ID":"5b695283-9aef-4604-b070-43d775a56972","Type":"ContainerDied","Data":"143b290e5abf6181c92546273d302da03a0b7a6e2502d045b52a452e7a4f3f51"} Dec 03 06:51:18 crc kubenswrapper[4947]: W1203 06:51:18.344616 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2feb0620_965a_4f0e_98fd_bccefd87053b.slice/crio-c0af378ce26d1b027559f4d50bcd0bc1bc6faddd35e07aec645e46a75280af43 WatchSource:0}: Error finding container c0af378ce26d1b027559f4d50bcd0bc1bc6faddd35e07aec645e46a75280af43: Status 404 returned error can't find the container with id c0af378ce26d1b027559f4d50bcd0bc1bc6faddd35e07aec645e46a75280af43 Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.344896 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.345470 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.347684 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.349400 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz"] Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.351371 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.85135039 +0000 UTC m=+140.112304816 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.351949 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.368026 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.392923 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.404276 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.426797 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.437881 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" event={"ID":"8f3f9b27-48d3-4b7e-963e-e2829a4a659e","Type":"ContainerStarted","Data":"94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.437935 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" event={"ID":"8f3f9b27-48d3-4b7e-963e-e2829a4a659e","Type":"ContainerStarted","Data":"b9d4a57bf2a9a9d51960103e98caea8ebf1314c619c4784000186c79de3c2e16"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.438023 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.438145 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.439787 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.449785 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.452709 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:18.952672567 +0000 UTC m=+140.213626993 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.455361 4947 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-v9zdx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.455429 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" podUID="8f3f9b27-48d3-4b7e-963e-e2829a4a659e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.458423 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.458990 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 06:51:18.958967229 +0000 UTC m=+140.219921655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: W1203 06:51:18.461481 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5afa1ed2_5ac6_49e9_bbcb_87bf7afaf3b0.slice/crio-e56e85b3e04915fc699848d12b6611981c900ce25f77cc0099548b03251753e1 WatchSource:0}: Error finding container e56e85b3e04915fc699848d12b6611981c900ce25f77cc0099548b03251753e1: Status 404 returned error can't find the container with id e56e85b3e04915fc699848d12b6611981c900ce25f77cc0099548b03251753e1 Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.463672 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xrkt7" event={"ID":"8896266b-64eb-434e-b0cf-6a57510bf439","Type":"ContainerStarted","Data":"dd20a8b3653ad2d904153a9a94897e25674956d38def1ea7f22c29f1795792ec"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.463741 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xrkt7" event={"ID":"8896266b-64eb-434e-b0cf-6a57510bf439","Type":"ContainerStarted","Data":"7d86d6758b06c06b034a1fc5e66d57d336d2783f87218b77013df7a4830f4a9a"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.465889 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fncmn"] Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 
06:51:18.487226 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-flxrp" event={"ID":"a6afe02e-cf5b-41e2-ac72-d89e06576d76","Type":"ContainerStarted","Data":"1bbd2d0bf9d480db7504c2a4971f53a35affb6c3b99c6afcf46525363ea362a4"} Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.487273 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-flxrp" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.508850 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr"] Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.515452 4947 patch_prober.go:28] interesting pod/downloads-7954f5f757-flxrp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.515540 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-flxrp" podUID="a6afe02e-cf5b-41e2-ac72-d89e06576d76" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.560386 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.560553 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.060529423 +0000 UTC m=+140.321483849 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.560848 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.565924 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.065888429 +0000 UTC m=+140.326843025 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.662323 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.662567 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.162512989 +0000 UTC m=+140.423467435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.663332 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.663654 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.16363759 +0000 UTC m=+140.424592226 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.696411 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv"] Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.766091 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.766434 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.266419568 +0000 UTC m=+140.527373984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.868689 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.869450 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.369434512 +0000 UTC m=+140.630388938 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.969416 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-t2gnl" podStartSLOduration=117.969396163 podStartE2EDuration="1m57.969396163s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:18.969012072 +0000 UTC m=+140.229966498" watchObservedRunningTime="2025-12-03 06:51:18.969396163 +0000 UTC m=+140.230350589" Dec 03 06:51:18 crc kubenswrapper[4947]: I1203 06:51:18.974619 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:18 crc kubenswrapper[4947]: E1203 06:51:18.975481 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.475460028 +0000 UTC m=+140.736414454 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.018733 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-klrbv"] Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.081828 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:19 crc kubenswrapper[4947]: E1203 06:51:19.082157 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.582142942 +0000 UTC m=+140.843097368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.183406 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:19 crc kubenswrapper[4947]: E1203 06:51:19.184220 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.68420516 +0000 UTC m=+140.945159576 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.258818 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs"] Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.285014 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:19 crc kubenswrapper[4947]: E1203 06:51:19.285352 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.785338872 +0000 UTC m=+141.046293288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.386676 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.387940 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:19 crc kubenswrapper[4947]: E1203 06:51:19.388415 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:19.888394168 +0000 UTC m=+141.149348594 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.392114 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-cf2xq"] Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.393265 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs"] Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.427064 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p2j8p"] Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.429158 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8"] Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.489718 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:19 crc kubenswrapper[4947]: E1203 06:51:19.490302 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 06:51:19.990273721 +0000 UTC m=+141.251228147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.513286 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" event={"ID":"272dc0cc-9856-4375-b9f3-0ce1b543f30f","Type":"ContainerStarted","Data":"68ca7521f148461d55a941e2325025f6e45c4bca7fd40862012b1e071ceea344"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.519098 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" event={"ID":"7dbf214b-d90a-41ff-8e46-694452ef42b9","Type":"ContainerStarted","Data":"8c7b73a1b11e329e8f3c695b85fc1238a70104bb87a6073db5c6e0db9990127f"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.536144 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" event={"ID":"1fecc097-e805-479d-9d1a-c03d13013851","Type":"ContainerStarted","Data":"510629dc0e6847d06586343ffdd88355ee06548b059eea3223ba3ccf77394906"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.564018 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" event={"ID":"2963ae1e-2e90-492a-a52a-4ca04bf481cc","Type":"ContainerStarted","Data":"7fcff92005fa212d5990b7a87cd0b08093e0e5a0bdf1bf6857932422c1770117"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 
06:51:19.569518 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" event={"ID":"8fa054f2-39f2-479b-915f-b2ceeab282d6","Type":"ContainerStarted","Data":"8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.571566 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.579518 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" event={"ID":"28961044-3c25-4750-9703-1f19edee14cd","Type":"ContainerStarted","Data":"7059c7b25bb1098aea4161eaef95af157e6c15c7bd93f7ead27427b372576c15"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.582539 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" event={"ID":"795afb6d-557a-4f06-9772-0af88a287913","Type":"ContainerStarted","Data":"d0b07af09cc5696dcd757cde5b55a0463f6189fe928f77833bdd2fed1867a33c"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.588073 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" event={"ID":"d8aef45b-e02c-4a5e-a154-f4814776a263","Type":"ContainerStarted","Data":"d3027a1667e449b70831fa5ac4ba6a8d76713a6453b347f68dbe2202e75f1157"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.595324 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:19 crc kubenswrapper[4947]: E1203 06:51:19.595646 
4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.095629899 +0000 UTC m=+141.356584325 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.619547 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7kdr7" event={"ID":"611d9fbd-329b-4743-9e87-25c99feaf35d","Type":"ContainerStarted","Data":"c23fb78d794acdaa1cbcc1d81eae42d7f8ba9a66b4fbf101f763aa5a8bb9d156"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.630423 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" event={"ID":"0806f4cc-cda3-4c9e-a112-29d36cf3b596","Type":"ContainerStarted","Data":"400da84ed5662c7e3069e762f58c574dbe2821622ecd7bf099425b16642e1555"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.642164 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" event={"ID":"2e34af75-3f73-476f-ab59-40eff1841e44","Type":"ContainerStarted","Data":"6ab2570136a5224f8d6aa605211b738f4f2c2e5404f1e798e511c28a3e5a8505"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.664726 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" event={"ID":"2feb0620-965a-4f0e-98fd-bccefd87053b","Type":"ContainerStarted","Data":"25200c6735ef4a116925d48980ffc122b1c87aed09fd99259a9c5677fb9f5b80"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.664804 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" event={"ID":"2feb0620-965a-4f0e-98fd-bccefd87053b","Type":"ContainerStarted","Data":"c0af378ce26d1b027559f4d50bcd0bc1bc6faddd35e07aec645e46a75280af43"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.665307 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.669178 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" event={"ID":"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7","Type":"ContainerStarted","Data":"13014452f5ec1a4f1b0d9658b04e9dddf672e97911e48dec7f36f33c6f9433d9"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.673648 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" event={"ID":"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0","Type":"ContainerStarted","Data":"e56e85b3e04915fc699848d12b6611981c900ce25f77cc0099548b03251753e1"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.688244 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" event={"ID":"d8466264-98ee-4866-bce5-0cdc13613446","Type":"ContainerStarted","Data":"a9e0a0bfd1ff065cabaf138678cf057e7978303f97d65045b5ab7a22bfaae238"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.697439 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.699541 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" event={"ID":"a9008fed-2c05-4a53-a6f8-9a407213b236","Type":"ContainerStarted","Data":"cfb84f137625ba3c462ca4f46505512dee8d3b75ea17b410a7b824d107468af1"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.699586 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" event={"ID":"a9008fed-2c05-4a53-a6f8-9a407213b236","Type":"ContainerStarted","Data":"2fe64938787b4b7df3f779824913796ebd84286b01682f7758a30b0c6f25aae9"} Dec 03 06:51:19 crc kubenswrapper[4947]: E1203 06:51:19.700120 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.200094572 +0000 UTC m=+141.461049208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.708180 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" event={"ID":"a0c9c8ba-d665-479d-8b23-5f6150a8ccdf","Type":"ContainerStarted","Data":"dcebe61f902a98497068021941aabf85704b2aff817393a7fd80ecf99c03db04"} Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.786726 4947 patch_prober.go:28] interesting pod/downloads-7954f5f757-flxrp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" start-of-body= Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.786783 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-flxrp" podUID="a6afe02e-cf5b-41e2-ac72-d89e06576d76" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.6:8080/\": dial tcp 10.217.0.6:8080: connect: connection refused" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.799192 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:19 crc kubenswrapper[4947]: E1203 06:51:19.801348 4947 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.301322898 +0000 UTC m=+141.562277324 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.812827 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.824282 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:19 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:19 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:19 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.824383 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.852110 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" 
podStartSLOduration=118.852080849 podStartE2EDuration="1m58.852080849s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:19.812549453 +0000 UTC m=+141.073503889" watchObservedRunningTime="2025-12-03 06:51:19.852080849 +0000 UTC m=+141.113035265" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.899565 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-grrm2" podStartSLOduration=118.899546211 podStartE2EDuration="1m58.899546211s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:19.892297344 +0000 UTC m=+141.153251780" watchObservedRunningTime="2025-12-03 06:51:19.899546211 +0000 UTC m=+141.160500637" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.900932 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jx4vv" podStartSLOduration=118.900924419 podStartE2EDuration="1m58.900924419s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:19.85100339 +0000 UTC m=+141.111957826" watchObservedRunningTime="2025-12-03 06:51:19.900924419 +0000 UTC m=+141.161878845" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.902075 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:19 crc kubenswrapper[4947]: E1203 06:51:19.904186 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.404171387 +0000 UTC m=+141.665125933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.947185 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xrkt7" podStartSLOduration=117.947162037 podStartE2EDuration="1m57.947162037s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:19.930069812 +0000 UTC m=+141.191024238" watchObservedRunningTime="2025-12-03 06:51:19.947162037 +0000 UTC m=+141.208116463" Dec 03 06:51:19 crc kubenswrapper[4947]: I1203 06:51:19.996289 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" podStartSLOduration=118.996271344 podStartE2EDuration="1m58.996271344s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:19.983413794 +0000 UTC m=+141.244368220" 
watchObservedRunningTime="2025-12-03 06:51:19.996271344 +0000 UTC m=+141.257225760" Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.006034 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.006424 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.50641003 +0000 UTC m=+141.767364446 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.017643 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-d44cn"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.028202 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-flxrp" podStartSLOduration=119.028182943 podStartE2EDuration="1m59.028182943s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:20.027269167 +0000 UTC m=+141.288223613" 
watchObservedRunningTime="2025-12-03 06:51:20.028182943 +0000 UTC m=+141.289137369" Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.078107 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.111028 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.111660 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.611632614 +0000 UTC m=+141.872587040 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.117432 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-mpffx" podStartSLOduration=118.117404051 podStartE2EDuration="1m58.117404051s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:20.105434375 +0000 UTC m=+141.366388801" watchObservedRunningTime="2025-12-03 06:51:20.117404051 +0000 UTC m=+141.378358477" Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.118992 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wmgx"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.141359 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" podStartSLOduration=118.141332363 podStartE2EDuration="1m58.141332363s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:20.132903222 +0000 UTC m=+141.393857648" watchObservedRunningTime="2025-12-03 06:51:20.141332363 +0000 UTC m=+141.402286789" Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.190540 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console-operator/console-operator-58897d9998-wpnnz" podStartSLOduration=119.190519021 podStartE2EDuration="1m59.190519021s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:20.189535974 +0000 UTC m=+141.450490400" watchObservedRunningTime="2025-12-03 06:51:20.190519021 +0000 UTC m=+141.451473447" Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.211953 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.212165 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.712123969 +0000 UTC m=+141.973078395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.212279 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.212619 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.712605352 +0000 UTC m=+141.973559778 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.314419 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.315662 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.815641137 +0000 UTC m=+142.076595563 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.418322 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.418658 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:20.918646691 +0000 UTC m=+142.179601117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.419934 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:20 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:20 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:20 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.419965 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.456128 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.510939 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.524777 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.525346 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.025319004 +0000 UTC m=+142.286273430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.626266 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.626620 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.126607561 +0000 UTC m=+142.387561977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.727917 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.732077 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.232050361 +0000 UTC m=+142.493004787 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.732255 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.737062 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.237005456 +0000 UTC m=+142.497959882 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.747949 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.757074 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rzd6v"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.763419 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.764852 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p2j8p" event={"ID":"22f2942c-3081-4a6e-931d-47a98367d2ac","Type":"ContainerStarted","Data":"e6809e4ab97199bd3927a2a34d3b2882b737c7ae9c3c8dfc1ca56d99835834e3"} Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.771090 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.771609 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" event={"ID":"87cd9080-a444-42fa-aa94-7c2912690882","Type":"ContainerStarted","Data":"b7dbb6f95fe060d8373acb2851cfa93f64f9eb03c045eb988a5afd9159d41939"} Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.776985 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" event={"ID":"a887f13f-644f-497d-9105-6afdb38753b4","Type":"ContainerStarted","Data":"5401f6fafac3f05d5ec69d9024a1a7fda66d739acd7902e8c513d0c5f307a276"} Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.788325 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" event={"ID":"5b257084-363a-41ed-9bc8-838592867c51","Type":"ContainerStarted","Data":"1e0e4cf7af66f48e5f9b1da181f39364aedad0746b7ab44d8bac4be9173b97e4"} Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.819864 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vbqcq"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.825180 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.838343 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.838463 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.338445987 +0000 UTC m=+142.599400413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.838720 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.839015 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.339008513 +0000 UTC m=+142.599962939 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: W1203 06:51:20.846590 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad418cca_2ab3_4e04_89f0_b547ceae6aeb.slice/crio-2d6b33237d0200344c66528147a97c07d69eb2bfe34ce5ecef355da02dbcf5d5 WatchSource:0}: Error finding container 2d6b33237d0200344c66528147a97c07d69eb2bfe34ce5ecef355da02dbcf5d5: Status 404 returned error can't find the container with id 2d6b33237d0200344c66528147a97c07d69eb2bfe34ce5ecef355da02dbcf5d5 Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.850890 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" event={"ID":"e85f57d3-31a1-41a1-a6dc-28c9e6803d5b","Type":"ContainerStarted","Data":"79e56ca40cf628bfe14f4b2591dffc5b666dd3523e53eb242278c9a8a9ad60b2"} Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.862622 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" event={"ID":"59137370-0efe-42ca-82dc-f42ed6c69c37","Type":"ContainerStarted","Data":"d5efbc56377d31c4f3a37e750c37587fd223f6572405541fe3c62c80feee9587"} Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.884116 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" 
event={"ID":"81985cd6-f5a4-46f3-9c71-448dc4f3bee6","Type":"ContainerStarted","Data":"dd51c6b74b69713d0566de201d9b7b5aa37cb29d89279b6056bb5bb04bd8b435"} Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.889972 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" event={"ID":"a84abb48-f81f-4b6d-b7f0-9b4b1467add5","Type":"ContainerStarted","Data":"7fcdd5101f68b898bf389223e7068dec90ec237414ed59023fdf291507b79b40"} Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.939368 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.940251 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492"] Dec 03 06:51:20 crc kubenswrapper[4947]: E1203 06:51:20.944646 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.444600257 +0000 UTC m=+142.705554683 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.954600 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p"] Dec 03 06:51:20 crc kubenswrapper[4947]: I1203 06:51:20.991459 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p"] Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.042897 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.043291 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.543278433 +0000 UTC m=+142.804232849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.143781 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.144032 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.643992865 +0000 UTC m=+142.904947291 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.144406 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.144737 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.644722124 +0000 UTC m=+142.905676550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.253010 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.253590 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.753569947 +0000 UTC m=+143.014524373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.355359 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.357849 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.857828564 +0000 UTC m=+143.118782990 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.392907 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:21 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:21 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:21 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.392951 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.456859 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.457217 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 06:51:21.957190349 +0000 UTC m=+143.218144775 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.457643 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.458699 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:21.958651599 +0000 UTC m=+143.219606025 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.560200 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.560591 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.060577193 +0000 UTC m=+143.321531619 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.663517 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.664309 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.164296586 +0000 UTC m=+143.425251012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.766162 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.766550 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.266535919 +0000 UTC m=+143.527490345 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.870449 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.870946 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.370927351 +0000 UTC m=+143.631881777 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.938995 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" event={"ID":"87cd9080-a444-42fa-aa94-7c2912690882","Type":"ContainerStarted","Data":"42292123cfb91b271ea2939801e8ef670e466f74633519a2f633e1cd5e02f338"} Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.955751 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" event={"ID":"795afb6d-557a-4f06-9772-0af88a287913","Type":"ContainerStarted","Data":"ed8c67fdbb7b3fdd35a2779c60f91cf0e3c32b18be9f4a21793bbd3f5d9ec9b7"} Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.966686 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-d44cn" podStartSLOduration=119.966483242 podStartE2EDuration="1m59.966483242s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:21.962977377 +0000 UTC m=+143.223931803" watchObservedRunningTime="2025-12-03 06:51:21.966483242 +0000 UTC m=+143.227437658" Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.972452 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 06:51:22.472425683 +0000 UTC m=+143.733380109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.972336 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.973308 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:21 crc kubenswrapper[4947]: E1203 06:51:21.973687 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.473671497 +0000 UTC m=+143.734625923 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:21 crc kubenswrapper[4947]: I1203 06:51:21.981878 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" event={"ID":"bcc537bc-9041-4e21-b05b-1d681df333d0","Type":"ContainerStarted","Data":"7323cfb89ce82228a0facea79627d80bdb14105956e85afa4b75a8147fb9135a"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.008940 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nm4hw" podStartSLOduration=120.008920967 podStartE2EDuration="2m0.008920967s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.007274052 +0000 UTC m=+143.268228508" watchObservedRunningTime="2025-12-03 06:51:22.008920967 +0000 UTC m=+143.269875393" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.010348 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p2j8p" event={"ID":"22f2942c-3081-4a6e-931d-47a98367d2ac","Type":"ContainerStarted","Data":"1f72ea35b1850b03037a8dd459fb65986576ae3306f40968e2b2ac3c83202bb7"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.027026 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" 
event={"ID":"272dc0cc-9856-4375-b9f3-0ce1b543f30f","Type":"ContainerStarted","Data":"9159635aab2c0c4447c1faed9ea1d24d85ab668946fb0e2682962cafd724a2fb"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.041110 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" event={"ID":"59137370-0efe-42ca-82dc-f42ed6c69c37","Type":"ContainerStarted","Data":"6c3e6e3f54df7be299f89ada8e7fc587636096e12a36376fde8a62d905924b2d"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.042078 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.043554 4947 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-h5tqs container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.043626 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" podUID="59137370-0efe-42ca-82dc-f42ed6c69c37" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.076384 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.077695 4947 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.577681128 +0000 UTC m=+143.838635554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.080347 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p2j8p" podStartSLOduration=7.08032448 podStartE2EDuration="7.08032448s" podCreationTimestamp="2025-12-03 06:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.06375507 +0000 UTC m=+143.324709496" watchObservedRunningTime="2025-12-03 06:51:22.08032448 +0000 UTC m=+143.341278896" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.085076 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" event={"ID":"7dbf214b-d90a-41ff-8e46-694452ef42b9","Type":"ContainerStarted","Data":"54e03e0a5b89b65b7836561eda7daf7ae4e00d5381531c9e9e54d3109263bf0f"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.093833 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" podStartSLOduration=120.093818068 podStartE2EDuration="2m0.093818068s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.092748829 +0000 UTC m=+143.353703255" watchObservedRunningTime="2025-12-03 06:51:22.093818068 +0000 UTC m=+143.354772494" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.100936 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" event={"ID":"2963ae1e-2e90-492a-a52a-4ca04bf481cc","Type":"ContainerStarted","Data":"c5b0054b48449f43f9676e5539ed037ec52e0b218068cdfd095be0a254ac53c7"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.139121 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" event={"ID":"17696519-84ea-49b0-8a0a-ba85c071f427","Type":"ContainerStarted","Data":"d2bbdd53e5977ffe323efe145c89bba94381b801184806e220941fafd51dedd6"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.151969 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" event={"ID":"b9c0cc40-7389-4703-bf34-d42b6bf32710","Type":"ContainerStarted","Data":"dd375ed6a517a0278772857c23248a89db5440acd1cc67ddcbb73975412c7f98"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.179020 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.180410 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.680397915 +0000 UTC m=+143.941352341 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.182465 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-jl2zz" podStartSLOduration=120.18245008 podStartE2EDuration="2m0.18245008s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.135657857 +0000 UTC m=+143.396612283" watchObservedRunningTime="2025-12-03 06:51:22.18245008 +0000 UTC m=+143.443404506" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.182564 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" podStartSLOduration=120.182561633 podStartE2EDuration="2m0.182561633s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.180831486 +0000 UTC m=+143.441785912" watchObservedRunningTime="2025-12-03 06:51:22.182561633 +0000 UTC m=+143.443516059" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.183852 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="dabcbb38-cce7-404a-ab4b-aa81c4ceebb7" containerID="56f80a0bc857ad6b909936ddd427a35e37bf67190eba692e6a0f10339ced5eb9" exitCode=0 Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.183934 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" event={"ID":"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7","Type":"ContainerDied","Data":"56f80a0bc857ad6b909936ddd427a35e37bf67190eba692e6a0f10339ced5eb9"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.206670 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" event={"ID":"d8466264-98ee-4866-bce5-0cdc13613446","Type":"ContainerStarted","Data":"5e1cd10ebd1272cd3c293549ac0cedece12e83361dbbb46209afec145cfaadfd"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.222090 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" podStartSLOduration=121.222059419 podStartE2EDuration="2m1.222059419s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.220332961 +0000 UTC m=+143.481287387" watchObservedRunningTime="2025-12-03 06:51:22.222059419 +0000 UTC m=+143.483013845" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.278932 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" event={"ID":"749fbe4c-7c59-46bb-bdc0-f230bf72795c","Type":"ContainerStarted","Data":"f265f2f488a16fbf2b3d4ce5aae6ee634c6928091934eac0e4ee8939d1c86121"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.279009 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" 
event={"ID":"749fbe4c-7c59-46bb-bdc0-f230bf72795c","Type":"ContainerStarted","Data":"bb1629d02473fc695c6c026d05646730cebab821701449719fdc023af2989981"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.280528 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.284031 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.783973624 +0000 UTC m=+144.044928060 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.288902 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.296400 4947 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.796361871 +0000 UTC m=+144.057316297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.324549 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" event={"ID":"5afa1ed2-5ac6-49e9-bbcb-87bf7afaf3b0","Type":"ContainerStarted","Data":"bcf85d0061e0c8d6033dee8395f6d6ef9b561147e6673f7aa70ddd8abc83bc27"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.362595 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fncmn" podStartSLOduration=120.362580383 podStartE2EDuration="2m0.362580383s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.361239067 +0000 UTC m=+143.622193493" watchObservedRunningTime="2025-12-03 06:51:22.362580383 +0000 UTC m=+143.623534809" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.369832 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" 
event={"ID":"a84abb48-f81f-4b6d-b7f0-9b4b1467add5","Type":"ContainerStarted","Data":"e9978ab773681857dd2d70817aaa33db25719ef98f92d3607886e2a62c075fe0"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.370665 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.388806 4947 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-5d7cs container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.388868 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" podUID="a84abb48-f81f-4b6d-b7f0-9b4b1467add5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.390043 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.391155 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.89114103 +0000 UTC m=+144.152095456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.398962 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:22 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:22 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:22 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.399560 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.412292 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7kdr7" event={"ID":"611d9fbd-329b-4743-9e87-25c99feaf35d","Type":"ContainerStarted","Data":"baa461e59736911538fd17c1d1692d283d44ece620e8fc92c96a1fa4698dab34"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.434274 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" event={"ID":"5b695283-9aef-4604-b070-43d775a56972","Type":"ContainerStarted","Data":"53f23f282ff0d87080af685fd20d3ea25361fd98807c487cf05417bf06a03391"} Dec 03 06:51:22 crc 
kubenswrapper[4947]: I1203 06:51:22.434716 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.447642 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492" event={"ID":"8bfce152-31c1-4d42-b91e-acf33d4dee46","Type":"ContainerStarted","Data":"6940557c54f08211926e97823d335f4eaad7c2aadad5374b18c6edfce83a8288"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.456177 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dv6kr" podStartSLOduration=121.456155041 podStartE2EDuration="2m1.456155041s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.399840558 +0000 UTC m=+143.660794984" watchObservedRunningTime="2025-12-03 06:51:22.456155041 +0000 UTC m=+143.717109467" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.460192 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vbqcq" event={"ID":"26ec81fa-20d5-4bc6-bd9f-51da98daec36","Type":"ContainerStarted","Data":"4760796493d8871b4a1040de7ede8062bcf0f3871772752c2f8ca626335d11d8"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.474226 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" event={"ID":"77ab5990-c7ce-4016-bbad-88d7357e4bb5","Type":"ContainerStarted","Data":"15891825a12caeac94c0dd4b2ffe9135896c3fea8001b0083ba8bcd16f2e370c"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.487819 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-h6bsr" podStartSLOduration=121.487804732 podStartE2EDuration="2m1.487804732s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.457168099 +0000 UTC m=+143.718122535" watchObservedRunningTime="2025-12-03 06:51:22.487804732 +0000 UTC m=+143.748759158" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.489587 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7kdr7" podStartSLOduration=7.489580831 podStartE2EDuration="7.489580831s" podCreationTimestamp="2025-12-03 06:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.487159104 +0000 UTC m=+143.748113530" watchObservedRunningTime="2025-12-03 06:51:22.489580831 +0000 UTC m=+143.750535257" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.491529 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.492145 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:22.992129609 +0000 UTC m=+144.253084035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.496433 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" event={"ID":"5b257084-363a-41ed-9bc8-838592867c51","Type":"ContainerStarted","Data":"df7776808f2148295386e0ce68f44b1034a81bebd6482412d9966d63876f933c"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.496940 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.524890 4947 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6wmgx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.524937 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" podUID="5b257084-363a-41ed-9bc8-838592867c51" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.551041 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" 
event={"ID":"81985cd6-f5a4-46f3-9c71-448dc4f3bee6","Type":"ContainerStarted","Data":"4539f21ada6156d5f6528e40a622666cf903c84b0d910b718439bb7358ffa6f1"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.568774 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" podStartSLOduration=121.568755595 podStartE2EDuration="2m1.568755595s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.51272599 +0000 UTC m=+143.773680416" watchObservedRunningTime="2025-12-03 06:51:22.568755595 +0000 UTC m=+143.829710021" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.568972 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" podStartSLOduration=120.568967711 podStartE2EDuration="2m0.568967711s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.568434407 +0000 UTC m=+143.829388833" watchObservedRunningTime="2025-12-03 06:51:22.568967711 +0000 UTC m=+143.829922137" Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.597208 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.0971792 +0000 UTC m=+144.358133626 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.597244 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.597528 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.599209 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.099186464 +0000 UTC m=+144.360140890 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.623649 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" event={"ID":"28961044-3c25-4750-9703-1f19edee14cd","Type":"ContainerStarted","Data":"028a97e37fc849c6b9030eb16191cacb97bf072a84e3408bd927e7d74df7a371"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.625403 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" event={"ID":"2ce74b93-fe18-4974-ae87-f26e76ed39f8","Type":"ContainerStarted","Data":"f4aea3c7ff9f9b397656547bb1806f774d8af88cf2f6f89d557e5a7cba552f2c"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.625442 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" event={"ID":"2ce74b93-fe18-4974-ae87-f26e76ed39f8","Type":"ContainerStarted","Data":"ec247653763b949799a8f4722d1b66ee0d1ed15fe5b7ec8925d635db9d1dda7c"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.626072 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.644290 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ljd4p" podStartSLOduration=120.644270521 podStartE2EDuration="2m0.644270521s" 
podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.639270915 +0000 UTC m=+143.900225331" watchObservedRunningTime="2025-12-03 06:51:22.644270521 +0000 UTC m=+143.905224947" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.645161 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" podStartSLOduration=120.645154755 podStartE2EDuration="2m0.645154755s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.603925272 +0000 UTC m=+143.864879698" watchObservedRunningTime="2025-12-03 06:51:22.645154755 +0000 UTC m=+143.906109181" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.667340 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" podStartSLOduration=120.667325339 podStartE2EDuration="2m0.667325339s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.664948694 +0000 UTC m=+143.925903120" watchObservedRunningTime="2025-12-03 06:51:22.667325339 +0000 UTC m=+143.928279765" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.679030 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" event={"ID":"2e34af75-3f73-476f-ab59-40eff1841e44","Type":"ContainerStarted","Data":"2a680b8b92759b832bcafe596f9e7293c3fcb92cf95ba107dc8c301172b96247"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.690735 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" podStartSLOduration=121.690715795 podStartE2EDuration="2m1.690715795s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.690052857 +0000 UTC m=+143.951007283" watchObservedRunningTime="2025-12-03 06:51:22.690715795 +0000 UTC m=+143.951670221" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.698565 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.698703 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.198681172 +0000 UTC m=+144.459635588 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.699013 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.700477 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.20046919 +0000 UTC m=+144.461423616 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.704709 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" event={"ID":"a887f13f-644f-497d-9105-6afdb38753b4","Type":"ContainerStarted","Data":"0bc470f296e633f20cb6677cf85e75afea18e54fa8d9c0c4212055e77943f4ec"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.714094 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" event={"ID":"ad418cca-2ab3-4e04-89f0-b547ceae6aeb","Type":"ContainerStarted","Data":"9bda2bca549e01789723a066f5a1c6d57d3b1d8ff604646b51bcc8ef14e40199"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.714121 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" event={"ID":"ad418cca-2ab3-4e04-89f0-b547ceae6aeb","Type":"ContainerStarted","Data":"2d6b33237d0200344c66528147a97c07d69eb2bfe34ce5ecef355da02dbcf5d5"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.717403 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" podStartSLOduration=120.717392111 podStartE2EDuration="2m0.717392111s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.715243313 +0000 UTC m=+143.976197739" 
watchObservedRunningTime="2025-12-03 06:51:22.717392111 +0000 UTC m=+143.978346537" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.734056 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vpwl6" podStartSLOduration=120.734040214 podStartE2EDuration="2m0.734040214s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.731820234 +0000 UTC m=+143.992774660" watchObservedRunningTime="2025-12-03 06:51:22.734040214 +0000 UTC m=+143.994994640" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.745248 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" event={"ID":"e85f57d3-31a1-41a1-a6dc-28c9e6803d5b","Type":"ContainerStarted","Data":"47cde9eff0cac3ca88aafe160b3e57d5f7f0f84a836a98e99c392acd8c9fa8f0"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.756692 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" podStartSLOduration=120.75667603 podStartE2EDuration="2m0.75667603s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.754674426 +0000 UTC m=+144.015628852" watchObservedRunningTime="2025-12-03 06:51:22.75667603 +0000 UTC m=+144.017630456" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.761004 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" event={"ID":"0806f4cc-cda3-4c9e-a112-29d36cf3b596","Type":"ContainerStarted","Data":"d766cf4a94e259d630f7defd280540437ec0bd91fcdab5c7f2dd7261d403dcda"} Dec 03 
06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.763944 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" event={"ID":"d8aef45b-e02c-4a5e-a154-f4814776a263","Type":"ContainerStarted","Data":"22c44c8f208bb58341bc40e2283d70a62df0368fc0e719938cc2640be2dcbb57"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.767565 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" event={"ID":"eb688659-982c-44a4-8e59-b896de7a5e14","Type":"ContainerStarted","Data":"7b945982c132f53c47b62391558448d7cfa35fbfaeb1e0f9f979bfd8a1aa161d"} Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.800702 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.803086 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.303038133 +0000 UTC m=+144.563992559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.820040 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-qlvpr" podStartSLOduration=120.820022825 podStartE2EDuration="2m0.820022825s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.818597216 +0000 UTC m=+144.079551642" watchObservedRunningTime="2025-12-03 06:51:22.820022825 +0000 UTC m=+144.080977251" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.829886 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" podStartSLOduration=120.829867363 podStartE2EDuration="2m0.829867363s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.785822234 +0000 UTC m=+144.046776660" watchObservedRunningTime="2025-12-03 06:51:22.829867363 +0000 UTC m=+144.090821799" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.840157 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xnrcs" podStartSLOduration=121.840142402 podStartE2EDuration="2m1.840142402s" podCreationTimestamp="2025-12-03 
06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:22.838996301 +0000 UTC m=+144.099950717" watchObservedRunningTime="2025-12-03 06:51:22.840142402 +0000 UTC m=+144.101096828" Dec 03 06:51:22 crc kubenswrapper[4947]: I1203 06:51:22.902707 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:22 crc kubenswrapper[4947]: E1203 06:51:22.903010 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.402996963 +0000 UTC m=+144.663951389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.004171 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.004663 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.50462637 +0000 UTC m=+144.765580806 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.105441 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.105870 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.605851175 +0000 UTC m=+144.866805601 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.138225 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-56tl4" Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.206961 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.207189 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.707147532 +0000 UTC m=+144.968101958 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.207518 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.207982 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.707965235 +0000 UTC m=+144.968919661 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.309213 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.309457 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.809409545 +0000 UTC m=+145.070363971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.309871 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.310328 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.810320591 +0000 UTC m=+145.071275017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.396533 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:23 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:23 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:23 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.396809 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.410995 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.411174 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 06:51:23.911141175 +0000 UTC m=+145.172095601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.411242 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.411657 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:23.911647268 +0000 UTC m=+145.172601694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.512991 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.513277 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.013233014 +0000 UTC m=+145.274187430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.513628 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.514087 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.014061456 +0000 UTC m=+145.275016042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.614927 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.615169 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.115121217 +0000 UTC m=+145.376075653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.615386 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.615743 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.115726233 +0000 UTC m=+145.376680659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.717250 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.717536 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.217476003 +0000 UTC m=+145.478430429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.718024 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.718480 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.21846933 +0000 UTC m=+145.479423756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.789216 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492" event={"ID":"8bfce152-31c1-4d42-b91e-acf33d4dee46","Type":"ContainerStarted","Data":"814b0cea133f0ba9e5debba8274ebbbe5df2be1404bcc1d7bbed2abd69092ab6"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.789277 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492" event={"ID":"8bfce152-31c1-4d42-b91e-acf33d4dee46","Type":"ContainerStarted","Data":"0e4b91119545771f2d97b6671474306ea4b59ac5e16d4d81af08f87d2db3ed7e"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.796519 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" event={"ID":"eb688659-982c-44a4-8e59-b896de7a5e14","Type":"ContainerStarted","Data":"ad54b1ed7a3b055fb9f67dc06c42e442db69474f4450c8554833eb7528e206c6"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.799247 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" event={"ID":"a887f13f-644f-497d-9105-6afdb38753b4","Type":"ContainerStarted","Data":"0ccefaf3991a6fc23d44e6706f9e953a5c75302e440332ea389b1f7fcae87e77"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.804659 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" 
event={"ID":"dabcbb38-cce7-404a-ab4b-aa81c4ceebb7","Type":"ContainerStarted","Data":"ed6e82abf69f5605f007c57aa1b86f2fa2c084586e76d57643c94b7cf7c00e62"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.812411 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vbqcq" event={"ID":"26ec81fa-20d5-4bc6-bd9f-51da98daec36","Type":"ContainerStarted","Data":"42b8bc03e7f1d73e3afb0ceb25d720428d3ac73ed979d232a590798747471b03"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.812478 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vbqcq" event={"ID":"26ec81fa-20d5-4bc6-bd9f-51da98daec36","Type":"ContainerStarted","Data":"ef46b22fd37cd51ef1ffa6536b8c215433d042021ab98bb8be119ccbedfe636f"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.812627 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.818988 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.819207 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.31916198 +0000 UTC m=+145.580116406 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.819678 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.821831 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.321809893 +0000 UTC m=+145.582764319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.823119 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" event={"ID":"272dc0cc-9856-4375-b9f3-0ce1b543f30f","Type":"ContainerStarted","Data":"b07168d38bef2cb3e1f82f35d72b08536b627dc8ca630b23d3b1b3aa2c5c248c"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.838797 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qgl4w" event={"ID":"77ab5990-c7ce-4016-bbad-88d7357e4bb5","Type":"ContainerStarted","Data":"c7a9e4889d5fa79c25da5ab83cf630a4f31cac75d279cf77962d1169e6b1a49b"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.865172 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-vdtwv" event={"ID":"2e34af75-3f73-476f-ab59-40eff1841e44","Type":"ContainerStarted","Data":"1ace8507ed1eb3dd33f71d6217e86815392ef3363d5c0ae819950213e87ec373"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.897014 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" event={"ID":"b9c0cc40-7389-4703-bf34-d42b6bf32710","Type":"ContainerStarted","Data":"8f90dfa482cde9093d9723fd092a4ce4a48f644a83f95eea475576a5de33973c"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.926228 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:23 crc kubenswrapper[4947]: E1203 06:51:23.928067 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.428048365 +0000 UTC m=+145.689002791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.931170 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-jtch8" event={"ID":"28961044-3c25-4750-9703-1f19edee14cd","Type":"ContainerStarted","Data":"8de209b1795fc1b1e12b461bd3a1e616798d733d03fce74d0350490df63d9ce1"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.970899 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" event={"ID":"bcc537bc-9041-4e21-b05b-1d681df333d0","Type":"ContainerStarted","Data":"8610f57cf65b1b60589ac886b0d58760ee0755c7d7459a2e4fe8853b5d458a4b"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.971061 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" 
event={"ID":"bcc537bc-9041-4e21-b05b-1d681df333d0","Type":"ContainerStarted","Data":"69404d68ef6e61b72d1b15331b42f3ff3fa7a8585edeb8fcd01933f3258f9925"} Dec 03 06:51:23 crc kubenswrapper[4947]: I1203 06:51:23.998703 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" event={"ID":"7dbf214b-d90a-41ff-8e46-694452ef42b9","Type":"ContainerStarted","Data":"2e32026e18dcba968c3cf129968137f07d96696513697949f091cd8ccc3bc4ba"} Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.020246 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" event={"ID":"e85f57d3-31a1-41a1-a6dc-28c9e6803d5b","Type":"ContainerStarted","Data":"721aefadde0e87c1f046efaa5bf9a6400a57523ca04df5b00ec61b1738713c51"} Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.020978 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.032504 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:24 crc kubenswrapper[4947]: E1203 06:51:24.032937 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.532919809 +0000 UTC m=+145.793874235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.035983 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" podStartSLOduration=123.035970892 podStartE2EDuration="2m3.035970892s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.029128676 +0000 UTC m=+145.290083092" watchObservedRunningTime="2025-12-03 06:51:24.035970892 +0000 UTC m=+145.296925318" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.036152 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vm492" podStartSLOduration=122.036148157 podStartE2EDuration="2m2.036148157s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:23.897790211 +0000 UTC m=+145.158744637" watchObservedRunningTime="2025-12-03 06:51:24.036148157 +0000 UTC m=+145.297102573" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.044940 4947 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6wmgx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 
06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.044997 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" podUID="5b257084-363a-41ed-9bc8-838592867c51" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.045041 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vcl8p" event={"ID":"17696519-84ea-49b0-8a0a-ba85c071f427","Type":"ContainerStarted","Data":"26bcc2c5cf16fee6a0ba7fc76988fbe13cf2b20e5e5204dca1a65bdc5c2b37f5"} Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.067837 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-r8pc8" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.083970 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-5d7cs" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.105642 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" podStartSLOduration=122.105620138 podStartE2EDuration="2m2.105620138s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.097418615 +0000 UTC m=+145.358373041" watchObservedRunningTime="2025-12-03 06:51:24.105620138 +0000 UTC m=+145.366574564" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.133937 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:24 crc kubenswrapper[4947]: E1203 06:51:24.135802 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.635752398 +0000 UTC m=+145.896706824 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.188467 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-klrbv" podStartSLOduration=122.188450973 podStartE2EDuration="2m2.188450973s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.187847177 +0000 UTC m=+145.448801603" watchObservedRunningTime="2025-12-03 06:51:24.188450973 +0000 UTC m=+145.449405399" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.253925 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vbqcq" podStartSLOduration=10.253897754 podStartE2EDuration="10.253897754s" podCreationTimestamp="2025-12-03 06:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.249413142 +0000 UTC m=+145.510367568" watchObservedRunningTime="2025-12-03 06:51:24.253897754 +0000 UTC m=+145.514852180" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.269638 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:24 crc kubenswrapper[4947]: E1203 06:51:24.270031 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.770017623 +0000 UTC m=+146.030972049 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.372090 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:24 crc kubenswrapper[4947]: E1203 06:51:24.372418 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.872402479 +0000 UTC m=+146.133356905 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.410192 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:24 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:24 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:24 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.410256 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.410543 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rzd6v" podStartSLOduration=122.410517478 podStartE2EDuration="2m2.410517478s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.409231192 +0000 UTC m=+145.670185618" watchObservedRunningTime="2025-12-03 06:51:24.410517478 +0000 UTC m=+145.671471904" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.474168 4947 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:24 crc kubenswrapper[4947]: E1203 06:51:24.474845 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:51:24.974811008 +0000 UTC m=+146.235765434 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.496363 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vtmrf" podStartSLOduration=122.496332983 podStartE2EDuration="2m2.496332983s" podCreationTimestamp="2025-12-03 06:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.495230993 +0000 UTC m=+145.756185419" watchObservedRunningTime="2025-12-03 06:51:24.496332983 +0000 UTC m=+145.757287409" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.564604 4947 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 06:51:24 crc 
kubenswrapper[4947]: I1203 06:51:24.575726 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:24 crc kubenswrapper[4947]: E1203 06:51:24.576401 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:51:25.076369842 +0000 UTC m=+146.337324268 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.678117 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:24 crc kubenswrapper[4947]: E1203 06:51:24.678522 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 06:51:25.178478781 +0000 UTC m=+146.439433207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mh92n" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.712188 4947 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T06:51:24.564652153Z","Handler":null,"Name":""} Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.762587 4947 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.762664 4947 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.779033 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.803115 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" 
(OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.880407 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:24 crc kubenswrapper[4947]: I1203 06:51:24.914128 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-h5tqs" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.049876 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" event={"ID":"a887f13f-644f-497d-9105-6afdb38753b4","Type":"ContainerStarted","Data":"4bd9033b7c4869b21c1feba61e7c8aa0f077a0bdebcce2e3cdfeb83772f501c2"} Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.049940 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" event={"ID":"a887f13f-644f-497d-9105-6afdb38753b4","Type":"ContainerStarted","Data":"6dbdef91a0f136b118d40064918afc2db5c645fa5d03821df884658ed67b39ef"} Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.051274 4947 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6wmgx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.051328 4947 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" podUID="5b257084-363a-41ed-9bc8-838592867c51" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.27:8080/healthz\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.068911 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.068963 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.085604 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-cf2xq" podStartSLOduration=10.085585932 podStartE2EDuration="10.085585932s" podCreationTimestamp="2025-12-03 06:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:25.084610896 +0000 UTC m=+146.345565322" watchObservedRunningTime="2025-12-03 06:51:25.085585932 +0000 UTC m=+146.346540358" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.094581 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 06:51:25 crc 
kubenswrapper[4947]: I1203 06:51:25.118713 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2pgs5"] Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.119699 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.123307 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.137373 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2pgs5"] Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.189296 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-utilities\") pod \"certified-operators-2pgs5\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.189435 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-catalog-content\") pod \"certified-operators-2pgs5\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.189613 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdzfk\" (UniqueName: \"kubernetes.io/projected/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-kube-api-access-gdzfk\") pod \"certified-operators-2pgs5\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: 
I1203 06:51:25.221219 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mh92n\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.291649 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-catalog-content\") pod \"certified-operators-2pgs5\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.291762 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdzfk\" (UniqueName: \"kubernetes.io/projected/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-kube-api-access-gdzfk\") pod \"certified-operators-2pgs5\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.291835 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-utilities\") pod \"certified-operators-2pgs5\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.292321 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-catalog-content\") pod \"certified-operators-2pgs5\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 
crc kubenswrapper[4947]: I1203 06:51:25.292381 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-utilities\") pod \"certified-operators-2pgs5\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.311370 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7hsps"] Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.312340 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.315699 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.350638 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdzfk\" (UniqueName: \"kubernetes.io/projected/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-kube-api-access-gdzfk\") pod \"certified-operators-2pgs5\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.357673 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hsps"] Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.394112 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-catalog-content\") pod \"community-operators-7hsps\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.394187 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-utilities\") pod \"community-operators-7hsps\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.394389 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6jht\" (UniqueName: \"kubernetes.io/projected/4e5e8eae-11a8-4641-a62d-91c5a2051197-kube-api-access-v6jht\") pod \"community-operators-7hsps\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.397798 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:25 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:25 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:25 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.397852 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.418638 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.440960 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.500028 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-catalog-content\") pod \"community-operators-7hsps\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.500103 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-utilities\") pod \"community-operators-7hsps\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.500138 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6jht\" (UniqueName: \"kubernetes.io/projected/4e5e8eae-11a8-4641-a62d-91c5a2051197-kube-api-access-v6jht\") pod \"community-operators-7hsps\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.501196 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-catalog-content\") pod \"community-operators-7hsps\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.501196 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-utilities\") pod \"community-operators-7hsps\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " 
pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.532305 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6tlpx"] Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.533266 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.538298 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6jht\" (UniqueName: \"kubernetes.io/projected/4e5e8eae-11a8-4641-a62d-91c5a2051197-kube-api-access-v6jht\") pod \"community-operators-7hsps\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.552017 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tlpx"] Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.601089 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-catalog-content\") pod \"certified-operators-6tlpx\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.601189 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-utilities\") pod \"certified-operators-6tlpx\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.601216 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzl2\" 
(UniqueName: \"kubernetes.io/projected/265a1050-c380-437c-a931-f51f96b375d8-kube-api-access-pdzl2\") pod \"certified-operators-6tlpx\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.627222 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.703383 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-utilities\") pod \"certified-operators-6tlpx\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.703451 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzl2\" (UniqueName: \"kubernetes.io/projected/265a1050-c380-437c-a931-f51f96b375d8-kube-api-access-pdzl2\") pod \"certified-operators-6tlpx\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.703556 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-catalog-content\") pod \"certified-operators-6tlpx\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.704029 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-utilities\") pod \"certified-operators-6tlpx\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " pod="openshift-marketplace/certified-operators-6tlpx" 
Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.704052 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-catalog-content\") pod \"certified-operators-6tlpx\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.710143 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rwhg2"] Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.711282 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.724159 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwhg2"] Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.742552 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzl2\" (UniqueName: \"kubernetes.io/projected/265a1050-c380-437c-a931-f51f96b375d8-kube-api-access-pdzl2\") pod \"certified-operators-6tlpx\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.804736 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-utilities\") pod \"community-operators-rwhg2\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.804808 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjf4h\" (UniqueName: 
\"kubernetes.io/projected/2024f7dd-754a-4d90-9216-88c64dd07472-kube-api-access-bjf4h\") pod \"community-operators-rwhg2\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.804899 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-catalog-content\") pod \"community-operators-rwhg2\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.850714 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.905636 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.905737 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-catalog-content\") pod \"community-operators-rwhg2\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.905778 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-utilities\") pod \"community-operators-rwhg2\" (UID: 
\"2024f7dd-754a-4d90-9216-88c64dd07472\") " pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.905824 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjf4h\" (UniqueName: \"kubernetes.io/projected/2024f7dd-754a-4d90-9216-88c64dd07472-kube-api-access-bjf4h\") pod \"community-operators-rwhg2\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.906660 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-catalog-content\") pod \"community-operators-rwhg2\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.906901 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-utilities\") pod \"community-operators-rwhg2\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.914356 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.929948 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjf4h\" (UniqueName: \"kubernetes.io/projected/2024f7dd-754a-4d90-9216-88c64dd07472-kube-api-access-bjf4h\") pod 
\"community-operators-rwhg2\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:25 crc kubenswrapper[4947]: I1203 06:51:25.964835 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7hsps"] Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.010146 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.010266 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.010309 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.013062 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.013802 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.016751 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.016937 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.029552 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.069407 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hsps" event={"ID":"4e5e8eae-11a8-4641-a62d-91c5a2051197","Type":"ContainerStarted","Data":"1973191e2e53c996241301cf0bc07e8a901415fb532b79263c80744495a5e7dd"} Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.127705 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.240622 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6tlpx"] Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.301116 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.349645 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mh92n"] Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.386377 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2pgs5"] Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.389402 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:26 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:26 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:26 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.389458 4947 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.498146 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.499968 4947 patch_prober.go:28] interesting pod/console-f9d7485db-t2gnl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.500062 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-t2gnl" podUID="a274d4f7-f741-48cc-9c34-8e2805ad66e3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.500541 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.521152 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwhg2"] Dec 03 06:51:26 crc kubenswrapper[4947]: W1203 06:51:26.558753 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2024f7dd_754a_4d90_9216_88c64dd07472.slice/crio-5390b4acc2e769a2c2be04fc377808063af434cf97030702672f19401eea1f1e WatchSource:0}: Error finding container 5390b4acc2e769a2c2be04fc377808063af434cf97030702672f19401eea1f1e: Status 404 returned error can't find the container with id 5390b4acc2e769a2c2be04fc377808063af434cf97030702672f19401eea1f1e Dec 03 06:51:26 crc kubenswrapper[4947]: 
W1203 06:51:26.619897 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-8ba641d3e99390075d7b065ca73b9f58cd93156ca920acb972cd994c9b454306 WatchSource:0}: Error finding container 8ba641d3e99390075d7b065ca73b9f58cd93156ca920acb972cd994c9b454306: Status 404 returned error can't find the container with id 8ba641d3e99390075d7b065ca73b9f58cd93156ca920acb972cd994c9b454306 Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.771797 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-flxrp" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.847894 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.848249 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:26 crc kubenswrapper[4947]: I1203 06:51:26.858266 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.079908 4947 generic.go:334] "Generic (PLEG): container finished" podID="2024f7dd-754a-4d90-9216-88c64dd07472" containerID="c820f430cbcfd25b4bd4ef7a97b73a5b58a3bd1700f97aa37132fea53e96cd56" exitCode=0 Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.080012 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhg2" event={"ID":"2024f7dd-754a-4d90-9216-88c64dd07472","Type":"ContainerDied","Data":"c820f430cbcfd25b4bd4ef7a97b73a5b58a3bd1700f97aa37132fea53e96cd56"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.080043 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-rwhg2" event={"ID":"2024f7dd-754a-4d90-9216-88c64dd07472","Type":"ContainerStarted","Data":"5390b4acc2e769a2c2be04fc377808063af434cf97030702672f19401eea1f1e"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.082278 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.083127 4947 generic.go:334] "Generic (PLEG): container finished" podID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerID="0f5b33141844046206759cd726aa8dbdfbc53dbb5438677f79d856832d9b8e3f" exitCode=0 Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.089741 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pgs5" event={"ID":"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea","Type":"ContainerDied","Data":"0f5b33141844046206759cd726aa8dbdfbc53dbb5438677f79d856832d9b8e3f"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.089770 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pgs5" event={"ID":"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea","Type":"ContainerStarted","Data":"1cc0e37d6cf3b350f7a73cae46405b831ee06d5d4d67a5f7a1391177167c3b28"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.089779 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"165f7d6e3d945e1133a16311a70b825356cf84ed0a4efabefd1b2b033d1c964a"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.089789 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f28ecb2357c99c58309755d5aec40b89ee72d2fd90ca08ac6bb0a78010b72f01"} Dec 03 06:51:27 crc kubenswrapper[4947]: 
I1203 06:51:27.089797 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlpx" event={"ID":"265a1050-c380-437c-a931-f51f96b375d8","Type":"ContainerDied","Data":"feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.089785 4947 generic.go:334] "Generic (PLEG): container finished" podID="265a1050-c380-437c-a931-f51f96b375d8" containerID="feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a" exitCode=0 Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.089864 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlpx" event={"ID":"265a1050-c380-437c-a931-f51f96b375d8","Type":"ContainerStarted","Data":"ca5a76a15925b83697c1045f338de2f94375c017b83378a120697ca91a4d70f6"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.091678 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b8aa7a710e9c0c3f88521fcdc9790a59c2fd5f6de0e06e18927a68caaf7ed909"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.091737 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8ba641d3e99390075d7b065ca73b9f58cd93156ca920acb972cd994c9b454306"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.091917 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.093835 4947 generic.go:334] "Generic (PLEG): container finished" podID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerID="60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33" exitCode=0 Dec 03 06:51:27 crc 
kubenswrapper[4947]: I1203 06:51:27.093887 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hsps" event={"ID":"4e5e8eae-11a8-4641-a62d-91c5a2051197","Type":"ContainerDied","Data":"60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.097231 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" event={"ID":"9b82a6de-d75a-462f-9b68-105a28a52e28","Type":"ContainerStarted","Data":"6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.097276 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" event={"ID":"9b82a6de-d75a-462f-9b68-105a28a52e28","Type":"ContainerStarted","Data":"90575b3c1727a2dac265e2518074a65abe9fe7f16bfb7412d9d7db255b939bb9"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.097696 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.101445 4947 generic.go:334] "Generic (PLEG): container finished" podID="b9c0cc40-7389-4703-bf34-d42b6bf32710" containerID="8f90dfa482cde9093d9723fd092a4ce4a48f644a83f95eea475576a5de33973c" exitCode=0 Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.101522 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" event={"ID":"b9c0cc40-7389-4703-bf34-d42b6bf32710","Type":"ContainerDied","Data":"8f90dfa482cde9093d9723fd092a4ce4a48f644a83f95eea475576a5de33973c"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.103196 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2c62dec1e30c8693d9a984d36608b570268fb484513c0f290d3764a6ad146eb5"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.103255 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"18c11360e27a784a947b9ef5a1e9f40f4e996a83dd2500a7bf6ed6e7d50d3930"} Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.110710 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-bdlx5" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.157584 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" podStartSLOduration=126.157563281 podStartE2EDuration="2m6.157563281s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:27.153041967 +0000 UTC m=+148.413996403" watchObservedRunningTime="2025-12-03 06:51:27.157563281 +0000 UTC m=+148.418517717" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.326167 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bzgkn"] Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.328437 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.331451 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.333730 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzgkn"] Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.385354 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.396040 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:27 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:27 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:27 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.396084 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.447420 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-utilities\") pod \"redhat-marketplace-bzgkn\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.447722 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7lck\" (UniqueName: \"kubernetes.io/projected/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-kube-api-access-s7lck\") pod \"redhat-marketplace-bzgkn\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.448022 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-catalog-content\") pod \"redhat-marketplace-bzgkn\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.519632 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.519673 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.527295 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.549267 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-catalog-content\") pod \"redhat-marketplace-bzgkn\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.549618 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-utilities\") pod 
\"redhat-marketplace-bzgkn\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.549734 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7lck\" (UniqueName: \"kubernetes.io/projected/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-kube-api-access-s7lck\") pod \"redhat-marketplace-bzgkn\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.549950 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-catalog-content\") pod \"redhat-marketplace-bzgkn\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.550140 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-utilities\") pod \"redhat-marketplace-bzgkn\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.569453 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7lck\" (UniqueName: \"kubernetes.io/projected/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-kube-api-access-s7lck\") pod \"redhat-marketplace-bzgkn\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.657144 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.658028 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.661137 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.661209 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.664148 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.668339 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.711637 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cgrzs"] Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.712963 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.723549 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgrzs"] Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.752757 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-utilities\") pod \"redhat-marketplace-cgrzs\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.753077 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5592fa87-6029-4a74-a60e-b825455e559c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5592fa87-6029-4a74-a60e-b825455e559c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.753130 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmrrz\" (UniqueName: \"kubernetes.io/projected/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-kube-api-access-jmrrz\") pod \"redhat-marketplace-cgrzs\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.753157 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-catalog-content\") pod \"redhat-marketplace-cgrzs\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.753209 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5592fa87-6029-4a74-a60e-b825455e559c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5592fa87-6029-4a74-a60e-b825455e559c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.854259 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmrrz\" (UniqueName: \"kubernetes.io/projected/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-kube-api-access-jmrrz\") pod \"redhat-marketplace-cgrzs\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.854597 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-catalog-content\") pod \"redhat-marketplace-cgrzs\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.854642 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5592fa87-6029-4a74-a60e-b825455e559c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5592fa87-6029-4a74-a60e-b825455e559c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.854688 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-utilities\") pod \"redhat-marketplace-cgrzs\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.854706 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5592fa87-6029-4a74-a60e-b825455e559c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5592fa87-6029-4a74-a60e-b825455e559c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.854768 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5592fa87-6029-4a74-a60e-b825455e559c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5592fa87-6029-4a74-a60e-b825455e559c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.856009 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-catalog-content\") pod \"redhat-marketplace-cgrzs\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.856202 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-utilities\") pod \"redhat-marketplace-cgrzs\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.873353 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5592fa87-6029-4a74-a60e-b825455e559c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5592fa87-6029-4a74-a60e-b825455e559c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.882968 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-jmrrz\" (UniqueName: \"kubernetes.io/projected/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-kube-api-access-jmrrz\") pod \"redhat-marketplace-cgrzs\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:27 crc kubenswrapper[4947]: I1203 06:51:27.974827 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.041167 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.124422 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-t2pzb" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.177184 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzgkn"] Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.331045 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sf96j"] Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.332433 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.341836 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sf96j"] Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.341918 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.365269 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-catalog-content\") pod \"redhat-operators-sf96j\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.365314 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-utilities\") pod \"redhat-operators-sf96j\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.365354 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkhh7\" (UniqueName: \"kubernetes.io/projected/e0e18759-9a71-4927-b7d6-ded426d626fa-kube-api-access-jkhh7\") pod \"redhat-operators-sf96j\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.367312 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.367723 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.392967 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:28 crc kubenswrapper[4947]: [-]has-synced failed: reason withheld Dec 03 06:51:28 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:28 crc kubenswrapper[4947]: healthz check failed Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.393024 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:28 crc kubenswrapper[4947]: W1203 06:51:28.448235 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5592fa87_6029_4a74_a60e_b825455e559c.slice/crio-25c76c071a5abdf9b0be14e20699636e14cac2cdcda4d8bb6374e6db88860253 WatchSource:0}: Error finding container 25c76c071a5abdf9b0be14e20699636e14cac2cdcda4d8bb6374e6db88860253: Status 404 returned error can't find the container with id 25c76c071a5abdf9b0be14e20699636e14cac2cdcda4d8bb6374e6db88860253 Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.466658 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkhh7\" (UniqueName: \"kubernetes.io/projected/e0e18759-9a71-4927-b7d6-ded426d626fa-kube-api-access-jkhh7\") pod \"redhat-operators-sf96j\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.466754 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-catalog-content\") pod \"redhat-operators-sf96j\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.466780 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-utilities\") pod \"redhat-operators-sf96j\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.467418 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-utilities\") pod \"redhat-operators-sf96j\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.470968 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-catalog-content\") pod \"redhat-operators-sf96j\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.514910 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkhh7\" (UniqueName: \"kubernetes.io/projected/e0e18759-9a71-4927-b7d6-ded426d626fa-kube-api-access-jkhh7\") pod \"redhat-operators-sf96j\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.641662 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.671659 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9c0cc40-7389-4703-bf34-d42b6bf32710-secret-volume\") pod \"b9c0cc40-7389-4703-bf34-d42b6bf32710\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.671828 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9c0cc40-7389-4703-bf34-d42b6bf32710-config-volume\") pod \"b9c0cc40-7389-4703-bf34-d42b6bf32710\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.672423 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c0cc40-7389-4703-bf34-d42b6bf32710-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9c0cc40-7389-4703-bf34-d42b6bf32710" (UID: "b9c0cc40-7389-4703-bf34-d42b6bf32710"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.672474 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8nk6\" (UniqueName: \"kubernetes.io/projected/b9c0cc40-7389-4703-bf34-d42b6bf32710-kube-api-access-j8nk6\") pod \"b9c0cc40-7389-4703-bf34-d42b6bf32710\" (UID: \"b9c0cc40-7389-4703-bf34-d42b6bf32710\") " Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.673065 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9c0cc40-7389-4703-bf34-d42b6bf32710-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.681580 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.682552 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c0cc40-7389-4703-bf34-d42b6bf32710-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b9c0cc40-7389-4703-bf34-d42b6bf32710" (UID: "b9c0cc40-7389-4703-bf34-d42b6bf32710"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.683531 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c0cc40-7389-4703-bf34-d42b6bf32710-kube-api-access-j8nk6" (OuterVolumeSpecName: "kube-api-access-j8nk6") pod "b9c0cc40-7389-4703-bf34-d42b6bf32710" (UID: "b9c0cc40-7389-4703-bf34-d42b6bf32710"). InnerVolumeSpecName "kube-api-access-j8nk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.700459 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgrzs"] Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.731767 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jmpfq"] Dec 03 06:51:28 crc kubenswrapper[4947]: E1203 06:51:28.732750 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c0cc40-7389-4703-bf34-d42b6bf32710" containerName="collect-profiles" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.732771 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c0cc40-7389-4703-bf34-d42b6bf32710" containerName="collect-profiles" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.733692 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c0cc40-7389-4703-bf34-d42b6bf32710" containerName="collect-profiles" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.748384 4947 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jmpfq"] Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.748536 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.774994 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-catalog-content\") pod \"redhat-operators-jmpfq\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.775107 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-utilities\") pod \"redhat-operators-jmpfq\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.775138 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvk4\" (UniqueName: \"kubernetes.io/projected/a661b753-1589-4173-98ee-6da1511f42b0-kube-api-access-wkvk4\") pod \"redhat-operators-jmpfq\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.775196 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9c0cc40-7389-4703-bf34-d42b6bf32710-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.775224 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8nk6\" (UniqueName: 
\"kubernetes.io/projected/b9c0cc40-7389-4703-bf34-d42b6bf32710-kube-api-access-j8nk6\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.876431 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvk4\" (UniqueName: \"kubernetes.io/projected/a661b753-1589-4173-98ee-6da1511f42b0-kube-api-access-wkvk4\") pod \"redhat-operators-jmpfq\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.876508 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-catalog-content\") pod \"redhat-operators-jmpfq\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.876593 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-utilities\") pod \"redhat-operators-jmpfq\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.877156 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-utilities\") pod \"redhat-operators-jmpfq\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.877391 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-catalog-content\") pod \"redhat-operators-jmpfq\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " 
pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.894898 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvk4\" (UniqueName: \"kubernetes.io/projected/a661b753-1589-4173-98ee-6da1511f42b0-kube-api-access-wkvk4\") pod \"redhat-operators-jmpfq\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:28 crc kubenswrapper[4947]: I1203 06:51:28.933428 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sf96j"] Dec 03 06:51:28 crc kubenswrapper[4947]: W1203 06:51:28.953539 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e18759_9a71_4927_b7d6_ded426d626fa.slice/crio-2b9e8ce5eabc57cae4948783798c7d118da60d7a86f274176bfa847e141bbf3c WatchSource:0}: Error finding container 2b9e8ce5eabc57cae4948783798c7d118da60d7a86f274176bfa847e141bbf3c: Status 404 returned error can't find the container with id 2b9e8ce5eabc57cae4948783798c7d118da60d7a86f274176bfa847e141bbf3c Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.067211 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.132595 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sf96j" event={"ID":"e0e18759-9a71-4927-b7d6-ded426d626fa","Type":"ContainerStarted","Data":"2b9e8ce5eabc57cae4948783798c7d118da60d7a86f274176bfa847e141bbf3c"} Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.149838 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.150177 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p" event={"ID":"b9c0cc40-7389-4703-bf34-d42b6bf32710","Type":"ContainerDied","Data":"dd375ed6a517a0278772857c23248a89db5440acd1cc67ddcbb73975412c7f98"} Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.150218 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd375ed6a517a0278772857c23248a89db5440acd1cc67ddcbb73975412c7f98" Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.162478 4947 generic.go:334] "Generic (PLEG): container finished" podID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerID="d82dc13f19f1799ef01d726acef96f5cc8157a0ac4d004c2911733e709ca750f" exitCode=0 Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.162841 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzgkn" event={"ID":"bdf67ede-d6c7-4285-bef1-ef1fcc52c311","Type":"ContainerDied","Data":"d82dc13f19f1799ef01d726acef96f5cc8157a0ac4d004c2911733e709ca750f"} Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.162877 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzgkn" event={"ID":"bdf67ede-d6c7-4285-bef1-ef1fcc52c311","Type":"ContainerStarted","Data":"38672b8b5f603a0dbf80273e8c9eee4c0bdc7e2bfccecb2fbefe30de1ad2517c"} Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.165534 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5592fa87-6029-4a74-a60e-b825455e559c","Type":"ContainerStarted","Data":"25c76c071a5abdf9b0be14e20699636e14cac2cdcda4d8bb6374e6db88860253"} Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.169655 4947 generic.go:334] "Generic (PLEG): 
container finished" podID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerID="61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561" exitCode=0 Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.169728 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgrzs" event={"ID":"98edffce-fb3b-46ef-ad42-1b8d18d9ff47","Type":"ContainerDied","Data":"61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561"} Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.169792 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgrzs" event={"ID":"98edffce-fb3b-46ef-ad42-1b8d18d9ff47","Type":"ContainerStarted","Data":"fcd5a18b3de8a3edf5604036ca503c791fe6b91352e5f27e3c240ae3b8eeb162"} Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.331780 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jmpfq"] Dec 03 06:51:29 crc kubenswrapper[4947]: W1203 06:51:29.338717 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda661b753_1589_4173_98ee_6da1511f42b0.slice/crio-eae13a978f2f0daddc4068421f82b5bcb9be92c66886240fec9bfb7a2128d5fd WatchSource:0}: Error finding container eae13a978f2f0daddc4068421f82b5bcb9be92c66886240fec9bfb7a2128d5fd: Status 404 returned error can't find the container with id eae13a978f2f0daddc4068421f82b5bcb9be92c66886240fec9bfb7a2128d5fd Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.388610 4947 patch_prober.go:28] interesting pod/router-default-5444994796-xrkt7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:51:29 crc kubenswrapper[4947]: [+]has-synced ok Dec 03 06:51:29 crc kubenswrapper[4947]: [+]process-running ok Dec 03 06:51:29 crc kubenswrapper[4947]: healthz check 
failed Dec 03 06:51:29 crc kubenswrapper[4947]: I1203 06:51:29.389176 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xrkt7" podUID="8896266b-64eb-434e-b0cf-6a57510bf439" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.090303 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.090362 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.177974 4947 generic.go:334] "Generic (PLEG): container finished" podID="a661b753-1589-4173-98ee-6da1511f42b0" containerID="85c6751f6f9e4db28ace0958691d8c70bb9ad66df8a0fd34bf63579780ac9647" exitCode=0 Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.178083 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmpfq" event={"ID":"a661b753-1589-4173-98ee-6da1511f42b0","Type":"ContainerDied","Data":"85c6751f6f9e4db28ace0958691d8c70bb9ad66df8a0fd34bf63579780ac9647"} Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.178137 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmpfq" event={"ID":"a661b753-1589-4173-98ee-6da1511f42b0","Type":"ContainerStarted","Data":"eae13a978f2f0daddc4068421f82b5bcb9be92c66886240fec9bfb7a2128d5fd"} Dec 03 06:51:30 crc 
kubenswrapper[4947]: I1203 06:51:30.181460 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5592fa87-6029-4a74-a60e-b825455e559c","Type":"ContainerStarted","Data":"ccffbde305b8d738a945fb934d5b0780c3976d0ce3f87032f11cb68dd14dbd11"} Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.201288 4947 generic.go:334] "Generic (PLEG): container finished" podID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerID="5100387c0a32014d504db5fb4ea8382a392876bb9311a82db28b3e7fb91f007b" exitCode=0 Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.201328 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sf96j" event={"ID":"e0e18759-9a71-4927-b7d6-ded426d626fa","Type":"ContainerDied","Data":"5100387c0a32014d504db5fb4ea8382a392876bb9311a82db28b3e7fb91f007b"} Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.251174 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.251155996 podStartE2EDuration="3.251155996s" podCreationTimestamp="2025-12-03 06:51:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:30.242284865 +0000 UTC m=+151.503239291" watchObservedRunningTime="2025-12-03 06:51:30.251155996 +0000 UTC m=+151.512110422" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.390321 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.393815 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xrkt7" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.724955 4947 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.726428 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.730415 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.730929 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.747839 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.799719 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.799758 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.900273 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 
06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.900318 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.900390 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:51:30 crc kubenswrapper[4947]: I1203 06:51:30.933241 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:51:31 crc kubenswrapper[4947]: I1203 06:51:31.120141 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:51:31 crc kubenswrapper[4947]: I1203 06:51:31.216658 4947 generic.go:334] "Generic (PLEG): container finished" podID="5592fa87-6029-4a74-a60e-b825455e559c" containerID="ccffbde305b8d738a945fb934d5b0780c3976d0ce3f87032f11cb68dd14dbd11" exitCode=0 Dec 03 06:51:31 crc kubenswrapper[4947]: I1203 06:51:31.217728 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5592fa87-6029-4a74-a60e-b825455e559c","Type":"ContainerDied","Data":"ccffbde305b8d738a945fb934d5b0780c3976d0ce3f87032f11cb68dd14dbd11"} Dec 03 06:51:31 crc kubenswrapper[4947]: I1203 06:51:31.493037 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 06:51:32 crc kubenswrapper[4947]: I1203 06:51:32.233510 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d270c150-74b1-49d5-bd81-5f9ac04a5eeb","Type":"ContainerStarted","Data":"07b348f49ead74a49c3f56b81a8d69559668fc0ad465129df121b6a610322658"} Dec 03 06:51:32 crc kubenswrapper[4947]: I1203 06:51:32.512312 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:32 crc kubenswrapper[4947]: I1203 06:51:32.548609 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5592fa87-6029-4a74-a60e-b825455e559c-kubelet-dir\") pod \"5592fa87-6029-4a74-a60e-b825455e559c\" (UID: \"5592fa87-6029-4a74-a60e-b825455e559c\") " Dec 03 06:51:32 crc kubenswrapper[4947]: I1203 06:51:32.548712 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5592fa87-6029-4a74-a60e-b825455e559c-kube-api-access\") pod \"5592fa87-6029-4a74-a60e-b825455e559c\" (UID: \"5592fa87-6029-4a74-a60e-b825455e559c\") " Dec 03 06:51:32 crc kubenswrapper[4947]: I1203 06:51:32.549033 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5592fa87-6029-4a74-a60e-b825455e559c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5592fa87-6029-4a74-a60e-b825455e559c" (UID: "5592fa87-6029-4a74-a60e-b825455e559c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:51:32 crc kubenswrapper[4947]: I1203 06:51:32.556402 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5592fa87-6029-4a74-a60e-b825455e559c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5592fa87-6029-4a74-a60e-b825455e559c" (UID: "5592fa87-6029-4a74-a60e-b825455e559c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:32 crc kubenswrapper[4947]: I1203 06:51:32.650326 4947 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5592fa87-6029-4a74-a60e-b825455e559c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:32 crc kubenswrapper[4947]: I1203 06:51:32.650366 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5592fa87-6029-4a74-a60e-b825455e559c-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:33 crc kubenswrapper[4947]: I1203 06:51:33.137351 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vbqcq" Dec 03 06:51:33 crc kubenswrapper[4947]: I1203 06:51:33.242219 4947 generic.go:334] "Generic (PLEG): container finished" podID="d270c150-74b1-49d5-bd81-5f9ac04a5eeb" containerID="6a2e0de5148539715d43ce2bd19fcc391ba9f5991401853db537195db948cb7d" exitCode=0 Dec 03 06:51:33 crc kubenswrapper[4947]: I1203 06:51:33.242281 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d270c150-74b1-49d5-bd81-5f9ac04a5eeb","Type":"ContainerDied","Data":"6a2e0de5148539715d43ce2bd19fcc391ba9f5991401853db537195db948cb7d"} Dec 03 06:51:33 crc kubenswrapper[4947]: I1203 06:51:33.251355 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5592fa87-6029-4a74-a60e-b825455e559c","Type":"ContainerDied","Data":"25c76c071a5abdf9b0be14e20699636e14cac2cdcda4d8bb6374e6db88860253"} Dec 03 06:51:33 crc kubenswrapper[4947]: I1203 06:51:33.251421 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25c76c071a5abdf9b0be14e20699636e14cac2cdcda4d8bb6374e6db88860253" Dec 03 06:51:33 crc kubenswrapper[4947]: I1203 06:51:33.251534 4947 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:51:36 crc kubenswrapper[4947]: I1203 06:51:36.502014 4947 patch_prober.go:28] interesting pod/console-f9d7485db-t2gnl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Dec 03 06:51:36 crc kubenswrapper[4947]: I1203 06:51:36.503010 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-t2gnl" podUID="a274d4f7-f741-48cc-9c34-8e2805ad66e3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Dec 03 06:51:43 crc kubenswrapper[4947]: I1203 06:51:43.823154 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:51:43 crc kubenswrapper[4947]: I1203 06:51:43.884912 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kube-api-access\") pod \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\" (UID: \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\") " Dec 03 06:51:43 crc kubenswrapper[4947]: I1203 06:51:43.885016 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kubelet-dir\") pod \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\" (UID: \"d270c150-74b1-49d5-bd81-5f9ac04a5eeb\") " Dec 03 06:51:43 crc kubenswrapper[4947]: I1203 06:51:43.885141 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d270c150-74b1-49d5-bd81-5f9ac04a5eeb" (UID: 
"d270c150-74b1-49d5-bd81-5f9ac04a5eeb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:51:43 crc kubenswrapper[4947]: I1203 06:51:43.885266 4947 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:43 crc kubenswrapper[4947]: I1203 06:51:43.891859 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d270c150-74b1-49d5-bd81-5f9ac04a5eeb" (UID: "d270c150-74b1-49d5-bd81-5f9ac04a5eeb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:43 crc kubenswrapper[4947]: I1203 06:51:43.985932 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d270c150-74b1-49d5-bd81-5f9ac04a5eeb-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:44 crc kubenswrapper[4947]: I1203 06:51:44.491346 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 06:51:44 crc kubenswrapper[4947]: I1203 06:51:44.592024 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: \"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:44 crc kubenswrapper[4947]: I1203 06:51:44.597972 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8dd41826-cef5-42f7-8730-abc792b9337c-metrics-certs\") pod \"network-metrics-daemon-cz948\" (UID: 
\"8dd41826-cef5-42f7-8730-abc792b9337c\") " pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:44 crc kubenswrapper[4947]: I1203 06:51:44.710281 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"d270c150-74b1-49d5-bd81-5f9ac04a5eeb","Type":"ContainerDied","Data":"07b348f49ead74a49c3f56b81a8d69559668fc0ad465129df121b6a610322658"} Dec 03 06:51:44 crc kubenswrapper[4947]: I1203 06:51:44.710317 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07b348f49ead74a49c3f56b81a8d69559668fc0ad465129df121b6a610322658" Dec 03 06:51:44 crc kubenswrapper[4947]: I1203 06:51:44.710365 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:51:44 crc kubenswrapper[4947]: I1203 06:51:44.720238 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cz948" Dec 03 06:51:45 crc kubenswrapper[4947]: I1203 06:51:45.430024 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:51:46 crc kubenswrapper[4947]: I1203 06:51:46.853867 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:46 crc kubenswrapper[4947]: I1203 06:51:46.861988 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 06:51:51 crc kubenswrapper[4947]: E1203 06:51:51.385428 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:b45b4080e75db66dbb2f4d8403f29133c1829a6e7a5055752f4267aea3a23894: Get 
\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:b45b4080e75db66dbb2f4d8403f29133c1829a6e7a5055752f4267aea3a23894\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 06:51:51 crc kubenswrapper[4947]: E1203 06:51:51.386232 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s7lck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-bzgkn_openshift-marketplace(bdf67ede-d6c7-4285-bef1-ef1fcc52c311): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:b45b4080e75db66dbb2f4d8403f29133c1829a6e7a5055752f4267aea3a23894: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:b45b4080e75db66dbb2f4d8403f29133c1829a6e7a5055752f4267aea3a23894\": context canceled" logger="UnhandledError" Dec 03 06:51:51 crc kubenswrapper[4947]: E1203 06:51:51.387625 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:b45b4080e75db66dbb2f4d8403f29133c1829a6e7a5055752f4267aea3a23894: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:b45b4080e75db66dbb2f4d8403f29133c1829a6e7a5055752f4267aea3a23894\\\": context canceled\"" pod="openshift-marketplace/redhat-marketplace-bzgkn" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" Dec 03 06:51:52 crc kubenswrapper[4947]: E1203 06:51:52.943297 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bzgkn" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" Dec 03 06:51:53 crc kubenswrapper[4947]: I1203 06:51:53.473163 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cz948"] Dec 03 06:51:54 crc kubenswrapper[4947]: E1203 06:51:54.180221 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 03 06:51:54 crc kubenswrapper[4947]: E1203 
06:51:54.180708 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmrrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-cgrzs_openshift-marketplace(98edffce-fb3b-46ef-ad42-1b8d18d9ff47): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 06:51:54 crc kubenswrapper[4947]: E1203 06:51:54.181933 4947 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-cgrzs" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" Dec 03 06:51:56 crc kubenswrapper[4947]: I1203 06:51:56.305449 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:51:57 crc kubenswrapper[4947]: E1203 06:51:57.355791 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-cgrzs" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" Dec 03 06:51:57 crc kubenswrapper[4947]: W1203 06:51:57.360600 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dd41826_cef5_42f7_8730_abc792b9337c.slice/crio-2850e5b31d5268698a4abcf616949f916f043d935c60a7957799efcd116869d2 WatchSource:0}: Error finding container 2850e5b31d5268698a4abcf616949f916f043d935c60a7957799efcd116869d2: Status 404 returned error can't find the container with id 2850e5b31d5268698a4abcf616949f916f043d935c60a7957799efcd116869d2 Dec 03 06:51:57 crc kubenswrapper[4947]: I1203 06:51:57.807951 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cz948" event={"ID":"8dd41826-cef5-42f7-8730-abc792b9337c","Type":"ContainerStarted","Data":"5dec110fb1b04bb53f7c3d251ef0a437ddc4ffc41f8ee43b11282799bfa34835"} Dec 03 06:51:57 crc kubenswrapper[4947]: I1203 06:51:57.808323 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cz948" 
event={"ID":"8dd41826-cef5-42f7-8730-abc792b9337c","Type":"ContainerStarted","Data":"2850e5b31d5268698a4abcf616949f916f043d935c60a7957799efcd116869d2"} Dec 03 06:51:57 crc kubenswrapper[4947]: I1203 06:51:57.809775 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sf96j" event={"ID":"e0e18759-9a71-4927-b7d6-ded426d626fa","Type":"ContainerStarted","Data":"8b7e943c87580bf125b99141bd7d3ad82ab29e8ec60215a0d37fbb974aa1f82b"} Dec 03 06:51:57 crc kubenswrapper[4947]: I1203 06:51:57.817901 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlpx" event={"ID":"265a1050-c380-437c-a931-f51f96b375d8","Type":"ContainerStarted","Data":"b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9"} Dec 03 06:51:57 crc kubenswrapper[4947]: I1203 06:51:57.820555 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmpfq" event={"ID":"a661b753-1589-4173-98ee-6da1511f42b0","Type":"ContainerStarted","Data":"2948140792ccdbe43da90280ac4443bf21345545972b259538c49d9a65be3e22"} Dec 03 06:51:57 crc kubenswrapper[4947]: I1203 06:51:57.822363 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhg2" event={"ID":"2024f7dd-754a-4d90-9216-88c64dd07472","Type":"ContainerStarted","Data":"634b36e92040987bb086b0cd4e5be4bfb6265358cf1e2dd26e6c2070ddf03baf"} Dec 03 06:51:57 crc kubenswrapper[4947]: I1203 06:51:57.823748 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pgs5" event={"ID":"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea","Type":"ContainerStarted","Data":"d430978c174c1b13e5e608e98b2234f834bdb8df2bbadf2b72819ef52ba22423"} Dec 03 06:51:57 crc kubenswrapper[4947]: I1203 06:51:57.831642 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hsps" 
event={"ID":"4e5e8eae-11a8-4641-a62d-91c5a2051197","Type":"ContainerStarted","Data":"8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283"} Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.101545 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8wwl8" Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.839433 4947 generic.go:334] "Generic (PLEG): container finished" podID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerID="8b7e943c87580bf125b99141bd7d3ad82ab29e8ec60215a0d37fbb974aa1f82b" exitCode=0 Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.839674 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sf96j" event={"ID":"e0e18759-9a71-4927-b7d6-ded426d626fa","Type":"ContainerDied","Data":"8b7e943c87580bf125b99141bd7d3ad82ab29e8ec60215a0d37fbb974aa1f82b"} Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.844873 4947 generic.go:334] "Generic (PLEG): container finished" podID="265a1050-c380-437c-a931-f51f96b375d8" containerID="b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9" exitCode=0 Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.844948 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlpx" event={"ID":"265a1050-c380-437c-a931-f51f96b375d8","Type":"ContainerDied","Data":"b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9"} Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.847769 4947 generic.go:334] "Generic (PLEG): container finished" podID="a661b753-1589-4173-98ee-6da1511f42b0" containerID="2948140792ccdbe43da90280ac4443bf21345545972b259538c49d9a65be3e22" exitCode=0 Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.847852 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmpfq" 
event={"ID":"a661b753-1589-4173-98ee-6da1511f42b0","Type":"ContainerDied","Data":"2948140792ccdbe43da90280ac4443bf21345545972b259538c49d9a65be3e22"} Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.850357 4947 generic.go:334] "Generic (PLEG): container finished" podID="2024f7dd-754a-4d90-9216-88c64dd07472" containerID="634b36e92040987bb086b0cd4e5be4bfb6265358cf1e2dd26e6c2070ddf03baf" exitCode=0 Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.850399 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhg2" event={"ID":"2024f7dd-754a-4d90-9216-88c64dd07472","Type":"ContainerDied","Data":"634b36e92040987bb086b0cd4e5be4bfb6265358cf1e2dd26e6c2070ddf03baf"} Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.856575 4947 generic.go:334] "Generic (PLEG): container finished" podID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerID="d430978c174c1b13e5e608e98b2234f834bdb8df2bbadf2b72819ef52ba22423" exitCode=0 Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.856669 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pgs5" event={"ID":"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea","Type":"ContainerDied","Data":"d430978c174c1b13e5e608e98b2234f834bdb8df2bbadf2b72819ef52ba22423"} Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.859100 4947 generic.go:334] "Generic (PLEG): container finished" podID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerID="8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283" exitCode=0 Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.859152 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hsps" event={"ID":"4e5e8eae-11a8-4641-a62d-91c5a2051197","Type":"ContainerDied","Data":"8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283"} Dec 03 06:51:58 crc kubenswrapper[4947]: I1203 06:51:58.861860 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-cz948" event={"ID":"8dd41826-cef5-42f7-8730-abc792b9337c","Type":"ContainerStarted","Data":"0be34dda6eb37762d6246096a519b662e5fedd64f13a4045f92e40575716583f"} Dec 03 06:51:59 crc kubenswrapper[4947]: I1203 06:51:59.886435 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cz948" podStartSLOduration=158.886409024 podStartE2EDuration="2m38.886409024s" podCreationTimestamp="2025-12-03 06:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:59.880899634 +0000 UTC m=+181.141854060" watchObservedRunningTime="2025-12-03 06:51:59.886409024 +0000 UTC m=+181.147363460" Dec 03 06:52:00 crc kubenswrapper[4947]: I1203 06:52:00.086435 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:52:00 crc kubenswrapper[4947]: I1203 06:52:00.086530 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:52:01 crc kubenswrapper[4947]: I1203 06:52:01.878904 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhg2" event={"ID":"2024f7dd-754a-4d90-9216-88c64dd07472","Type":"ContainerStarted","Data":"81ebd64dbfb2d5d7cfd3aaf6a0bb11d3fb753a2699ca4f1e764a27bd2420ad6e"} Dec 03 06:52:01 crc kubenswrapper[4947]: I1203 06:52:01.881400 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-7hsps" event={"ID":"4e5e8eae-11a8-4641-a62d-91c5a2051197","Type":"ContainerStarted","Data":"ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9"} Dec 03 06:52:01 crc kubenswrapper[4947]: I1203 06:52:01.884713 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sf96j" event={"ID":"e0e18759-9a71-4927-b7d6-ded426d626fa","Type":"ContainerStarted","Data":"8ab7318dc2ae1d268fb0d7c03c18947f000ab77f56c22699e1d74460d2ba2baf"} Dec 03 06:52:01 crc kubenswrapper[4947]: I1203 06:52:01.907356 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rwhg2" podStartSLOduration=3.401319495 podStartE2EDuration="36.907329732s" podCreationTimestamp="2025-12-03 06:51:25 +0000 UTC" firstStartedPulling="2025-12-03 06:51:27.081954893 +0000 UTC m=+148.342909339" lastFinishedPulling="2025-12-03 06:52:00.58796515 +0000 UTC m=+181.848919576" observedRunningTime="2025-12-03 06:52:01.9035507 +0000 UTC m=+183.164505126" watchObservedRunningTime="2025-12-03 06:52:01.907329732 +0000 UTC m=+183.168284158" Dec 03 06:52:01 crc kubenswrapper[4947]: I1203 06:52:01.923400 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sf96j" podStartSLOduration=3.57643039 podStartE2EDuration="33.923367269s" podCreationTimestamp="2025-12-03 06:51:28 +0000 UTC" firstStartedPulling="2025-12-03 06:51:30.212675189 +0000 UTC m=+151.473629615" lastFinishedPulling="2025-12-03 06:52:00.559612058 +0000 UTC m=+181.820566494" observedRunningTime="2025-12-03 06:52:01.922061613 +0000 UTC m=+183.183016039" watchObservedRunningTime="2025-12-03 06:52:01.923367269 +0000 UTC m=+183.184321685" Dec 03 06:52:01 crc kubenswrapper[4947]: I1203 06:52:01.943642 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7hsps" 
podStartSLOduration=3.377009754 podStartE2EDuration="36.94361814s" podCreationTimestamp="2025-12-03 06:51:25 +0000 UTC" firstStartedPulling="2025-12-03 06:51:27.095032169 +0000 UTC m=+148.355986595" lastFinishedPulling="2025-12-03 06:52:00.661640555 +0000 UTC m=+181.922594981" observedRunningTime="2025-12-03 06:52:01.940927766 +0000 UTC m=+183.201882192" watchObservedRunningTime="2025-12-03 06:52:01.94361814 +0000 UTC m=+183.204572566" Dec 03 06:52:02 crc kubenswrapper[4947]: I1203 06:52:02.896131 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlpx" event={"ID":"265a1050-c380-437c-a931-f51f96b375d8","Type":"ContainerStarted","Data":"440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18"} Dec 03 06:52:02 crc kubenswrapper[4947]: I1203 06:52:02.898746 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmpfq" event={"ID":"a661b753-1589-4173-98ee-6da1511f42b0","Type":"ContainerStarted","Data":"97668cfdaf085bc0bcd3ee3d2da0a51814dbaa17146a26aa8b6e06a30312dbc2"} Dec 03 06:52:02 crc kubenswrapper[4947]: I1203 06:52:02.940819 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6tlpx" podStartSLOduration=2.991741217 podStartE2EDuration="37.940774682s" podCreationTimestamp="2025-12-03 06:51:25 +0000 UTC" firstStartedPulling="2025-12-03 06:51:27.090973778 +0000 UTC m=+148.351928224" lastFinishedPulling="2025-12-03 06:52:02.040007263 +0000 UTC m=+183.300961689" observedRunningTime="2025-12-03 06:52:02.92012476 +0000 UTC m=+184.181079186" watchObservedRunningTime="2025-12-03 06:52:02.940774682 +0000 UTC m=+184.201729108" Dec 03 06:52:02 crc kubenswrapper[4947]: I1203 06:52:02.942892 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jmpfq" podStartSLOduration=3.103206049 podStartE2EDuration="34.942885389s" 
podCreationTimestamp="2025-12-03 06:51:28 +0000 UTC" firstStartedPulling="2025-12-03 06:51:30.179868786 +0000 UTC m=+151.440823212" lastFinishedPulling="2025-12-03 06:52:02.019548126 +0000 UTC m=+183.280502552" observedRunningTime="2025-12-03 06:52:02.939299951 +0000 UTC m=+184.200254377" watchObservedRunningTime="2025-12-03 06:52:02.942885389 +0000 UTC m=+184.203839815" Dec 03 06:52:03 crc kubenswrapper[4947]: I1203 06:52:03.908130 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pgs5" event={"ID":"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea","Type":"ContainerStarted","Data":"c7a6caecdd000e9f1cebc541c913ca2eee6df4c068b80fce6f2fa830ff89981c"} Dec 03 06:52:03 crc kubenswrapper[4947]: I1203 06:52:03.933200 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2pgs5" podStartSLOduration=3.912846688 podStartE2EDuration="38.933175734s" podCreationTimestamp="2025-12-03 06:51:25 +0000 UTC" firstStartedPulling="2025-12-03 06:51:27.08554016 +0000 UTC m=+148.346494586" lastFinishedPulling="2025-12-03 06:52:02.105869206 +0000 UTC m=+183.366823632" observedRunningTime="2025-12-03 06:52:03.930096351 +0000 UTC m=+185.191050777" watchObservedRunningTime="2025-12-03 06:52:03.933175734 +0000 UTC m=+185.194130160" Dec 03 06:52:05 crc kubenswrapper[4947]: I1203 06:52:05.441995 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:52:05 crc kubenswrapper[4947]: I1203 06:52:05.442079 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:52:05 crc kubenswrapper[4947]: I1203 06:52:05.502325 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:52:05 crc kubenswrapper[4947]: I1203 06:52:05.628205 4947 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:52:05 crc kubenswrapper[4947]: I1203 06:52:05.628267 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:52:05 crc kubenswrapper[4947]: I1203 06:52:05.687383 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:52:05 crc kubenswrapper[4947]: I1203 06:52:05.851540 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:52:05 crc kubenswrapper[4947]: I1203 06:52:05.851658 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:52:05 crc kubenswrapper[4947]: I1203 06:52:05.896041 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:52:06 crc kubenswrapper[4947]: I1203 06:52:06.030249 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:52:06 crc kubenswrapper[4947]: I1203 06:52:06.030349 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:52:06 crc kubenswrapper[4947]: I1203 06:52:06.074332 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:52:06 crc kubenswrapper[4947]: I1203 06:52:06.249334 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fj5r4"] Dec 03 06:52:06 crc kubenswrapper[4947]: I1203 06:52:06.965072 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rwhg2" Dec 03 
06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.019588 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwhg2"] Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.537586 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 06:52:08 crc kubenswrapper[4947]: E1203 06:52:08.537863 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5592fa87-6029-4a74-a60e-b825455e559c" containerName="pruner" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.537876 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5592fa87-6029-4a74-a60e-b825455e559c" containerName="pruner" Dec 03 06:52:08 crc kubenswrapper[4947]: E1203 06:52:08.537886 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d270c150-74b1-49d5-bd81-5f9ac04a5eeb" containerName="pruner" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.537892 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d270c150-74b1-49d5-bd81-5f9ac04a5eeb" containerName="pruner" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.537995 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5592fa87-6029-4a74-a60e-b825455e559c" containerName="pruner" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.538008 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d270c150-74b1-49d5-bd81-5f9ac04a5eeb" containerName="pruner" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.538407 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.541524 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.541669 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.543716 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.633177 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22e26c61-0cbe-45f5-82e0-efb474365ce9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22e26c61-0cbe-45f5-82e0-efb474365ce9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.633272 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22e26c61-0cbe-45f5-82e0-efb474365ce9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22e26c61-0cbe-45f5-82e0-efb474365ce9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.681868 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.681920 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.733895 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/22e26c61-0cbe-45f5-82e0-efb474365ce9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22e26c61-0cbe-45f5-82e0-efb474365ce9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.734104 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22e26c61-0cbe-45f5-82e0-efb474365ce9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22e26c61-0cbe-45f5-82e0-efb474365ce9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.734158 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22e26c61-0cbe-45f5-82e0-efb474365ce9-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"22e26c61-0cbe-45f5-82e0-efb474365ce9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.734373 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.774421 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22e26c61-0cbe-45f5-82e0-efb474365ce9-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"22e26c61-0cbe-45f5-82e0-efb474365ce9\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.922601 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.932862 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rwhg2" podUID="2024f7dd-754a-4d90-9216-88c64dd07472" containerName="registry-server" containerID="cri-o://81ebd64dbfb2d5d7cfd3aaf6a0bb11d3fb753a2699ca4f1e764a27bd2420ad6e" gracePeriod=2 Dec 03 06:52:08 crc kubenswrapper[4947]: I1203 06:52:08.971149 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:52:09 crc kubenswrapper[4947]: I1203 06:52:09.067687 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:52:09 crc kubenswrapper[4947]: I1203 06:52:09.067756 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:52:09 crc kubenswrapper[4947]: I1203 06:52:09.110260 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:52:09 crc kubenswrapper[4947]: I1203 06:52:09.533805 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 06:52:09 crc kubenswrapper[4947]: W1203 06:52:09.550669 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod22e26c61_0cbe_45f5_82e0_efb474365ce9.slice/crio-cee77758bc74c445b7814bb5d197830600c18d10ad7651c4634b9bf5acfc8aa4 WatchSource:0}: Error finding container cee77758bc74c445b7814bb5d197830600c18d10ad7651c4634b9bf5acfc8aa4: Status 404 returned error can't find the container with id cee77758bc74c445b7814bb5d197830600c18d10ad7651c4634b9bf5acfc8aa4 Dec 03 06:52:09 crc kubenswrapper[4947]: I1203 06:52:09.939841 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22e26c61-0cbe-45f5-82e0-efb474365ce9","Type":"ContainerStarted","Data":"cee77758bc74c445b7814bb5d197830600c18d10ad7651c4634b9bf5acfc8aa4"} Dec 03 06:52:09 crc kubenswrapper[4947]: I1203 06:52:09.943153 4947 generic.go:334] "Generic (PLEG): container finished" podID="2024f7dd-754a-4d90-9216-88c64dd07472" containerID="81ebd64dbfb2d5d7cfd3aaf6a0bb11d3fb753a2699ca4f1e764a27bd2420ad6e" exitCode=0 Dec 03 06:52:09 crc kubenswrapper[4947]: I1203 06:52:09.943215 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhg2" event={"ID":"2024f7dd-754a-4d90-9216-88c64dd07472","Type":"ContainerDied","Data":"81ebd64dbfb2d5d7cfd3aaf6a0bb11d3fb753a2699ca4f1e764a27bd2420ad6e"} Dec 03 06:52:09 crc kubenswrapper[4947]: I1203 06:52:09.993069 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:52:10 crc kubenswrapper[4947]: I1203 06:52:10.949885 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22e26c61-0cbe-45f5-82e0-efb474365ce9","Type":"ContainerStarted","Data":"a9cb90c484a375a93dc46f056678d3dc745845eda541aa1f56d6e1981e0332cd"} Dec 03 06:52:10 crc kubenswrapper[4947]: I1203 06:52:10.966227 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.9661989010000003 podStartE2EDuration="2.966198901s" podCreationTimestamp="2025-12-03 06:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:10.966011676 +0000 UTC m=+192.226966102" watchObservedRunningTime="2025-12-03 06:52:10.966198901 +0000 UTC m=+192.227153317" Dec 03 06:52:11 crc kubenswrapper[4947]: I1203 06:52:11.958995 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="22e26c61-0cbe-45f5-82e0-efb474365ce9" containerID="a9cb90c484a375a93dc46f056678d3dc745845eda541aa1f56d6e1981e0332cd" exitCode=0 Dec 03 06:52:11 crc kubenswrapper[4947]: I1203 06:52:11.959075 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22e26c61-0cbe-45f5-82e0-efb474365ce9","Type":"ContainerDied","Data":"a9cb90c484a375a93dc46f056678d3dc745845eda541aa1f56d6e1981e0332cd"} Dec 03 06:52:12 crc kubenswrapper[4947]: I1203 06:52:12.416984 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jmpfq"] Dec 03 06:52:12 crc kubenswrapper[4947]: I1203 06:52:12.417235 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jmpfq" podUID="a661b753-1589-4173-98ee-6da1511f42b0" containerName="registry-server" containerID="cri-o://97668cfdaf085bc0bcd3ee3d2da0a51814dbaa17146a26aa8b6e06a30312dbc2" gracePeriod=2 Dec 03 06:52:12 crc kubenswrapper[4947]: I1203 06:52:12.969975 4947 generic.go:334] "Generic (PLEG): container finished" podID="a661b753-1589-4173-98ee-6da1511f42b0" containerID="97668cfdaf085bc0bcd3ee3d2da0a51814dbaa17146a26aa8b6e06a30312dbc2" exitCode=0 Dec 03 06:52:12 crc kubenswrapper[4947]: I1203 06:52:12.970051 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmpfq" event={"ID":"a661b753-1589-4173-98ee-6da1511f42b0","Type":"ContainerDied","Data":"97668cfdaf085bc0bcd3ee3d2da0a51814dbaa17146a26aa8b6e06a30312dbc2"} Dec 03 06:52:13 crc kubenswrapper[4947]: I1203 06:52:13.955242 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:52:13 crc kubenswrapper[4947]: I1203 06:52:13.966445 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:13 crc kubenswrapper[4947]: I1203 06:52:13.986025 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"22e26c61-0cbe-45f5-82e0-efb474365ce9","Type":"ContainerDied","Data":"cee77758bc74c445b7814bb5d197830600c18d10ad7651c4634b9bf5acfc8aa4"} Dec 03 06:52:13 crc kubenswrapper[4947]: I1203 06:52:13.986075 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cee77758bc74c445b7814bb5d197830600c18d10ad7651c4634b9bf5acfc8aa4" Dec 03 06:52:13 crc kubenswrapper[4947]: I1203 06:52:13.986146 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.008303 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwhg2" event={"ID":"2024f7dd-754a-4d90-9216-88c64dd07472","Type":"ContainerDied","Data":"5390b4acc2e769a2c2be04fc377808063af434cf97030702672f19401eea1f1e"} Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.008374 4947 scope.go:117] "RemoveContainer" containerID="81ebd64dbfb2d5d7cfd3aaf6a0bb11d3fb753a2699ca4f1e764a27bd2420ad6e" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.008416 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rwhg2" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.113228 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjf4h\" (UniqueName: \"kubernetes.io/projected/2024f7dd-754a-4d90-9216-88c64dd07472-kube-api-access-bjf4h\") pod \"2024f7dd-754a-4d90-9216-88c64dd07472\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.113274 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-catalog-content\") pod \"2024f7dd-754a-4d90-9216-88c64dd07472\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.113303 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-utilities\") pod \"2024f7dd-754a-4d90-9216-88c64dd07472\" (UID: \"2024f7dd-754a-4d90-9216-88c64dd07472\") " Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.113384 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22e26c61-0cbe-45f5-82e0-efb474365ce9-kubelet-dir\") pod \"22e26c61-0cbe-45f5-82e0-efb474365ce9\" (UID: \"22e26c61-0cbe-45f5-82e0-efb474365ce9\") " Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.113431 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22e26c61-0cbe-45f5-82e0-efb474365ce9-kube-api-access\") pod \"22e26c61-0cbe-45f5-82e0-efb474365ce9\" (UID: \"22e26c61-0cbe-45f5-82e0-efb474365ce9\") " Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.114535 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/22e26c61-0cbe-45f5-82e0-efb474365ce9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "22e26c61-0cbe-45f5-82e0-efb474365ce9" (UID: "22e26c61-0cbe-45f5-82e0-efb474365ce9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.115366 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-utilities" (OuterVolumeSpecName: "utilities") pod "2024f7dd-754a-4d90-9216-88c64dd07472" (UID: "2024f7dd-754a-4d90-9216-88c64dd07472"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.123888 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2024f7dd-754a-4d90-9216-88c64dd07472-kube-api-access-bjf4h" (OuterVolumeSpecName: "kube-api-access-bjf4h") pod "2024f7dd-754a-4d90-9216-88c64dd07472" (UID: "2024f7dd-754a-4d90-9216-88c64dd07472"). InnerVolumeSpecName "kube-api-access-bjf4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.129149 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22e26c61-0cbe-45f5-82e0-efb474365ce9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "22e26c61-0cbe-45f5-82e0-efb474365ce9" (UID: "22e26c61-0cbe-45f5-82e0-efb474365ce9"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.200408 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2024f7dd-754a-4d90-9216-88c64dd07472" (UID: "2024f7dd-754a-4d90-9216-88c64dd07472"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.215476 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjf4h\" (UniqueName: \"kubernetes.io/projected/2024f7dd-754a-4d90-9216-88c64dd07472-kube-api-access-bjf4h\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.215525 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.215536 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2024f7dd-754a-4d90-9216-88c64dd07472-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.215547 4947 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22e26c61-0cbe-45f5-82e0-efb474365ce9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.215557 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/22e26c61-0cbe-45f5-82e0-efb474365ce9-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.319044 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 06:52:14 crc kubenswrapper[4947]: E1203 06:52:14.319290 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2024f7dd-754a-4d90-9216-88c64dd07472" containerName="extract-utilities" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.319305 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2024f7dd-754a-4d90-9216-88c64dd07472" 
containerName="extract-utilities" Dec 03 06:52:14 crc kubenswrapper[4947]: E1203 06:52:14.319316 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2024f7dd-754a-4d90-9216-88c64dd07472" containerName="registry-server" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.319324 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2024f7dd-754a-4d90-9216-88c64dd07472" containerName="registry-server" Dec 03 06:52:14 crc kubenswrapper[4947]: E1203 06:52:14.319340 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e26c61-0cbe-45f5-82e0-efb474365ce9" containerName="pruner" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.319347 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e26c61-0cbe-45f5-82e0-efb474365ce9" containerName="pruner" Dec 03 06:52:14 crc kubenswrapper[4947]: E1203 06:52:14.319355 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2024f7dd-754a-4d90-9216-88c64dd07472" containerName="extract-content" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.319361 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2024f7dd-754a-4d90-9216-88c64dd07472" containerName="extract-content" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.319449 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2024f7dd-754a-4d90-9216-88c64dd07472" containerName="registry-server" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.319466 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="22e26c61-0cbe-45f5-82e0-efb474365ce9" containerName="pruner" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.319896 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.321363 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-var-lock\") pod \"installer-9-crc\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.321429 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.321461 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2791830-ab75-4754-bc30-77a083b090c4-kube-api-access\") pod \"installer-9-crc\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.322932 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.323339 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.339068 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.389069 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rwhg2"] Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 
06:52:14.392194 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rwhg2"] Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.422745 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-var-lock\") pod \"installer-9-crc\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.422802 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.422840 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2791830-ab75-4754-bc30-77a083b090c4-kube-api-access\") pod \"installer-9-crc\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.422927 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-var-lock\") pod \"installer-9-crc\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.422970 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc 
kubenswrapper[4947]: I1203 06:52:14.445324 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2791830-ab75-4754-bc30-77a083b090c4-kube-api-access\") pod \"installer-9-crc\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.554311 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.626930 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvk4\" (UniqueName: \"kubernetes.io/projected/a661b753-1589-4173-98ee-6da1511f42b0-kube-api-access-wkvk4\") pod \"a661b753-1589-4173-98ee-6da1511f42b0\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.627004 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-utilities\") pod \"a661b753-1589-4173-98ee-6da1511f42b0\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.627079 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-catalog-content\") pod \"a661b753-1589-4173-98ee-6da1511f42b0\" (UID: \"a661b753-1589-4173-98ee-6da1511f42b0\") " Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.627741 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-utilities" (OuterVolumeSpecName: "utilities") pod "a661b753-1589-4173-98ee-6da1511f42b0" (UID: "a661b753-1589-4173-98ee-6da1511f42b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.630746 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a661b753-1589-4173-98ee-6da1511f42b0-kube-api-access-wkvk4" (OuterVolumeSpecName: "kube-api-access-wkvk4") pod "a661b753-1589-4173-98ee-6da1511f42b0" (UID: "a661b753-1589-4173-98ee-6da1511f42b0"). InnerVolumeSpecName "kube-api-access-wkvk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.655199 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.728902 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvk4\" (UniqueName: \"kubernetes.io/projected/a661b753-1589-4173-98ee-6da1511f42b0-kube-api-access-wkvk4\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.728951 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.744456 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a661b753-1589-4173-98ee-6da1511f42b0" (UID: "a661b753-1589-4173-98ee-6da1511f42b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:14 crc kubenswrapper[4947]: I1203 06:52:14.830270 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a661b753-1589-4173-98ee-6da1511f42b0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.021113 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmpfq" event={"ID":"a661b753-1589-4173-98ee-6da1511f42b0","Type":"ContainerDied","Data":"eae13a978f2f0daddc4068421f82b5bcb9be92c66886240fec9bfb7a2128d5fd"} Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.021195 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jmpfq" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.071124 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jmpfq"] Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.076887 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jmpfq"] Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.092890 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2024f7dd-754a-4d90-9216-88c64dd07472" path="/var/lib/kubelet/pods/2024f7dd-754a-4d90-9216-88c64dd07472/volumes" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.093568 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a661b753-1589-4173-98ee-6da1511f42b0" path="/var/lib/kubelet/pods/a661b753-1589-4173-98ee-6da1511f42b0/volumes" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.448869 4947 scope.go:117] "RemoveContainer" containerID="634b36e92040987bb086b0cd4e5be4bfb6265358cf1e2dd26e6c2070ddf03baf" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.497936 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.650901 4947 scope.go:117] "RemoveContainer" containerID="c820f430cbcfd25b4bd4ef7a97b73a5b58a3bd1700f97aa37132fea53e96cd56" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.697680 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.897154 4947 scope.go:117] "RemoveContainer" containerID="97668cfdaf085bc0bcd3ee3d2da0a51814dbaa17146a26aa8b6e06a30312dbc2" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.900076 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.967461 4947 scope.go:117] "RemoveContainer" containerID="2948140792ccdbe43da90280ac4443bf21345545972b259538c49d9a65be3e22" Dec 03 06:52:15 crc kubenswrapper[4947]: I1203 06:52:15.992821 4947 scope.go:117] "RemoveContainer" containerID="85c6751f6f9e4db28ace0958691d8c70bb9ad66df8a0fd34bf63579780ac9647" Dec 03 06:52:16 crc kubenswrapper[4947]: I1203 06:52:16.411738 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 06:52:17 crc kubenswrapper[4947]: I1203 06:52:17.041994 4947 generic.go:334] "Generic (PLEG): container finished" podID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerID="74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5" exitCode=0 Dec 03 06:52:17 crc kubenswrapper[4947]: I1203 06:52:17.042086 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgrzs" event={"ID":"98edffce-fb3b-46ef-ad42-1b8d18d9ff47","Type":"ContainerDied","Data":"74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5"} Dec 03 06:52:17 crc kubenswrapper[4947]: I1203 06:52:17.045770 4947 generic.go:334] "Generic (PLEG): 
container finished" podID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerID="bde9ba56dbd278e980ca1996649c1ea2661876dd6c1b01ae03c1faef9cc20862" exitCode=0 Dec 03 06:52:17 crc kubenswrapper[4947]: I1203 06:52:17.045829 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzgkn" event={"ID":"bdf67ede-d6c7-4285-bef1-ef1fcc52c311","Type":"ContainerDied","Data":"bde9ba56dbd278e980ca1996649c1ea2661876dd6c1b01ae03c1faef9cc20862"} Dec 03 06:52:17 crc kubenswrapper[4947]: I1203 06:52:17.050692 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a2791830-ab75-4754-bc30-77a083b090c4","Type":"ContainerStarted","Data":"f07ee9e991aa1d12e2c38b1f62a9fea1a6bc085668f7e16cd1ec5a99e603810f"} Dec 03 06:52:17 crc kubenswrapper[4947]: I1203 06:52:17.050745 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a2791830-ab75-4754-bc30-77a083b090c4","Type":"ContainerStarted","Data":"ba5534a5a74b6d39f1ed0a7f6e3f1bd826b156ae91748bee161ded7828af8c9f"} Dec 03 06:52:17 crc kubenswrapper[4947]: I1203 06:52:17.095846 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.095823507 podStartE2EDuration="3.095823507s" podCreationTimestamp="2025-12-03 06:52:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:17.090090341 +0000 UTC m=+198.351044777" watchObservedRunningTime="2025-12-03 06:52:17.095823507 +0000 UTC m=+198.356777943" Dec 03 06:52:18 crc kubenswrapper[4947]: I1203 06:52:18.619077 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tlpx"] Dec 03 06:52:18 crc kubenswrapper[4947]: I1203 06:52:18.619934 4947 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-6tlpx" podUID="265a1050-c380-437c-a931-f51f96b375d8" containerName="registry-server" containerID="cri-o://440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18" gracePeriod=2 Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.023179 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.074281 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgrzs" event={"ID":"98edffce-fb3b-46ef-ad42-1b8d18d9ff47","Type":"ContainerStarted","Data":"8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a"} Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.085753 4947 generic.go:334] "Generic (PLEG): container finished" podID="265a1050-c380-437c-a931-f51f96b375d8" containerID="440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18" exitCode=0 Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.085873 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6tlpx" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.097097 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzgkn" event={"ID":"bdf67ede-d6c7-4285-bef1-ef1fcc52c311","Type":"ContainerStarted","Data":"be420c0830045aa05a1fde7568f85f58e969e3e8cf2c09751732f1813f0be182"} Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.097312 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlpx" event={"ID":"265a1050-c380-437c-a931-f51f96b375d8","Type":"ContainerDied","Data":"440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18"} Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.097418 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6tlpx" event={"ID":"265a1050-c380-437c-a931-f51f96b375d8","Type":"ContainerDied","Data":"ca5a76a15925b83697c1045f338de2f94375c017b83378a120697ca91a4d70f6"} Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.097548 4947 scope.go:117] "RemoveContainer" containerID="440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.125347 4947 scope.go:117] "RemoveContainer" containerID="b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.144656 4947 scope.go:117] "RemoveContainer" containerID="feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.151856 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cgrzs" podStartSLOduration=3.165523866 podStartE2EDuration="52.151830864s" podCreationTimestamp="2025-12-03 06:51:27 +0000 UTC" firstStartedPulling="2025-12-03 06:51:29.171330364 +0000 UTC m=+150.432284790" 
lastFinishedPulling="2025-12-03 06:52:18.157637342 +0000 UTC m=+199.418591788" observedRunningTime="2025-12-03 06:52:19.118042656 +0000 UTC m=+200.378997082" watchObservedRunningTime="2025-12-03 06:52:19.151830864 +0000 UTC m=+200.412785290" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.163906 4947 scope.go:117] "RemoveContainer" containerID="440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18" Dec 03 06:52:19 crc kubenswrapper[4947]: E1203 06:52:19.164507 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18\": container with ID starting with 440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18 not found: ID does not exist" containerID="440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.164668 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18"} err="failed to get container status \"440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18\": rpc error: code = NotFound desc = could not find container \"440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18\": container with ID starting with 440c14bd138b2253cad4edf7b6f59e1e1ff36926939bc9e79cfb90634463fb18 not found: ID does not exist" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.164812 4947 scope.go:117] "RemoveContainer" containerID="b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9" Dec 03 06:52:19 crc kubenswrapper[4947]: E1203 06:52:19.165383 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9\": container with ID starting with 
b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9 not found: ID does not exist" containerID="b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.165437 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9"} err="failed to get container status \"b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9\": rpc error: code = NotFound desc = could not find container \"b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9\": container with ID starting with b478a64b0c7f34ca8549a7cec3debe4fb777d0fc3ab32a312757e80961abdca9 not found: ID does not exist" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.165479 4947 scope.go:117] "RemoveContainer" containerID="feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a" Dec 03 06:52:19 crc kubenswrapper[4947]: E1203 06:52:19.166318 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a\": container with ID starting with feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a not found: ID does not exist" containerID="feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.166422 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a"} err="failed to get container status \"feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a\": rpc error: code = NotFound desc = could not find container \"feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a\": container with ID starting with feddc7e2ffa12020c4fe285deefd136ea16b1b87ba26bf4f9ec4df6dd39f153a not found: ID does not 
exist" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.195140 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdzl2\" (UniqueName: \"kubernetes.io/projected/265a1050-c380-437c-a931-f51f96b375d8-kube-api-access-pdzl2\") pod \"265a1050-c380-437c-a931-f51f96b375d8\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.195237 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-catalog-content\") pod \"265a1050-c380-437c-a931-f51f96b375d8\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.195299 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-utilities\") pod \"265a1050-c380-437c-a931-f51f96b375d8\" (UID: \"265a1050-c380-437c-a931-f51f96b375d8\") " Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.196727 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-utilities" (OuterVolumeSpecName: "utilities") pod "265a1050-c380-437c-a931-f51f96b375d8" (UID: "265a1050-c380-437c-a931-f51f96b375d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.204079 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265a1050-c380-437c-a931-f51f96b375d8-kube-api-access-pdzl2" (OuterVolumeSpecName: "kube-api-access-pdzl2") pod "265a1050-c380-437c-a931-f51f96b375d8" (UID: "265a1050-c380-437c-a931-f51f96b375d8"). InnerVolumeSpecName "kube-api-access-pdzl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.243286 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "265a1050-c380-437c-a931-f51f96b375d8" (UID: "265a1050-c380-437c-a931-f51f96b375d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.296359 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.296622 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdzl2\" (UniqueName: \"kubernetes.io/projected/265a1050-c380-437c-a931-f51f96b375d8-kube-api-access-pdzl2\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.296734 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/265a1050-c380-437c-a931-f51f96b375d8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.414385 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bzgkn" podStartSLOduration=3.374708312 podStartE2EDuration="52.414354328s" podCreationTimestamp="2025-12-03 06:51:27 +0000 UTC" firstStartedPulling="2025-12-03 06:51:29.173057621 +0000 UTC m=+150.434012047" lastFinishedPulling="2025-12-03 06:52:18.212703627 +0000 UTC m=+199.473658063" observedRunningTime="2025-12-03 06:52:19.152258315 +0000 UTC m=+200.413212761" watchObservedRunningTime="2025-12-03 06:52:19.414354328 +0000 UTC m=+200.675308774" Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 
06:52:19.416100 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6tlpx"] Dec 03 06:52:19 crc kubenswrapper[4947]: I1203 06:52:19.422030 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6tlpx"] Dec 03 06:52:21 crc kubenswrapper[4947]: I1203 06:52:21.098871 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265a1050-c380-437c-a931-f51f96b375d8" path="/var/lib/kubelet/pods/265a1050-c380-437c-a931-f51f96b375d8/volumes" Dec 03 06:52:27 crc kubenswrapper[4947]: I1203 06:52:27.669697 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:52:27 crc kubenswrapper[4947]: I1203 06:52:27.670483 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:52:27 crc kubenswrapper[4947]: I1203 06:52:27.731774 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:52:28 crc kubenswrapper[4947]: I1203 06:52:28.042018 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:52:28 crc kubenswrapper[4947]: I1203 06:52:28.042082 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:52:28 crc kubenswrapper[4947]: I1203 06:52:28.095689 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:52:28 crc kubenswrapper[4947]: I1203 06:52:28.199543 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:52:28 crc kubenswrapper[4947]: I1203 06:52:28.199716 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:52:29 crc kubenswrapper[4947]: I1203 06:52:29.219244 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgrzs"] Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.086199 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.086605 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.086663 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.087295 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.087393 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" 
containerID="cri-o://276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e" gracePeriod=600 Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.158029 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cgrzs" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerName="registry-server" containerID="cri-o://8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a" gracePeriod=2 Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.575593 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.655435 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-utilities\") pod \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.655601 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmrrz\" (UniqueName: \"kubernetes.io/projected/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-kube-api-access-jmrrz\") pod \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.656510 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-utilities" (OuterVolumeSpecName: "utilities") pod "98edffce-fb3b-46ef-ad42-1b8d18d9ff47" (UID: "98edffce-fb3b-46ef-ad42-1b8d18d9ff47"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.657087 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-catalog-content\") pod \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\" (UID: \"98edffce-fb3b-46ef-ad42-1b8d18d9ff47\") " Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.657363 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.674216 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98edffce-fb3b-46ef-ad42-1b8d18d9ff47" (UID: "98edffce-fb3b-46ef-ad42-1b8d18d9ff47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.675332 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-kube-api-access-jmrrz" (OuterVolumeSpecName: "kube-api-access-jmrrz") pod "98edffce-fb3b-46ef-ad42-1b8d18d9ff47" (UID: "98edffce-fb3b-46ef-ad42-1b8d18d9ff47"). InnerVolumeSpecName "kube-api-access-jmrrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.758333 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmrrz\" (UniqueName: \"kubernetes.io/projected/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-kube-api-access-jmrrz\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:30 crc kubenswrapper[4947]: I1203 06:52:30.758898 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98edffce-fb3b-46ef-ad42-1b8d18d9ff47-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.166300 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e" exitCode=0 Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.166383 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e"} Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.166417 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"5f17900e42556091147e4ba38edd46b7b9702c592fdecfb16bbf2c0cd119c24e"} Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.169253 4947 generic.go:334] "Generic (PLEG): container finished" podID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerID="8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a" exitCode=0 Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.169285 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgrzs" 
event={"ID":"98edffce-fb3b-46ef-ad42-1b8d18d9ff47","Type":"ContainerDied","Data":"8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a"} Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.169306 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cgrzs" event={"ID":"98edffce-fb3b-46ef-ad42-1b8d18d9ff47","Type":"ContainerDied","Data":"fcd5a18b3de8a3edf5604036ca503c791fe6b91352e5f27e3c240ae3b8eeb162"} Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.169327 4947 scope.go:117] "RemoveContainer" containerID="8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.169424 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cgrzs" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.205228 4947 scope.go:117] "RemoveContainer" containerID="74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.228870 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgrzs"] Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.233775 4947 scope.go:117] "RemoveContainer" containerID="61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.234342 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cgrzs"] Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.263592 4947 scope.go:117] "RemoveContainer" containerID="8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a" Dec 03 06:52:31 crc kubenswrapper[4947]: E1203 06:52:31.264330 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a\": container 
with ID starting with 8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a not found: ID does not exist" containerID="8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.264387 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a"} err="failed to get container status \"8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a\": rpc error: code = NotFound desc = could not find container \"8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a\": container with ID starting with 8cdac6ba9e413fc3f14d913caacdd1d7a42ee69c3c0ecc51f8e869594fd68f4a not found: ID does not exist" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.264434 4947 scope.go:117] "RemoveContainer" containerID="74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5" Dec 03 06:52:31 crc kubenswrapper[4947]: E1203 06:52:31.267672 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5\": container with ID starting with 74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5 not found: ID does not exist" containerID="74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.268199 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5"} err="failed to get container status \"74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5\": rpc error: code = NotFound desc = could not find container \"74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5\": container with ID starting with 74a8f8df660c56872a77ef0429eb4e9af52cb81012570c4e5636de2154d16eb5 not 
found: ID does not exist" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.268233 4947 scope.go:117] "RemoveContainer" containerID="61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561" Dec 03 06:52:31 crc kubenswrapper[4947]: E1203 06:52:31.273335 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561\": container with ID starting with 61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561 not found: ID does not exist" containerID="61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.273426 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561"} err="failed to get container status \"61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561\": rpc error: code = NotFound desc = could not find container \"61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561\": container with ID starting with 61074805991b90617e6fbcbdceeb9ce42fb88bafa18f505480853f4cc5019561 not found: ID does not exist" Dec 03 06:52:31 crc kubenswrapper[4947]: I1203 06:52:31.287815 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" podUID="8fa054f2-39f2-479b-915f-b2ceeab282d6" containerName="oauth-openshift" containerID="cri-o://8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967" gracePeriod=15 Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.086378 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.177072 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-provider-selection\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.177139 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-serving-cert\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.177944 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-idp-0-file-data\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.177996 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-error\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178024 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-trusted-ca-bundle\") pod 
\"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178050 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-service-ca\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178089 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-router-certs\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178112 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz9dj\" (UniqueName: \"kubernetes.io/projected/8fa054f2-39f2-479b-915f-b2ceeab282d6-kube-api-access-kz9dj\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178150 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-policies\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178178 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-ocp-branding-template\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 
06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178213 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-session\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178236 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-login\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178264 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-cliconfig\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178306 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-dir\") pod \"8fa054f2-39f2-479b-915f-b2ceeab282d6\" (UID: \"8fa054f2-39f2-479b-915f-b2ceeab282d6\") " Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.178630 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.181357 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.181649 4947 generic.go:334] "Generic (PLEG): container finished" podID="8fa054f2-39f2-479b-915f-b2ceeab282d6" containerID="8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967" exitCode=0 Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.181694 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" event={"ID":"8fa054f2-39f2-479b-915f-b2ceeab282d6","Type":"ContainerDied","Data":"8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967"} Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.181690 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.181729 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" event={"ID":"8fa054f2-39f2-479b-915f-b2ceeab282d6","Type":"ContainerDied","Data":"b8de4ef49fe8e8ef411375925bc60729f4a9b267984ef8e39cf34a3d8f2418a4"} Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.181993 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-fj5r4" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.181934 4947 scope.go:117] "RemoveContainer" containerID="8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.184882 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.185168 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.191389 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.191868 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.192407 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.192443 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa054f2-39f2-479b-915f-b2ceeab282d6-kube-api-access-kz9dj" (OuterVolumeSpecName: "kube-api-access-kz9dj") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "kube-api-access-kz9dj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.192892 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.193387 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.193604 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.193766 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.194479 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8fa054f2-39f2-479b-915f-b2ceeab282d6" (UID: "8fa054f2-39f2-479b-915f-b2ceeab282d6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.205708 4947 scope.go:117] "RemoveContainer" containerID="8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967" Dec 03 06:52:32 crc kubenswrapper[4947]: E1203 06:52:32.206297 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967\": container with ID starting with 8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967 not found: ID does not exist" containerID="8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.206350 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967"} err="failed to get container status \"8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967\": rpc error: code = NotFound desc = could not find container \"8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967\": container with ID starting with 8992e72de8cc2fbeeac1db6b51988c1fb58797dd468c8ea98addbc328084a967 not found: ID does not exist" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279614 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279669 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279684 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279694 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279706 4947 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279723 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279734 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279744 4947 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279754 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279764 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279776 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279786 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz9dj\" (UniqueName: \"kubernetes.io/projected/8fa054f2-39f2-479b-915f-b2ceeab282d6-kube-api-access-kz9dj\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279795 4947 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8fa054f2-39f2-479b-915f-b2ceeab282d6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.279805 4947 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8fa054f2-39f2-479b-915f-b2ceeab282d6-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 
06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.512057 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fj5r4"] Dec 03 06:52:32 crc kubenswrapper[4947]: I1203 06:52:32.516805 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-fj5r4"] Dec 03 06:52:33 crc kubenswrapper[4947]: I1203 06:52:33.098557 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa054f2-39f2-479b-915f-b2ceeab282d6" path="/var/lib/kubelet/pods/8fa054f2-39f2-479b-915f-b2ceeab282d6/volumes" Dec 03 06:52:33 crc kubenswrapper[4947]: I1203 06:52:33.099164 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" path="/var/lib/kubelet/pods/98edffce-fb3b-46ef-ad42-1b8d18d9ff47/volumes" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.837681 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5844cf768-s6mvg"] Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838676 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a661b753-1589-4173-98ee-6da1511f42b0" containerName="extract-content" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.838697 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a661b753-1589-4173-98ee-6da1511f42b0" containerName="extract-content" Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838743 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerName="extract-content" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.838754 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerName="extract-content" Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838774 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a661b753-1589-4173-98ee-6da1511f42b0" containerName="extract-utilities" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.838784 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a661b753-1589-4173-98ee-6da1511f42b0" containerName="extract-utilities" Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838797 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a661b753-1589-4173-98ee-6da1511f42b0" containerName="registry-server" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.838807 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a661b753-1589-4173-98ee-6da1511f42b0" containerName="registry-server" Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838820 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265a1050-c380-437c-a931-f51f96b375d8" containerName="extract-utilities" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.838829 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="265a1050-c380-437c-a931-f51f96b375d8" containerName="extract-utilities" Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838842 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa054f2-39f2-479b-915f-b2ceeab282d6" containerName="oauth-openshift" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.838852 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa054f2-39f2-479b-915f-b2ceeab282d6" containerName="oauth-openshift" Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838870 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerName="registry-server" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.838879 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerName="registry-server" Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838896 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerName="extract-utilities" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.838959 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerName="extract-utilities" Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838974 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265a1050-c380-437c-a931-f51f96b375d8" containerName="extract-content" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.838983 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="265a1050-c380-437c-a931-f51f96b375d8" containerName="extract-content" Dec 03 06:52:40 crc kubenswrapper[4947]: E1203 06:52:40.838998 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265a1050-c380-437c-a931-f51f96b375d8" containerName="registry-server" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.839008 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="265a1050-c380-437c-a931-f51f96b375d8" containerName="registry-server" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.839256 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="98edffce-fb3b-46ef-ad42-1b8d18d9ff47" containerName="registry-server" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.839269 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="265a1050-c380-437c-a931-f51f96b375d8" containerName="registry-server" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.839282 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a661b753-1589-4173-98ee-6da1511f42b0" containerName="registry-server" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.839321 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa054f2-39f2-479b-915f-b2ceeab282d6" containerName="oauth-openshift" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.840066 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.842805 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.843008 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.843201 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.843576 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.844575 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.845649 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.845981 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.846046 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.846453 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.848404 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 06:52:40 crc 
kubenswrapper[4947]: I1203 06:52:40.851046 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.862976 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.863735 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.868984 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.869592 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5844cf768-s6mvg"] Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.881937 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.901555 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-template-error\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.901645 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-template-login\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " 
pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.901712 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.901890 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902146 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-session\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902217 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-audit-policies\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902296 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902358 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vgc5\" (UniqueName: \"kubernetes.io/projected/6df2d0f0-5837-4c24-961c-f95f52fa9580-kube-api-access-4vgc5\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902423 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6df2d0f0-5837-4c24-961c-f95f52fa9580-audit-dir\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902466 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902560 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902666 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902761 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-service-ca\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:40 crc kubenswrapper[4947]: I1203 06:52:40.902802 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-router-certs\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004293 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-session\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: 
\"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004379 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-audit-policies\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004433 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004472 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vgc5\" (UniqueName: \"kubernetes.io/projected/6df2d0f0-5837-4c24-961c-f95f52fa9580-kube-api-access-4vgc5\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004545 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6df2d0f0-5837-4c24-961c-f95f52fa9580-audit-dir\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004586 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004630 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004679 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004759 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-service-ca\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004805 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-router-certs\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " 
pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004918 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-template-error\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004961 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.004998 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-template-login\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.005064 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.005934 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-service-ca\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.006724 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.006903 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6df2d0f0-5837-4c24-961c-f95f52fa9580-audit-dir\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.007579 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-audit-policies\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.010658 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc 
kubenswrapper[4947]: I1203 06:52:41.011778 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-template-error\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.012333 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.013656 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-router-certs\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.013719 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-session\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.019401 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-template-login\") pod 
\"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.020202 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.020792 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.022764 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6df2d0f0-5837-4c24-961c-f95f52fa9580-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.036880 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vgc5\" (UniqueName: \"kubernetes.io/projected/6df2d0f0-5837-4c24-961c-f95f52fa9580-kube-api-access-4vgc5\") pod \"oauth-openshift-5844cf768-s6mvg\" (UID: \"6df2d0f0-5837-4c24-961c-f95f52fa9580\") " pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.187530 4947 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:41 crc kubenswrapper[4947]: I1203 06:52:41.700701 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5844cf768-s6mvg"] Dec 03 06:52:41 crc kubenswrapper[4947]: W1203 06:52:41.707425 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6df2d0f0_5837_4c24_961c_f95f52fa9580.slice/crio-0a2cde1b904dfcd054cc58051a203d3c234696fbb95605d328ad22828cd285ba WatchSource:0}: Error finding container 0a2cde1b904dfcd054cc58051a203d3c234696fbb95605d328ad22828cd285ba: Status 404 returned error can't find the container with id 0a2cde1b904dfcd054cc58051a203d3c234696fbb95605d328ad22828cd285ba Dec 03 06:52:42 crc kubenswrapper[4947]: I1203 06:52:42.286566 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" event={"ID":"6df2d0f0-5837-4c24-961c-f95f52fa9580","Type":"ContainerStarted","Data":"0c33de48f68672ea780f4f818417ed23882089aad55334742e56a7aeaf055045"} Dec 03 06:52:42 crc kubenswrapper[4947]: I1203 06:52:42.286883 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" event={"ID":"6df2d0f0-5837-4c24-961c-f95f52fa9580","Type":"ContainerStarted","Data":"0a2cde1b904dfcd054cc58051a203d3c234696fbb95605d328ad22828cd285ba"} Dec 03 06:52:42 crc kubenswrapper[4947]: I1203 06:52:42.287554 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:42 crc kubenswrapper[4947]: I1203 06:52:42.314330 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" podStartSLOduration=36.314308945 podStartE2EDuration="36.314308945s" podCreationTimestamp="2025-12-03 
06:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:42.306550615 +0000 UTC m=+223.567505051" watchObservedRunningTime="2025-12-03 06:52:42.314308945 +0000 UTC m=+223.575263371" Dec 03 06:52:42 crc kubenswrapper[4947]: I1203 06:52:42.547894 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5844cf768-s6mvg" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.169663 4947 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.171178 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.211767 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.234316 4947 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.234642 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63" gracePeriod=15 Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.234693 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f" gracePeriod=15 Dec 03 06:52:55 
crc kubenswrapper[4947]: I1203 06:52:55.234700 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d" gracePeriod=15 Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.234771 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79" gracePeriod=15 Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.234761 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352" gracePeriod=15 Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.235842 4947 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:52:55 crc kubenswrapper[4947]: E1203 06:52:55.236166 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236183 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 06:52:55 crc kubenswrapper[4947]: E1203 06:52:55.236214 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236224 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 06:52:55 crc kubenswrapper[4947]: E1203 06:52:55.236235 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236242 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:52:55 crc kubenswrapper[4947]: E1203 06:52:55.236254 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236262 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 06:52:55 crc kubenswrapper[4947]: E1203 06:52:55.236272 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236292 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 06:52:55 crc kubenswrapper[4947]: E1203 06:52:55.236308 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236316 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 06:52:55 crc kubenswrapper[4947]: E1203 06:52:55.236330 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:52:55 crc 
kubenswrapper[4947]: I1203 06:52:55.236338 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236447 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236459 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236470 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236480 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236505 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.236513 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.315848 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.316219 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.316265 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.316308 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.316349 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.417884 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: 
I1203 06:52:55.417939 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.417963 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.417987 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.418032 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.418055 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:55 crc 
kubenswrapper[4947]: I1203 06:52:55.418070 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.418097 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.418176 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.418223 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.418245 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.418263 
4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.418284 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.502665 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.518850 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.518925 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.518944 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.519030 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.519025 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: I1203 06:52:55.519069 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:55 crc kubenswrapper[4947]: E1203 06:52:55.531832 4947 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187da205f8c1ad7f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:52:55.531081087 +0000 UTC m=+236.792035513,LastTimestamp:2025-12-03 06:52:55.531081087 +0000 UTC m=+236.792035513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.372874 4947 generic.go:334] "Generic (PLEG): container finished" podID="a2791830-ab75-4754-bc30-77a083b090c4" containerID="f07ee9e991aa1d12e2c38b1f62a9fea1a6bc085668f7e16cd1ec5a99e603810f" exitCode=0 Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.373012 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a2791830-ab75-4754-bc30-77a083b090c4","Type":"ContainerDied","Data":"f07ee9e991aa1d12e2c38b1f62a9fea1a6bc085668f7e16cd1ec5a99e603810f"} Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.374401 4947 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.374859 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.375157 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.375274 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd"} Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.375303 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"99f09f43810328dedf5a4b8500c49996c0bdeff68e092dfa5acf760b4b9dbd90"} Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.375909 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.376321 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.376821 4947 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.196:6443: connect: connection refused" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.377853 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.379267 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.379980 4947 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d" exitCode=0 Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.380084 4947 scope.go:117] "RemoveContainer" containerID="6ded7a83f28eca5b503130fa6ff242e1b49693ec2ff3b278efedd4e893e79336" Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.380118 4947 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352" exitCode=0 Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.380268 4947 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f" exitCode=0 Dec 03 06:52:56 crc kubenswrapper[4947]: I1203 06:52:56.380292 4947 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79" exitCode=2 Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.390232 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:52:57 crc kubenswrapper[4947]: 
I1203 06:52:57.630870 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.631889 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.632635 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.633017 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.633237 4947 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.667787 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.668359 4947 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.668701 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.669132 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.749318 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.749375 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.749518 4947 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.749536 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.749541 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.749586 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.750127 4947 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.750162 4947 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.750181 4947 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.851410 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-kubelet-dir\") pod \"a2791830-ab75-4754-bc30-77a083b090c4\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.851647 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a2791830-ab75-4754-bc30-77a083b090c4" (UID: "a2791830-ab75-4754-bc30-77a083b090c4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.851695 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2791830-ab75-4754-bc30-77a083b090c4-kube-api-access\") pod \"a2791830-ab75-4754-bc30-77a083b090c4\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.851868 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-var-lock\") pod \"a2791830-ab75-4754-bc30-77a083b090c4\" (UID: \"a2791830-ab75-4754-bc30-77a083b090c4\") " Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.852078 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-var-lock" (OuterVolumeSpecName: "var-lock") pod "a2791830-ab75-4754-bc30-77a083b090c4" (UID: "a2791830-ab75-4754-bc30-77a083b090c4"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.852364 4947 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.852409 4947 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a2791830-ab75-4754-bc30-77a083b090c4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.857590 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2791830-ab75-4754-bc30-77a083b090c4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a2791830-ab75-4754-bc30-77a083b090c4" (UID: "a2791830-ab75-4754-bc30-77a083b090c4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4947]: I1203 06:52:57.953254 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a2791830-ab75-4754-bc30-77a083b090c4-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.401367 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.403197 4947 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63" exitCode=0 Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.403381 4947 scope.go:117] "RemoveContainer" containerID="58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d" Dec 03 06:52:58 crc kubenswrapper[4947]: 
I1203 06:52:58.403508 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.406059 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a2791830-ab75-4754-bc30-77a083b090c4","Type":"ContainerDied","Data":"ba5534a5a74b6d39f1ed0a7f6e3f1bd826b156ae91748bee161ded7828af8c9f"} Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.406099 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba5534a5a74b6d39f1ed0a7f6e3f1bd826b156ae91748bee161ded7828af8c9f" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.406159 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.427556 4947 scope.go:117] "RemoveContainer" containerID="79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.429708 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.430513 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.430966 4947 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.437070 4947 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.437434 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.437782 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.446924 4947 scope.go:117] "RemoveContainer" containerID="d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.468938 4947 scope.go:117] "RemoveContainer" containerID="22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.483965 4947 scope.go:117] "RemoveContainer" containerID="aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63" Dec 03 
06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.499465 4947 scope.go:117] "RemoveContainer" containerID="75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.527104 4947 scope.go:117] "RemoveContainer" containerID="58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d" Dec 03 06:52:58 crc kubenswrapper[4947]: E1203 06:52:58.527660 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\": container with ID starting with 58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d not found: ID does not exist" containerID="58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.527836 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d"} err="failed to get container status \"58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\": rpc error: code = NotFound desc = could not find container \"58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d\": container with ID starting with 58591e34ca2a173c2bc74037defc01e1502b6e8bd9607ac267be90b05731179d not found: ID does not exist" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.527999 4947 scope.go:117] "RemoveContainer" containerID="79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352" Dec 03 06:52:58 crc kubenswrapper[4947]: E1203 06:52:58.529651 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\": container with ID starting with 79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352 not found: ID does not exist" 
containerID="79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.529944 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352"} err="failed to get container status \"79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\": rpc error: code = NotFound desc = could not find container \"79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352\": container with ID starting with 79a72d9bf7ec82243a736e22292797285dd8e4c75f09012f7ec6081090fc9352 not found: ID does not exist" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.530072 4947 scope.go:117] "RemoveContainer" containerID="d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f" Dec 03 06:52:58 crc kubenswrapper[4947]: E1203 06:52:58.530428 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\": container with ID starting with d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f not found: ID does not exist" containerID="d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.530559 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f"} err="failed to get container status \"d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\": rpc error: code = NotFound desc = could not find container \"d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f\": container with ID starting with d6ae242c6948f554a91dee66ba4181cb9905dc503e494b78606e5da84fcb2a8f not found: ID does not exist" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.530653 4947 scope.go:117] 
"RemoveContainer" containerID="22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79" Dec 03 06:52:58 crc kubenswrapper[4947]: E1203 06:52:58.531124 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\": container with ID starting with 22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79 not found: ID does not exist" containerID="22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.531229 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79"} err="failed to get container status \"22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\": rpc error: code = NotFound desc = could not find container \"22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79\": container with ID starting with 22c966a6cff8cb4be7d936ebf414acdcb9c1e2ce7b0665c698d8917212d34e79 not found: ID does not exist" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.531342 4947 scope.go:117] "RemoveContainer" containerID="aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63" Dec 03 06:52:58 crc kubenswrapper[4947]: E1203 06:52:58.532051 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\": container with ID starting with aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63 not found: ID does not exist" containerID="aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.532164 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63"} err="failed to get container status \"aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\": rpc error: code = NotFound desc = could not find container \"aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63\": container with ID starting with aab8ff1ffcabb3eebde01fba98e90bf294586d863d639a7b3a331878c492dc63 not found: ID does not exist" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.532286 4947 scope.go:117] "RemoveContainer" containerID="75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433" Dec 03 06:52:58 crc kubenswrapper[4947]: E1203 06:52:58.532659 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\": container with ID starting with 75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433 not found: ID does not exist" containerID="75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433" Dec 03 06:52:58 crc kubenswrapper[4947]: I1203 06:52:58.532771 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433"} err="failed to get container status \"75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\": rpc error: code = NotFound desc = could not find container \"75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433\": container with ID starting with 75a7e159e1cb1aeb3e984f8ad9a569bda4e9da5b1df5f68484dc743695f67433 not found: ID does not exist" Dec 03 06:52:59 crc kubenswrapper[4947]: I1203 06:52:59.086829 4947 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:59 crc kubenswrapper[4947]: I1203 06:52:59.087670 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:59 crc kubenswrapper[4947]: I1203 06:52:59.088067 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:52:59 crc kubenswrapper[4947]: I1203 06:52:59.089856 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 06:52:59 crc kubenswrapper[4947]: E1203 06:52:59.485977 4947 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187da205f8c1ad7f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:52:55.531081087 +0000 UTC m=+236.792035513,LastTimestamp:2025-12-03 06:52:55.531081087 +0000 UTC m=+236.792035513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:53:02 crc kubenswrapper[4947]: E1203 06:53:02.663932 4947 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:02 crc kubenswrapper[4947]: E1203 06:53:02.664465 4947 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:02 crc kubenswrapper[4947]: E1203 06:53:02.664935 4947 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:02 crc kubenswrapper[4947]: E1203 06:53:02.665210 4947 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:02 crc kubenswrapper[4947]: E1203 06:53:02.665559 4947 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:02 crc 
kubenswrapper[4947]: I1203 06:53:02.665599 4947 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 06:53:02 crc kubenswrapper[4947]: E1203 06:53:02.665830 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="200ms" Dec 03 06:53:02 crc kubenswrapper[4947]: E1203 06:53:02.867383 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="400ms" Dec 03 06:53:03 crc kubenswrapper[4947]: E1203 06:53:03.269123 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="800ms" Dec 03 06:53:04 crc kubenswrapper[4947]: E1203 06:53:04.069759 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="1.6s" Dec 03 06:53:05 crc kubenswrapper[4947]: E1203 06:53:05.670822 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="3.2s" Dec 03 06:53:08 crc kubenswrapper[4947]: I1203 06:53:08.082301 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:08 crc kubenswrapper[4947]: I1203 06:53:08.083909 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:08 crc kubenswrapper[4947]: I1203 06:53:08.084144 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:08 crc kubenswrapper[4947]: I1203 06:53:08.098941 4947 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:08 crc kubenswrapper[4947]: I1203 06:53:08.098973 4947 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:08 crc kubenswrapper[4947]: E1203 06:53:08.099271 4947 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:08 crc kubenswrapper[4947]: I1203 06:53:08.099750 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:08 crc kubenswrapper[4947]: I1203 06:53:08.466860 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"533e967263f02777d3796594246cff5d6624a698d79cdab1e0c5d8ef0de94d83"} Dec 03 06:53:08 crc kubenswrapper[4947]: E1203 06:53:08.872638 4947 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.196:6443: connect: connection refused" interval="6.4s" Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.088348 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.088646 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.089037 4947 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.475064 4947 generic.go:334] 
"Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="fd5b9bec1270f75c155f087328d797a4a6ff01f34eeb575c25017d6825f9fdff" exitCode=0 Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.475120 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"fd5b9bec1270f75c155f087328d797a4a6ff01f34eeb575c25017d6825f9fdff"} Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.475541 4947 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.475586 4947 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.475943 4947 status_manager.go:851] "Failed to get status for pod" podUID="a2791830-ab75-4754-bc30-77a083b090c4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:09 crc kubenswrapper[4947]: E1203 06:53:09.476140 4947 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.476441 4947 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.196:6443: connect: connection refused" Dec 03 06:53:09 crc kubenswrapper[4947]: I1203 06:53:09.476842 4947 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.196:6443: connect: connection refused" Dec 03 06:53:09 crc kubenswrapper[4947]: E1203 06:53:09.487247 4947 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.196:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187da205f8c1ad7f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:52:55.531081087 +0000 UTC m=+236.792035513,LastTimestamp:2025-12-03 06:52:55.531081087 +0000 UTC m=+236.792035513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:53:10 crc kubenswrapper[4947]: I1203 06:53:10.496174 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"981bcdfee0c7332c5e7e3de2652d0d6143bb16af1911a20755fdcf2f0436facd"} Dec 03 06:53:10 crc 
kubenswrapper[4947]: I1203 06:53:10.496541 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4df8d18457b151e8040d5a4a387bc0ad2366abf9ed0579466795ab817ba77835"} Dec 03 06:53:10 crc kubenswrapper[4947]: I1203 06:53:10.496554 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"39feadebbcc1a203900251df92ad339fe420d99e14a59b7f8292f755b9ff3de1"} Dec 03 06:53:10 crc kubenswrapper[4947]: I1203 06:53:10.499245 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 06:53:10 crc kubenswrapper[4947]: I1203 06:53:10.499300 4947 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8" exitCode=1 Dec 03 06:53:10 crc kubenswrapper[4947]: I1203 06:53:10.499332 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8"} Dec 03 06:53:10 crc kubenswrapper[4947]: I1203 06:53:10.499832 4947 scope.go:117] "RemoveContainer" containerID="197502c99b55a65383318dc75564160f64a7d81b6a405c1234d44ccd5a314cb8" Dec 03 06:53:11 crc kubenswrapper[4947]: I1203 06:53:11.507820 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"866b331cfb51023e05360c93a05264776758000d1f06c62d3dbe0cd69b2b3709"} Dec 03 06:53:11 crc kubenswrapper[4947]: I1203 
06:53:11.507864 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"38e36e84a93664baf3625f22b17aa3c184ef52f9fd85f05f055d82894b656466"} Dec 03 06:53:11 crc kubenswrapper[4947]: I1203 06:53:11.507969 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:11 crc kubenswrapper[4947]: I1203 06:53:11.508087 4947 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:11 crc kubenswrapper[4947]: I1203 06:53:11.508116 4947 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:11 crc kubenswrapper[4947]: I1203 06:53:11.510825 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 06:53:11 crc kubenswrapper[4947]: I1203 06:53:11.510873 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ccb0c25cdb16a99856b3f47ee0358e1b92c2aec68eb652bc24a7dc41bf894188"} Dec 03 06:53:13 crc kubenswrapper[4947]: I1203 06:53:13.100388 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:13 crc kubenswrapper[4947]: I1203 06:53:13.100851 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:13 crc kubenswrapper[4947]: I1203 06:53:13.109074 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:14 crc kubenswrapper[4947]: I1203 06:53:14.943603 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:53:14 crc kubenswrapper[4947]: I1203 06:53:14.954467 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:53:15 crc kubenswrapper[4947]: I1203 06:53:15.536635 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:53:16 crc kubenswrapper[4947]: I1203 06:53:16.520825 4947 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:16 crc kubenswrapper[4947]: I1203 06:53:16.546272 4947 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:16 crc kubenswrapper[4947]: I1203 06:53:16.546321 4947 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:16 crc kubenswrapper[4947]: I1203 06:53:16.551862 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:16 crc kubenswrapper[4947]: I1203 06:53:16.555112 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2bbf4efb-f1c9-4c81-8dab-874b3e18c524" Dec 03 06:53:17 crc kubenswrapper[4947]: I1203 06:53:17.549282 4947 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:17 crc kubenswrapper[4947]: I1203 
06:53:17.549808 4947 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:19 crc kubenswrapper[4947]: I1203 06:53:19.097131 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="2bbf4efb-f1c9-4c81-8dab-874b3e18c524" Dec 03 06:53:26 crc kubenswrapper[4947]: I1203 06:53:26.326038 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:53:26 crc kubenswrapper[4947]: I1203 06:53:26.727059 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 06:53:26 crc kubenswrapper[4947]: I1203 06:53:26.984355 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 06:53:27 crc kubenswrapper[4947]: I1203 06:53:27.019467 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 06:53:27 crc kubenswrapper[4947]: I1203 06:53:27.428452 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 06:53:27 crc kubenswrapper[4947]: I1203 06:53:27.610646 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 06:53:27 crc kubenswrapper[4947]: I1203 06:53:27.697251 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 06:53:27 crc kubenswrapper[4947]: I1203 06:53:27.741815 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 06:53:27 crc kubenswrapper[4947]: I1203 
06:53:27.759784 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 06:53:27 crc kubenswrapper[4947]: I1203 06:53:27.922866 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 06:53:27 crc kubenswrapper[4947]: I1203 06:53:27.991951 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.093174 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.153843 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.252167 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.304563 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.347161 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.406524 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.447897 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.658165 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 06:53:28 crc 
kubenswrapper[4947]: I1203 06:53:28.730443 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.735115 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.901650 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 06:53:28 crc kubenswrapper[4947]: I1203 06:53:28.942312 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.025093 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.067443 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.130554 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.285326 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.315134 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.382990 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.448404 4947 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.572603 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.572848 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.707541 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.712910 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.852410 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.893547 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.903605 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 06:53:29 crc kubenswrapper[4947]: I1203 06:53:29.922354 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.005879 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.030132 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.284979 4947 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.316706 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.422316 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.439261 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.506647 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.531195 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.647996 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.651789 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.734577 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.734703 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.761870 4947 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.850465 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.859311 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.874681 4947 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.928881 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 06:53:30 crc kubenswrapper[4947]: I1203 06:53:30.938202 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.120713 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.134214 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.209266 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.320828 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.337541 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 
06:53:31.399202 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.423603 4947 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.432763 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.489449 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.533355 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.533697 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.563033 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.571063 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.665464 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.714061 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.754771 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 
06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.850307 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.860054 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 06:53:31 crc kubenswrapper[4947]: I1203 06:53:31.967174 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.018351 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.114318 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.138309 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.149847 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.234457 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.307325 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.331047 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.331281 4947 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.428983 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.503989 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.511592 4947 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.534798 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.627791 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.650109 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.741198 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.810683 4947 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.814772 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.834849 4947 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.839952 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.840317 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.854811 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.856627 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 06:53:32 crc kubenswrapper[4947]: I1203 06:53:32.908960 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.008161 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.055608 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.056807 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.076691 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.080869 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.247599 
4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.303423 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.304104 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.540636 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.568136 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.601698 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.655894 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.780784 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.843803 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.903610 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 06:53:33 crc kubenswrapper[4947]: I1203 06:53:33.905428 4947 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 06:53:33 crc 
kubenswrapper[4947]: I1203 06:53:33.985534 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.004226 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.030041 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.083427 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.102312 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.122965 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.298082 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.305599 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.404086 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.489854 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.588435 4947 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.621530 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.631458 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.653648 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.672648 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.706206 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.725768 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.745439 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.830838 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 06:53:34 crc kubenswrapper[4947]: I1203 06:53:34.935179 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.203893 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 
06:53:35.253171 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.454095 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.498476 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.510945 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.556485 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.580394 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.675090 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.679772 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.807823 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.928366 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 06:53:35 crc kubenswrapper[4947]: I1203 06:53:35.940078 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 06:53:35 crc 
kubenswrapper[4947]: I1203 06:53:35.994660 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.069358 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.070956 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.144748 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.156274 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.195879 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.456102 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.482425 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.572210 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.776248 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.790430 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 06:53:36 crc 
kubenswrapper[4947]: I1203 06:53:36.817122 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.838723 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.906321 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.911943 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 06:53:36 crc kubenswrapper[4947]: I1203 06:53:36.974304 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.004397 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.036888 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.045366 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.068136 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.164291 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.211775 4947 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.225087 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.233176 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.315233 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.343546 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.353318 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.381256 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.455180 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.457414 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.539580 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.549918 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 
03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.555310 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.585129 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.606875 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.663304 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.685814 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.746089 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.773480 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.814085 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.871844 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 06:53:37 crc kubenswrapper[4947]: I1203 06:53:37.877057 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.101766 4947 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.182236 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.284520 4947 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.289084 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=43.289062788 podStartE2EDuration="43.289062788s" podCreationTimestamp="2025-12-03 06:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:16.399680319 +0000 UTC m=+257.660634745" watchObservedRunningTime="2025-12-03 06:53:38.289062788 +0000 UTC m=+279.550017214" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.290120 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.290170 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.290905 4947 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.290964 4947 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="eefa93f4-4ae4-4da7-ba41-c19097fb2352" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.296099 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:53:38 crc 
kubenswrapper[4947]: I1203 06:53:38.298935 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.311162 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.311139639 podStartE2EDuration="22.311139639s" podCreationTimestamp="2025-12-03 06:53:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:38.308315814 +0000 UTC m=+279.569270250" watchObservedRunningTime="2025-12-03 06:53:38.311139639 +0000 UTC m=+279.572094075" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.329704 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.362258 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.426736 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.469220 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.523538 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.560913 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.595829 4947 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.695598 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.720224 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.827423 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.866293 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.917838 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.930141 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 06:53:38 crc kubenswrapper[4947]: I1203 06:53:38.939250 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.002682 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.007580 4947 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.007869 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" containerID="cri-o://523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd" gracePeriod=5 Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.141696 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.220068 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.292593 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.306327 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.405617 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.456123 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.477215 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.554115 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.707237 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 06:53:39 crc kubenswrapper[4947]: I1203 06:53:39.708509 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 06:53:39 
crc kubenswrapper[4947]: I1203 06:53:39.719319 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.028208 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.102430 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.178616 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.310736 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.315519 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.376329 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.567733 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.671166 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.733531 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.814374 4947 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.926711 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 06:53:40 crc kubenswrapper[4947]: I1203 06:53:40.959115 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.037675 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.060342 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.137262 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.237882 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2pgs5"] Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.238279 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2pgs5" podUID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerName="registry-server" containerID="cri-o://c7a6caecdd000e9f1cebc541c913ca2eee6df4c068b80fce6f2fa830ff89981c" gracePeriod=30 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.248605 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hsps"] Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.248905 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7hsps" podUID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerName="registry-server" 
containerID="cri-o://ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9" gracePeriod=30 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.291012 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.291242 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.296792 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wmgx"] Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.297639 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" podUID="5b257084-363a-41ed-9bc8-838592867c51" containerName="marketplace-operator" containerID="cri-o://df7776808f2148295386e0ce68f44b1034a81bebd6482412d9966d63876f933c" gracePeriod=30 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.309156 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzgkn"] Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.309437 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bzgkn" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerName="registry-server" containerID="cri-o://be420c0830045aa05a1fde7568f85f58e969e3e8cf2c09751732f1813f0be182" gracePeriod=30 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.313665 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sf96j"] Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.313870 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sf96j" 
podUID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerName="registry-server" containerID="cri-o://8ab7318dc2ae1d268fb0d7c03c18947f000ab77f56c22699e1d74460d2ba2baf" gracePeriod=30 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.341095 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.395300 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.427040 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.476470 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.506916 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.650031 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.722285 4947 generic.go:334] "Generic (PLEG): container finished" podID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerID="ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9" exitCode=0 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.722377 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hsps" event={"ID":"4e5e8eae-11a8-4641-a62d-91c5a2051197","Type":"ContainerDied","Data":"ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9"} Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.722405 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7hsps" event={"ID":"4e5e8eae-11a8-4641-a62d-91c5a2051197","Type":"ContainerDied","Data":"1973191e2e53c996241301cf0bc07e8a901415fb532b79263c80744495a5e7dd"} Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.722425 4947 scope.go:117] "RemoveContainer" containerID="ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.722606 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7hsps" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.728819 4947 generic.go:334] "Generic (PLEG): container finished" podID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerID="8ab7318dc2ae1d268fb0d7c03c18947f000ab77f56c22699e1d74460d2ba2baf" exitCode=0 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.728983 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sf96j" event={"ID":"e0e18759-9a71-4927-b7d6-ded426d626fa","Type":"ContainerDied","Data":"8ab7318dc2ae1d268fb0d7c03c18947f000ab77f56c22699e1d74460d2ba2baf"} Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.729765 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.730872 4947 generic.go:334] "Generic (PLEG): container finished" podID="5b257084-363a-41ed-9bc8-838592867c51" containerID="df7776808f2148295386e0ce68f44b1034a81bebd6482412d9966d63876f933c" exitCode=0 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.730943 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" event={"ID":"5b257084-363a-41ed-9bc8-838592867c51","Type":"ContainerDied","Data":"df7776808f2148295386e0ce68f44b1034a81bebd6482412d9966d63876f933c"} Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.733850 4947 generic.go:334] "Generic (PLEG): container finished" podID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerID="be420c0830045aa05a1fde7568f85f58e969e3e8cf2c09751732f1813f0be182" exitCode=0 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.733898 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzgkn" 
event={"ID":"bdf67ede-d6c7-4285-bef1-ef1fcc52c311","Type":"ContainerDied","Data":"be420c0830045aa05a1fde7568f85f58e969e3e8cf2c09751732f1813f0be182"} Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.734310 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.736402 4947 generic.go:334] "Generic (PLEG): container finished" podID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerID="c7a6caecdd000e9f1cebc541c913ca2eee6df4c068b80fce6f2fa830ff89981c" exitCode=0 Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.736424 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pgs5" event={"ID":"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea","Type":"ContainerDied","Data":"c7a6caecdd000e9f1cebc541c913ca2eee6df4c068b80fce6f2fa830ff89981c"} Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.737720 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.742391 4947 scope.go:117] "RemoveContainer" containerID="8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.777317 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.778022 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.784748 4947 scope.go:117] "RemoveContainer" containerID="60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.797178 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6jht\" (UniqueName: \"kubernetes.io/projected/4e5e8eae-11a8-4641-a62d-91c5a2051197-kube-api-access-v6jht\") pod \"4e5e8eae-11a8-4641-a62d-91c5a2051197\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.797240 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-utilities\") pod \"4e5e8eae-11a8-4641-a62d-91c5a2051197\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.797326 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-catalog-content\") pod \"4e5e8eae-11a8-4641-a62d-91c5a2051197\" (UID: \"4e5e8eae-11a8-4641-a62d-91c5a2051197\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.801213 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-utilities" (OuterVolumeSpecName: "utilities") pod "4e5e8eae-11a8-4641-a62d-91c5a2051197" (UID: "4e5e8eae-11a8-4641-a62d-91c5a2051197"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.832261 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.851288 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e5e8eae-11a8-4641-a62d-91c5a2051197" (UID: "4e5e8eae-11a8-4641-a62d-91c5a2051197"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.867382 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5e8eae-11a8-4641-a62d-91c5a2051197-kube-api-access-v6jht" (OuterVolumeSpecName: "kube-api-access-v6jht") pod "4e5e8eae-11a8-4641-a62d-91c5a2051197" (UID: "4e5e8eae-11a8-4641-a62d-91c5a2051197"). InnerVolumeSpecName "kube-api-access-v6jht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.876362 4947 scope.go:117] "RemoveContainer" containerID="ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9" Dec 03 06:53:41 crc kubenswrapper[4947]: E1203 06:53:41.876965 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9\": container with ID starting with ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9 not found: ID does not exist" containerID="ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.877017 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9"} err="failed to get container status \"ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9\": rpc error: code = NotFound desc = could not find container \"ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9\": container with ID starting with ce23b299b094d30b8f407bed7bf985b85904998c3211591cfa31fda88df486a9 not found: ID does not exist" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.877055 4947 scope.go:117] "RemoveContainer" containerID="8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283" Dec 03 06:53:41 crc kubenswrapper[4947]: E1203 06:53:41.877435 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283\": container with ID starting with 8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283 not found: ID does not exist" containerID="8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.877464 
4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283"} err="failed to get container status \"8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283\": rpc error: code = NotFound desc = could not find container \"8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283\": container with ID starting with 8a856f4c4a07fa5d9997d3c754ac66e887f6d4793163b64a2b486ef8eeda5283 not found: ID does not exist" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.877579 4947 scope.go:117] "RemoveContainer" containerID="60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33" Dec 03 06:53:41 crc kubenswrapper[4947]: E1203 06:53:41.877949 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33\": container with ID starting with 60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33 not found: ID does not exist" containerID="60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.878007 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33"} err="failed to get container status \"60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33\": rpc error: code = NotFound desc = could not find container \"60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33\": container with ID starting with 60d45bdc3b52462e2e829952d2ab037ae9ba1b3a7b17c0c1e11f4bbb40ffcf33 not found: ID does not exist" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.878042 4947 scope.go:117] "RemoveContainer" containerID="c7a6caecdd000e9f1cebc541c913ca2eee6df4c068b80fce6f2fa830ff89981c" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 
06:53:41.890540 4947 scope.go:117] "RemoveContainer" containerID="d430978c174c1b13e5e608e98b2234f834bdb8df2bbadf2b72819ef52ba22423" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.898939 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkhh7\" (UniqueName: \"kubernetes.io/projected/e0e18759-9a71-4927-b7d6-ded426d626fa-kube-api-access-jkhh7\") pod \"e0e18759-9a71-4927-b7d6-ded426d626fa\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.898994 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-catalog-content\") pod \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899060 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b257084-363a-41ed-9bc8-838592867c51-marketplace-trusted-ca\") pod \"5b257084-363a-41ed-9bc8-838592867c51\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899088 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5b257084-363a-41ed-9bc8-838592867c51-marketplace-operator-metrics\") pod \"5b257084-363a-41ed-9bc8-838592867c51\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899126 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7lck\" (UniqueName: \"kubernetes.io/projected/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-kube-api-access-s7lck\") pod \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " Dec 03 06:53:41 crc 
kubenswrapper[4947]: I1203 06:53:41.899155 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-catalog-content\") pod \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899187 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-utilities\") pod \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899213 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-catalog-content\") pod \"e0e18759-9a71-4927-b7d6-ded426d626fa\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899241 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdzfk\" (UniqueName: \"kubernetes.io/projected/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-kube-api-access-gdzfk\") pod \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\" (UID: \"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899270 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-utilities\") pod \"e0e18759-9a71-4927-b7d6-ded426d626fa\" (UID: \"e0e18759-9a71-4927-b7d6-ded426d626fa\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899298 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-utilities\") 
pod \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\" (UID: \"bdf67ede-d6c7-4285-bef1-ef1fcc52c311\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899316 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6v9x\" (UniqueName: \"kubernetes.io/projected/5b257084-363a-41ed-9bc8-838592867c51-kube-api-access-p6v9x\") pod \"5b257084-363a-41ed-9bc8-838592867c51\" (UID: \"5b257084-363a-41ed-9bc8-838592867c51\") " Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899522 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6jht\" (UniqueName: \"kubernetes.io/projected/4e5e8eae-11a8-4641-a62d-91c5a2051197-kube-api-access-v6jht\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899538 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.899550 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5e8eae-11a8-4641-a62d-91c5a2051197-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.900150 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-utilities" (OuterVolumeSpecName: "utilities") pod "e0e18759-9a71-4927-b7d6-ded426d626fa" (UID: "e0e18759-9a71-4927-b7d6-ded426d626fa"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.900951 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-utilities" (OuterVolumeSpecName: "utilities") pod "bdf67ede-d6c7-4285-bef1-ef1fcc52c311" (UID: "bdf67ede-d6c7-4285-bef1-ef1fcc52c311"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.902118 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-utilities" (OuterVolumeSpecName: "utilities") pod "c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" (UID: "c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.905041 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b257084-363a-41ed-9bc8-838592867c51-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5b257084-363a-41ed-9bc8-838592867c51" (UID: "5b257084-363a-41ed-9bc8-838592867c51"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.905843 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-kube-api-access-s7lck" (OuterVolumeSpecName: "kube-api-access-s7lck") pod "bdf67ede-d6c7-4285-bef1-ef1fcc52c311" (UID: "bdf67ede-d6c7-4285-bef1-ef1fcc52c311"). InnerVolumeSpecName "kube-api-access-s7lck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.908895 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-kube-api-access-gdzfk" (OuterVolumeSpecName: "kube-api-access-gdzfk") pod "c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" (UID: "c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea"). InnerVolumeSpecName "kube-api-access-gdzfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.911279 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e18759-9a71-4927-b7d6-ded426d626fa-kube-api-access-jkhh7" (OuterVolumeSpecName: "kube-api-access-jkhh7") pod "e0e18759-9a71-4927-b7d6-ded426d626fa" (UID: "e0e18759-9a71-4927-b7d6-ded426d626fa"). InnerVolumeSpecName "kube-api-access-jkhh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.912545 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b257084-363a-41ed-9bc8-838592867c51-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5b257084-363a-41ed-9bc8-838592867c51" (UID: "5b257084-363a-41ed-9bc8-838592867c51"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.913296 4947 scope.go:117] "RemoveContainer" containerID="0f5b33141844046206759cd726aa8dbdfbc53dbb5438677f79d856832d9b8e3f" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.913825 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b257084-363a-41ed-9bc8-838592867c51-kube-api-access-p6v9x" (OuterVolumeSpecName: "kube-api-access-p6v9x") pod "5b257084-363a-41ed-9bc8-838592867c51" (UID: "5b257084-363a-41ed-9bc8-838592867c51"). InnerVolumeSpecName "kube-api-access-p6v9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.925471 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdf67ede-d6c7-4285-bef1-ef1fcc52c311" (UID: "bdf67ede-d6c7-4285-bef1-ef1fcc52c311"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.954976 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" (UID: "c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:41 crc kubenswrapper[4947]: I1203 06:53:41.997047 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001124 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001153 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6v9x\" (UniqueName: \"kubernetes.io/projected/5b257084-363a-41ed-9bc8-838592867c51-kube-api-access-p6v9x\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001164 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkhh7\" (UniqueName: \"kubernetes.io/projected/e0e18759-9a71-4927-b7d6-ded426d626fa-kube-api-access-jkhh7\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001173 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001183 4947 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b257084-363a-41ed-9bc8-838592867c51-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001194 4947 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5b257084-363a-41ed-9bc8-838592867c51-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001203 4947 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-s7lck\" (UniqueName: \"kubernetes.io/projected/bdf67ede-d6c7-4285-bef1-ef1fcc52c311-kube-api-access-s7lck\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001212 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001220 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001230 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdzfk\" (UniqueName: \"kubernetes.io/projected/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea-kube-api-access-gdzfk\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.001238 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.003378 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0e18759-9a71-4927-b7d6-ded426d626fa" (UID: "e0e18759-9a71-4927-b7d6-ded426d626fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.047893 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7hsps"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.052256 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7hsps"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.103134 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e18759-9a71-4927-b7d6-ded426d626fa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.249952 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.306416 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.497213 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.502592 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.523881 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.683866 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8fz6"] Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684243 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e18759-9a71-4927-b7d6-ded426d626fa" 
containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684282 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684311 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerName="extract-utilities" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684327 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerName="extract-utilities" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684350 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684368 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684388 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerName="extract-utilities" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684403 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerName="extract-utilities" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684429 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerName="extract-utilities" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684444 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerName="extract-utilities" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684462 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684477 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684535 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684553 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684576 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerName="extract-content" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684592 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerName="extract-content" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684612 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b257084-363a-41ed-9bc8-838592867c51" containerName="marketplace-operator" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684628 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b257084-363a-41ed-9bc8-838592867c51" containerName="marketplace-operator" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684644 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerName="extract-content" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684661 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerName="extract-content" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684686 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" 
containerName="extract-content" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684703 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerName="extract-content" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684727 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerName="extract-content" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684744 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerName="extract-content" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684764 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerName="extract-utilities" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684779 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerName="extract-utilities" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684795 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684809 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5e8eae-11a8-4641-a62d-91c5a2051197" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: E1203 06:53:42.684831 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2791830-ab75-4754-bc30-77a083b090c4" containerName="installer" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.684847 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2791830-ab75-4754-bc30-77a083b090c4" containerName="installer" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.685061 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5e8eae-11a8-4641-a62d-91c5a2051197" 
containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.685094 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e18759-9a71-4927-b7d6-ded426d626fa" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.685115 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2791830-ab75-4754-bc30-77a083b090c4" containerName="installer" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.685140 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b257084-363a-41ed-9bc8-838592867c51" containerName="marketplace-operator" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.685164 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.685190 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.685213 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" containerName="registry-server" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.685964 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.687734 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8fz6"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.721159 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/258aafb0-b050-4b10-b8b5-339e387f81ae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8fz6\" (UID: \"258aafb0-b050-4b10-b8b5-339e387f81ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.721224 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/258aafb0-b050-4b10-b8b5-339e387f81ae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8fz6\" (UID: \"258aafb0-b050-4b10-b8b5-339e387f81ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.721327 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gtmd\" (UniqueName: \"kubernetes.io/projected/258aafb0-b050-4b10-b8b5-339e387f81ae-kube-api-access-2gtmd\") pod \"marketplace-operator-79b997595-j8fz6\" (UID: \"258aafb0-b050-4b10-b8b5-339e387f81ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.742376 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bzgkn" event={"ID":"bdf67ede-d6c7-4285-bef1-ef1fcc52c311","Type":"ContainerDied","Data":"38672b8b5f603a0dbf80273e8c9eee4c0bdc7e2bfccecb2fbefe30de1ad2517c"} Dec 03 06:53:42 crc kubenswrapper[4947]: 
I1203 06:53:42.742417 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bzgkn" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.742428 4947 scope.go:117] "RemoveContainer" containerID="be420c0830045aa05a1fde7568f85f58e969e3e8cf2c09751732f1813f0be182" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.743517 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2pgs5" event={"ID":"c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea","Type":"ContainerDied","Data":"1cc0e37d6cf3b350f7a73cae46405b831ee06d5d4d67a5f7a1391177167c3b28"} Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.743614 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2pgs5" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.747655 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sf96j" event={"ID":"e0e18759-9a71-4927-b7d6-ded426d626fa","Type":"ContainerDied","Data":"2b9e8ce5eabc57cae4948783798c7d118da60d7a86f274176bfa847e141bbf3c"} Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.747730 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sf96j" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.761180 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" event={"ID":"5b257084-363a-41ed-9bc8-838592867c51","Type":"ContainerDied","Data":"1e0e4cf7af66f48e5f9b1da181f39364aedad0746b7ab44d8bac4be9173b97e4"} Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.761272 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wmgx" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.772959 4947 scope.go:117] "RemoveContainer" containerID="bde9ba56dbd278e980ca1996649c1ea2661876dd6c1b01ae03c1faef9cc20862" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.779955 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sf96j"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.802043 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sf96j"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.810320 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2pgs5"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.817956 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2pgs5"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.820383 4947 scope.go:117] "RemoveContainer" containerID="d82dc13f19f1799ef01d726acef96f5cc8157a0ac4d004c2911733e709ca750f" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.822560 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/258aafb0-b050-4b10-b8b5-339e387f81ae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8fz6\" (UID: \"258aafb0-b050-4b10-b8b5-339e387f81ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.822715 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/258aafb0-b050-4b10-b8b5-339e387f81ae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8fz6\" (UID: \"258aafb0-b050-4b10-b8b5-339e387f81ae\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.822745 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gtmd\" (UniqueName: \"kubernetes.io/projected/258aafb0-b050-4b10-b8b5-339e387f81ae-kube-api-access-2gtmd\") pod \"marketplace-operator-79b997595-j8fz6\" (UID: \"258aafb0-b050-4b10-b8b5-339e387f81ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.823996 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/258aafb0-b050-4b10-b8b5-339e387f81ae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8fz6\" (UID: \"258aafb0-b050-4b10-b8b5-339e387f81ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.824505 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wmgx"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.826418 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wmgx"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.830032 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzgkn"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.832685 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bzgkn"] Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.833586 4947 scope.go:117] "RemoveContainer" containerID="8ab7318dc2ae1d268fb0d7c03c18947f000ab77f56c22699e1d74460d2ba2baf" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.838415 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/258aafb0-b050-4b10-b8b5-339e387f81ae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8fz6\" (UID: \"258aafb0-b050-4b10-b8b5-339e387f81ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.846242 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gtmd\" (UniqueName: \"kubernetes.io/projected/258aafb0-b050-4b10-b8b5-339e387f81ae-kube-api-access-2gtmd\") pod \"marketplace-operator-79b997595-j8fz6\" (UID: \"258aafb0-b050-4b10-b8b5-339e387f81ae\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.878856 4947 scope.go:117] "RemoveContainer" containerID="8b7e943c87580bf125b99141bd7d3ad82ab29e8ec60215a0d37fbb974aa1f82b" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.895762 4947 scope.go:117] "RemoveContainer" containerID="5100387c0a32014d504db5fb4ea8382a392876bb9311a82db28b3e7fb91f007b" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.920725 4947 scope.go:117] "RemoveContainer" containerID="df7776808f2148295386e0ce68f44b1034a81bebd6482412d9966d63876f933c" Dec 03 06:53:42 crc kubenswrapper[4947]: I1203 06:53:42.974198 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.009150 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.092727 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5e8eae-11a8-4641-a62d-91c5a2051197" path="/var/lib/kubelet/pods/4e5e8eae-11a8-4641-a62d-91c5a2051197/volumes" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.093746 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b257084-363a-41ed-9bc8-838592867c51" path="/var/lib/kubelet/pods/5b257084-363a-41ed-9bc8-838592867c51/volumes" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.094274 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdf67ede-d6c7-4285-bef1-ef1fcc52c311" path="/var/lib/kubelet/pods/bdf67ede-d6c7-4285-bef1-ef1fcc52c311/volumes" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.095470 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea" path="/var/lib/kubelet/pods/c1e19e11-9f2f-414e-90f7-8e8c99f5d8ea/volumes" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.096203 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e18759-9a71-4927-b7d6-ded426d626fa" path="/var/lib/kubelet/pods/e0e18759-9a71-4927-b7d6-ded426d626fa/volumes" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.240993 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8fz6"] Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.354338 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.507312 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.552605 4947 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.598550 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 06:53:43 crc kubenswrapper[4947]: I1203 06:53:43.771384 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" event={"ID":"258aafb0-b050-4b10-b8b5-339e387f81ae","Type":"ContainerStarted","Data":"b85c1b2a653b445d493274ecb7fc4174a0830ceb1241b0106ee2c7fddccc32c2"} Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.134090 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.134196 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.241305 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.241414 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.241437 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.241459 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.241532 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.241555 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.241588 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.241669 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.241732 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.242031 4947 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.242073 4947 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.242097 4947 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.242124 4947 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.252839 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.343158 4947 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.787570 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" event={"ID":"258aafb0-b050-4b10-b8b5-339e387f81ae","Type":"ContainerStarted","Data":"c8d79ca8bed5b0f7dd6fae455ac9c916addc287d951fc4c71d225a0206a6cafa"} Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.787986 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.792107 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.792162 4947 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd" exitCode=137 Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.792236 4947 scope.go:117] "RemoveContainer" containerID="523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.792468 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.793114 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.804610 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j8fz6" podStartSLOduration=3.8045854869999998 podStartE2EDuration="3.804585487s" podCreationTimestamp="2025-12-03 06:53:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:44.803776015 +0000 UTC m=+286.064730451" watchObservedRunningTime="2025-12-03 06:53:44.804585487 +0000 UTC m=+286.065539933" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.815652 4947 scope.go:117] "RemoveContainer" containerID="523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd" Dec 03 06:53:44 crc kubenswrapper[4947]: E1203 06:53:44.816224 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd\": container with ID starting with 523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd not found: ID does not exist" containerID="523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd" Dec 03 06:53:44 crc kubenswrapper[4947]: I1203 06:53:44.816271 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd"} err="failed to get container status \"523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd\": rpc error: code = NotFound desc = could not find container \"523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd\": 
container with ID starting with 523ab64fe2ffa2c9110ede53e24acd5cc10a62356a3a910166f9aea08defe7bd not found: ID does not exist" Dec 03 06:53:45 crc kubenswrapper[4947]: I1203 06:53:45.089412 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 06:53:45 crc kubenswrapper[4947]: I1203 06:53:45.089721 4947 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 03 06:53:45 crc kubenswrapper[4947]: I1203 06:53:45.099164 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:53:45 crc kubenswrapper[4947]: I1203 06:53:45.099213 4947 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a8194252-d393-4002-a3f3-48178b57862d" Dec 03 06:53:45 crc kubenswrapper[4947]: I1203 06:53:45.103375 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:53:45 crc kubenswrapper[4947]: I1203 06:53:45.103403 4947 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a8194252-d393-4002-a3f3-48178b57862d" Dec 03 06:54:15 crc kubenswrapper[4947]: I1203 06:54:15.608411 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v9zdx"] Dec 03 06:54:15 crc kubenswrapper[4947]: I1203 06:54:15.609096 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" podUID="8f3f9b27-48d3-4b7e-963e-e2829a4a659e" containerName="controller-manager" 
containerID="cri-o://94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf" gracePeriod=30 Dec 03 06:54:15 crc kubenswrapper[4947]: I1203 06:54:15.734050 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q"] Dec 03 06:54:15 crc kubenswrapper[4947]: I1203 06:54:15.734254 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" podUID="2feb0620-965a-4f0e-98fd-bccefd87053b" containerName="route-controller-manager" containerID="cri-o://25200c6735ef4a116925d48980ffc122b1c87aed09fd99259a9c5677fb9f5b80" gracePeriod=30 Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:15.994190 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.000469 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmkwn\" (UniqueName: \"kubernetes.io/projected/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-kube-api-access-bmkwn\") pod \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.000535 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-config\") pod \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.000557 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-serving-cert\") pod \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " Dec 03 06:54:16 
crc kubenswrapper[4947]: I1203 06:54:16.000573 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-client-ca\") pod \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.000588 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-proxy-ca-bundles\") pod \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\" (UID: \"8f3f9b27-48d3-4b7e-963e-e2829a4a659e\") " Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.001642 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8f3f9b27-48d3-4b7e-963e-e2829a4a659e" (UID: "8f3f9b27-48d3-4b7e-963e-e2829a4a659e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.001971 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-client-ca" (OuterVolumeSpecName: "client-ca") pod "8f3f9b27-48d3-4b7e-963e-e2829a4a659e" (UID: "8f3f9b27-48d3-4b7e-963e-e2829a4a659e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.003623 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-config" (OuterVolumeSpecName: "config") pod "8f3f9b27-48d3-4b7e-963e-e2829a4a659e" (UID: "8f3f9b27-48d3-4b7e-963e-e2829a4a659e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.011878 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8f3f9b27-48d3-4b7e-963e-e2829a4a659e" (UID: "8f3f9b27-48d3-4b7e-963e-e2829a4a659e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.020328 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-kube-api-access-bmkwn" (OuterVolumeSpecName: "kube-api-access-bmkwn") pod "8f3f9b27-48d3-4b7e-963e-e2829a4a659e" (UID: "8f3f9b27-48d3-4b7e-963e-e2829a4a659e"). InnerVolumeSpecName "kube-api-access-bmkwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.070157 4947 generic.go:334] "Generic (PLEG): container finished" podID="2feb0620-965a-4f0e-98fd-bccefd87053b" containerID="25200c6735ef4a116925d48980ffc122b1c87aed09fd99259a9c5677fb9f5b80" exitCode=0 Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.070217 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" event={"ID":"2feb0620-965a-4f0e-98fd-bccefd87053b","Type":"ContainerDied","Data":"25200c6735ef4a116925d48980ffc122b1c87aed09fd99259a9c5677fb9f5b80"} Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.071835 4947 generic.go:334] "Generic (PLEG): container finished" podID="8f3f9b27-48d3-4b7e-963e-e2829a4a659e" containerID="94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf" exitCode=0 Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.071865 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" 
event={"ID":"8f3f9b27-48d3-4b7e-963e-e2829a4a659e","Type":"ContainerDied","Data":"94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf"} Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.071885 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" event={"ID":"8f3f9b27-48d3-4b7e-963e-e2829a4a659e","Type":"ContainerDied","Data":"b9d4a57bf2a9a9d51960103e98caea8ebf1314c619c4784000186c79de3c2e16"} Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.071905 4947 scope.go:117] "RemoveContainer" containerID="94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.072069 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v9zdx" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.085699 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.090614 4947 scope.go:117] "RemoveContainer" containerID="94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf" Dec 03 06:54:16 crc kubenswrapper[4947]: E1203 06:54:16.090944 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf\": container with ID starting with 94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf not found: ID does not exist" containerID="94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.090982 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf"} err="failed to 
get container status \"94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf\": rpc error: code = NotFound desc = could not find container \"94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf\": container with ID starting with 94515d1aad9107d2fb69029de2f01b00523004209b013cf1aac0d806317b53bf not found: ID does not exist" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.101936 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmkwn\" (UniqueName: \"kubernetes.io/projected/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-kube-api-access-bmkwn\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.101966 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.101977 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.101986 4947 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.101995 4947 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8f3f9b27-48d3-4b7e-963e-e2829a4a659e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.113030 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v9zdx"] Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.116274 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-v9zdx"] Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.202958 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-config\") pod \"2feb0620-965a-4f0e-98fd-bccefd87053b\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.203047 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-client-ca\") pod \"2feb0620-965a-4f0e-98fd-bccefd87053b\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.203074 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlrx2\" (UniqueName: \"kubernetes.io/projected/2feb0620-965a-4f0e-98fd-bccefd87053b-kube-api-access-mlrx2\") pod \"2feb0620-965a-4f0e-98fd-bccefd87053b\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.203100 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2feb0620-965a-4f0e-98fd-bccefd87053b-serving-cert\") pod \"2feb0620-965a-4f0e-98fd-bccefd87053b\" (UID: \"2feb0620-965a-4f0e-98fd-bccefd87053b\") " Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.203976 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-client-ca" (OuterVolumeSpecName: "client-ca") pod "2feb0620-965a-4f0e-98fd-bccefd87053b" (UID: "2feb0620-965a-4f0e-98fd-bccefd87053b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.204239 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-config" (OuterVolumeSpecName: "config") pod "2feb0620-965a-4f0e-98fd-bccefd87053b" (UID: "2feb0620-965a-4f0e-98fd-bccefd87053b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.206291 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2feb0620-965a-4f0e-98fd-bccefd87053b-kube-api-access-mlrx2" (OuterVolumeSpecName: "kube-api-access-mlrx2") pod "2feb0620-965a-4f0e-98fd-bccefd87053b" (UID: "2feb0620-965a-4f0e-98fd-bccefd87053b"). InnerVolumeSpecName "kube-api-access-mlrx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.206362 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2feb0620-965a-4f0e-98fd-bccefd87053b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2feb0620-965a-4f0e-98fd-bccefd87053b" (UID: "2feb0620-965a-4f0e-98fd-bccefd87053b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.303984 4947 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.304018 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlrx2\" (UniqueName: \"kubernetes.io/projected/2feb0620-965a-4f0e-98fd-bccefd87053b-kube-api-access-mlrx2\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.304031 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2feb0620-965a-4f0e-98fd-bccefd87053b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:16 crc kubenswrapper[4947]: I1203 06:54:16.304040 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2feb0620-965a-4f0e-98fd-bccefd87053b-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.079603 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.079605 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q" event={"ID":"2feb0620-965a-4f0e-98fd-bccefd87053b","Type":"ContainerDied","Data":"c0af378ce26d1b027559f4d50bcd0bc1bc6faddd35e07aec645e46a75280af43"} Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.079814 4947 scope.go:117] "RemoveContainer" containerID="25200c6735ef4a116925d48980ffc122b1c87aed09fd99259a9c5677fb9f5b80" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.095096 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f3f9b27-48d3-4b7e-963e-e2829a4a659e" path="/var/lib/kubelet/pods/8f3f9b27-48d3-4b7e-963e-e2829a4a659e/volumes" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.132696 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q"] Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.138816 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lgg8q"] Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.892752 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-xsclr"] Dec 03 06:54:17 crc kubenswrapper[4947]: E1203 06:54:17.893127 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f3f9b27-48d3-4b7e-963e-e2829a4a659e" containerName="controller-manager" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.893158 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f3f9b27-48d3-4b7e-963e-e2829a4a659e" containerName="controller-manager" Dec 03 06:54:17 crc kubenswrapper[4947]: E1203 06:54:17.893173 4947 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="2feb0620-965a-4f0e-98fd-bccefd87053b" containerName="route-controller-manager" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.893188 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2feb0620-965a-4f0e-98fd-bccefd87053b" containerName="route-controller-manager" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.893364 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f3f9b27-48d3-4b7e-963e-e2829a4a659e" containerName="controller-manager" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.893391 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2feb0620-965a-4f0e-98fd-bccefd87053b" containerName="route-controller-manager" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.894000 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.897456 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.897870 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.898042 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.901724 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.902084 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.904100 4947 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.906996 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn"] Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.910332 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.919211 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.920372 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.921917 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.923052 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.923256 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.923408 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-xsclr"] Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.937646 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.945262 4947 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 06:54:17 crc kubenswrapper[4947]: I1203 06:54:17.953991 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn"] Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.032156 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-config\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.032217 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-proxy-ca-bundles\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.032241 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhp65\" (UniqueName: \"kubernetes.io/projected/f56290ea-486d-4693-920d-a300f35249df-kube-api-access-mhp65\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.032283 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-config\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " 
pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.032408 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/877c3f1f-add4-48d2-8413-1b08e18ccd9c-serving-cert\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.032565 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h64r\" (UniqueName: \"kubernetes.io/projected/877c3f1f-add4-48d2-8413-1b08e18ccd9c-kube-api-access-9h64r\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.032635 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-client-ca\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.032706 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-client-ca\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.032734 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56290ea-486d-4693-920d-a300f35249df-serving-cert\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.134018 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-client-ca\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.134077 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-client-ca\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.134117 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56290ea-486d-4693-920d-a300f35249df-serving-cert\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.134151 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-config\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " 
pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.134169 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-proxy-ca-bundles\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.134190 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhp65\" (UniqueName: \"kubernetes.io/projected/f56290ea-486d-4693-920d-a300f35249df-kube-api-access-mhp65\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.134216 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-config\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.134246 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/877c3f1f-add4-48d2-8413-1b08e18ccd9c-serving-cert\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.134278 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h64r\" (UniqueName: 
\"kubernetes.io/projected/877c3f1f-add4-48d2-8413-1b08e18ccd9c-kube-api-access-9h64r\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.135344 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-client-ca\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.137038 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-config\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.137364 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-client-ca\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.137738 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-proxy-ca-bundles\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.139413 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-config\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.141736 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/877c3f1f-add4-48d2-8413-1b08e18ccd9c-serving-cert\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.147437 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56290ea-486d-4693-920d-a300f35249df-serving-cert\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.155410 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhp65\" (UniqueName: \"kubernetes.io/projected/f56290ea-486d-4693-920d-a300f35249df-kube-api-access-mhp65\") pod \"controller-manager-6d9f96d886-xsclr\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.156347 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h64r\" (UniqueName: \"kubernetes.io/projected/877c3f1f-add4-48d2-8413-1b08e18ccd9c-kube-api-access-9h64r\") pod \"route-controller-manager-5464fdff99-c8pzn\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " 
pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.212027 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.245055 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.530914 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn"] Dec 03 06:54:18 crc kubenswrapper[4947]: I1203 06:54:18.652679 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-xsclr"] Dec 03 06:54:18 crc kubenswrapper[4947]: W1203 06:54:18.662004 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf56290ea_486d_4693_920d_a300f35249df.slice/crio-53dab9ca6d486595412ebec7b249da892a40b13e7a24553067a415d5cb23b63f WatchSource:0}: Error finding container 53dab9ca6d486595412ebec7b249da892a40b13e7a24553067a415d5cb23b63f: Status 404 returned error can't find the container with id 53dab9ca6d486595412ebec7b249da892a40b13e7a24553067a415d5cb23b63f Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.089779 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2feb0620-965a-4f0e-98fd-bccefd87053b" path="/var/lib/kubelet/pods/2feb0620-965a-4f0e-98fd-bccefd87053b/volumes" Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.097896 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" 
event={"ID":"f56290ea-486d-4693-920d-a300f35249df","Type":"ContainerStarted","Data":"a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef"} Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.097965 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" event={"ID":"f56290ea-486d-4693-920d-a300f35249df","Type":"ContainerStarted","Data":"53dab9ca6d486595412ebec7b249da892a40b13e7a24553067a415d5cb23b63f"} Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.098223 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.099589 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" event={"ID":"877c3f1f-add4-48d2-8413-1b08e18ccd9c","Type":"ContainerStarted","Data":"b18b2d8338dca766f6a4cb4a127fec5a4b209de18059ff8957b205f7238db92e"} Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.099637 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" event={"ID":"877c3f1f-add4-48d2-8413-1b08e18ccd9c","Type":"ContainerStarted","Data":"603f207d376c02d4befbab066a663436b77a9dbbe195ff621f3cd472733bd1e5"} Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.099788 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.107221 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.145126 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.187506 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" podStartSLOduration=3.187461901 podStartE2EDuration="3.187461901s" podCreationTimestamp="2025-12-03 06:54:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:19.184439542 +0000 UTC m=+320.445393988" watchObservedRunningTime="2025-12-03 06:54:19.187461901 +0000 UTC m=+320.448416327" Dec 03 06:54:19 crc kubenswrapper[4947]: I1203 06:54:19.209739 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" podStartSLOduration=4.209709326 podStartE2EDuration="4.209709326s" podCreationTimestamp="2025-12-03 06:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:19.207277853 +0000 UTC m=+320.468232279" watchObservedRunningTime="2025-12-03 06:54:19.209709326 +0000 UTC m=+320.470663782" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.192210 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qwtfv"] Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.194399 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.196407 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.207685 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qwtfv"] Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.319429 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srnzs\" (UniqueName: \"kubernetes.io/projected/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-kube-api-access-srnzs\") pod \"redhat-operators-qwtfv\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.319566 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-utilities\") pod \"redhat-operators-qwtfv\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.319617 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-catalog-content\") pod \"redhat-operators-qwtfv\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.408564 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j45sc"] Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.410381 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.414115 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.414814 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j45sc"] Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.421210 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-utilities\") pod \"redhat-operators-qwtfv\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.421290 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-catalog-content\") pod \"redhat-operators-qwtfv\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.421332 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srnzs\" (UniqueName: \"kubernetes.io/projected/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-kube-api-access-srnzs\") pod \"redhat-operators-qwtfv\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.422268 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-utilities\") pod \"redhat-operators-qwtfv\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 
06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.422368 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-catalog-content\") pod \"redhat-operators-qwtfv\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.444831 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srnzs\" (UniqueName: \"kubernetes.io/projected/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-kube-api-access-srnzs\") pod \"redhat-operators-qwtfv\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.518084 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.522887 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4nj\" (UniqueName: \"kubernetes.io/projected/780d43c8-0dea-47ad-95cc-26801572c76d-kube-api-access-nl4nj\") pod \"redhat-marketplace-j45sc\" (UID: \"780d43c8-0dea-47ad-95cc-26801572c76d\") " pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.522928 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/780d43c8-0dea-47ad-95cc-26801572c76d-catalog-content\") pod \"redhat-marketplace-j45sc\" (UID: \"780d43c8-0dea-47ad-95cc-26801572c76d\") " pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.522974 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/780d43c8-0dea-47ad-95cc-26801572c76d-utilities\") pod \"redhat-marketplace-j45sc\" (UID: \"780d43c8-0dea-47ad-95cc-26801572c76d\") " pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.624632 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4nj\" (UniqueName: \"kubernetes.io/projected/780d43c8-0dea-47ad-95cc-26801572c76d-kube-api-access-nl4nj\") pod \"redhat-marketplace-j45sc\" (UID: \"780d43c8-0dea-47ad-95cc-26801572c76d\") " pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.624693 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/780d43c8-0dea-47ad-95cc-26801572c76d-catalog-content\") pod \"redhat-marketplace-j45sc\" (UID: \"780d43c8-0dea-47ad-95cc-26801572c76d\") " pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.624753 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/780d43c8-0dea-47ad-95cc-26801572c76d-utilities\") pod \"redhat-marketplace-j45sc\" (UID: \"780d43c8-0dea-47ad-95cc-26801572c76d\") " pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.625423 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/780d43c8-0dea-47ad-95cc-26801572c76d-utilities\") pod \"redhat-marketplace-j45sc\" (UID: \"780d43c8-0dea-47ad-95cc-26801572c76d\") " pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.625444 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/780d43c8-0dea-47ad-95cc-26801572c76d-catalog-content\") pod \"redhat-marketplace-j45sc\" (UID: \"780d43c8-0dea-47ad-95cc-26801572c76d\") " pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.651058 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4nj\" (UniqueName: \"kubernetes.io/projected/780d43c8-0dea-47ad-95cc-26801572c76d-kube-api-access-nl4nj\") pod \"redhat-marketplace-j45sc\" (UID: \"780d43c8-0dea-47ad-95cc-26801572c76d\") " pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.731162 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:24 crc kubenswrapper[4947]: I1203 06:54:24.953708 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qwtfv"] Dec 03 06:54:25 crc kubenswrapper[4947]: I1203 06:54:25.135838 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwtfv" event={"ID":"7287497a-3ee8-48ea-b85f-4b1d8a9927cf","Type":"ContainerStarted","Data":"f54f750d262b2e795aa81784c7f244515a91bfb43b41362017359200194cdcbd"} Dec 03 06:54:25 crc kubenswrapper[4947]: I1203 06:54:25.142889 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j45sc"] Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.146133 4947 generic.go:334] "Generic (PLEG): container finished" podID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerID="84bda53ef880c281eee4012d871c3c6c370e995de333c0d88d4d863ee8e194e4" exitCode=0 Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.146263 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwtfv" 
event={"ID":"7287497a-3ee8-48ea-b85f-4b1d8a9927cf","Type":"ContainerDied","Data":"84bda53ef880c281eee4012d871c3c6c370e995de333c0d88d4d863ee8e194e4"} Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.148239 4947 generic.go:334] "Generic (PLEG): container finished" podID="780d43c8-0dea-47ad-95cc-26801572c76d" containerID="a43a4f471bb575fbdd542db632fa0f752b4975f30d6b4be38da71070bc31b70e" exitCode=0 Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.148285 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j45sc" event={"ID":"780d43c8-0dea-47ad-95cc-26801572c76d","Type":"ContainerDied","Data":"a43a4f471bb575fbdd542db632fa0f752b4975f30d6b4be38da71070bc31b70e"} Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.148314 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j45sc" event={"ID":"780d43c8-0dea-47ad-95cc-26801572c76d","Type":"ContainerStarted","Data":"24b8383d2cce2010171212a5588e95f105e68c96aabcfea6095d5bdcbe2d3892"} Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.593210 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hqplp"] Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.596345 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.599571 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.617408 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqplp"] Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.653175 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2dpj\" (UniqueName: \"kubernetes.io/projected/be1f365b-f0e8-413f-ac93-6be2fc6282a8-kube-api-access-l2dpj\") pod \"certified-operators-hqplp\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.653291 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-catalog-content\") pod \"certified-operators-hqplp\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.653337 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-utilities\") pod \"certified-operators-hqplp\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.756750 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-catalog-content\") pod \"certified-operators-hqplp\" (UID: 
\"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.756832 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-utilities\") pod \"certified-operators-hqplp\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.756889 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2dpj\" (UniqueName: \"kubernetes.io/projected/be1f365b-f0e8-413f-ac93-6be2fc6282a8-kube-api-access-l2dpj\") pod \"certified-operators-hqplp\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.757470 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-utilities\") pod \"certified-operators-hqplp\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.758099 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-catalog-content\") pod \"certified-operators-hqplp\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.789327 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2dpj\" (UniqueName: \"kubernetes.io/projected/be1f365b-f0e8-413f-ac93-6be2fc6282a8-kube-api-access-l2dpj\") pod \"certified-operators-hqplp\" (UID: 
\"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.790807 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sp9x2"] Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.792461 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.795970 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.807816 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sp9x2"] Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.872344 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnf2w\" (UniqueName: \"kubernetes.io/projected/db662847-42e5-472b-b370-1d537a258211-kube-api-access-rnf2w\") pod \"community-operators-sp9x2\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") " pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.872466 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-catalog-content\") pod \"community-operators-sp9x2\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") " pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.872544 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-utilities\") pod \"community-operators-sp9x2\" (UID: 
\"db662847-42e5-472b-b370-1d537a258211\") " pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.916446 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.973410 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-catalog-content\") pod \"community-operators-sp9x2\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") " pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.973461 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-utilities\") pod \"community-operators-sp9x2\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") " pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.973556 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnf2w\" (UniqueName: \"kubernetes.io/projected/db662847-42e5-472b-b370-1d537a258211-kube-api-access-rnf2w\") pod \"community-operators-sp9x2\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") " pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.973841 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-catalog-content\") pod \"community-operators-sp9x2\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") " pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.973950 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-utilities\") pod \"community-operators-sp9x2\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") " pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:26 crc kubenswrapper[4947]: I1203 06:54:26.990191 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnf2w\" (UniqueName: \"kubernetes.io/projected/db662847-42e5-472b-b370-1d537a258211-kube-api-access-rnf2w\") pod \"community-operators-sp9x2\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") " pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:27 crc kubenswrapper[4947]: I1203 06:54:27.124704 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:27 crc kubenswrapper[4947]: I1203 06:54:27.389447 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqplp"] Dec 03 06:54:27 crc kubenswrapper[4947]: I1203 06:54:27.595986 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sp9x2"] Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.160423 4947 generic.go:334] "Generic (PLEG): container finished" podID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerID="9ad641e180c0e3ed8e3aefdf96bc3e35b050ab9b17d6a3df2fb20f7d71cbb0b8" exitCode=0 Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.160684 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqplp" event={"ID":"be1f365b-f0e8-413f-ac93-6be2fc6282a8","Type":"ContainerDied","Data":"9ad641e180c0e3ed8e3aefdf96bc3e35b050ab9b17d6a3df2fb20f7d71cbb0b8"} Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.160830 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqplp" 
event={"ID":"be1f365b-f0e8-413f-ac93-6be2fc6282a8","Type":"ContainerStarted","Data":"0d93e14fd615618355de39579c2f58a684bf55fd468252ac5da351e26d0a6ca3"} Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.163738 4947 generic.go:334] "Generic (PLEG): container finished" podID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerID="c6f9f5cff93bd88d3cb996ce2d7e99563ff5e9e9593d378b97760c81ac139696" exitCode=0 Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.163805 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwtfv" event={"ID":"7287497a-3ee8-48ea-b85f-4b1d8a9927cf","Type":"ContainerDied","Data":"c6f9f5cff93bd88d3cb996ce2d7e99563ff5e9e9593d378b97760c81ac139696"} Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.165874 4947 generic.go:334] "Generic (PLEG): container finished" podID="db662847-42e5-472b-b370-1d537a258211" containerID="3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8" exitCode=0 Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.165934 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp9x2" event={"ID":"db662847-42e5-472b-b370-1d537a258211","Type":"ContainerDied","Data":"3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8"} Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.165954 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp9x2" event={"ID":"db662847-42e5-472b-b370-1d537a258211","Type":"ContainerStarted","Data":"1a24fc132d35b8fdfe6b94d699319287bbd545f0b915a4a819faa63b68bf1d58"} Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.171243 4947 generic.go:334] "Generic (PLEG): container finished" podID="780d43c8-0dea-47ad-95cc-26801572c76d" containerID="8e03b811823e334251c23ec3e9ed9085bf71643e1741f6e678c0665ea1402c3c" exitCode=0 Dec 03 06:54:28 crc kubenswrapper[4947]: I1203 06:54:28.171281 4947 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-j45sc" event={"ID":"780d43c8-0dea-47ad-95cc-26801572c76d","Type":"ContainerDied","Data":"8e03b811823e334251c23ec3e9ed9085bf71643e1741f6e678c0665ea1402c3c"} Dec 03 06:54:29 crc kubenswrapper[4947]: I1203 06:54:29.178331 4947 generic.go:334] "Generic (PLEG): container finished" podID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerID="e2e4c79d4ca3cc76c2d236e2edebe8f791942eb804809abc84594084e7b54694" exitCode=0 Dec 03 06:54:29 crc kubenswrapper[4947]: I1203 06:54:29.178403 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqplp" event={"ID":"be1f365b-f0e8-413f-ac93-6be2fc6282a8","Type":"ContainerDied","Data":"e2e4c79d4ca3cc76c2d236e2edebe8f791942eb804809abc84594084e7b54694"} Dec 03 06:54:29 crc kubenswrapper[4947]: I1203 06:54:29.180531 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwtfv" event={"ID":"7287497a-3ee8-48ea-b85f-4b1d8a9927cf","Type":"ContainerStarted","Data":"b191938a393f368c92afd0240563960a6a3b2f566fb1c20b3c791242636734cc"} Dec 03 06:54:29 crc kubenswrapper[4947]: I1203 06:54:29.182415 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp9x2" event={"ID":"db662847-42e5-472b-b370-1d537a258211","Type":"ContainerStarted","Data":"7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446"} Dec 03 06:54:29 crc kubenswrapper[4947]: I1203 06:54:29.185713 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j45sc" event={"ID":"780d43c8-0dea-47ad-95cc-26801572c76d","Type":"ContainerStarted","Data":"adce6fe2f3e5a7a68e619e3e929007880e17a4bebe9ab17eb773b3fdd89a126e"} Dec 03 06:54:29 crc kubenswrapper[4947]: I1203 06:54:29.217606 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qwtfv" podStartSLOduration=2.787667838 
podStartE2EDuration="5.217590894s" podCreationTimestamp="2025-12-03 06:54:24 +0000 UTC" firstStartedPulling="2025-12-03 06:54:26.14970254 +0000 UTC m=+327.410656966" lastFinishedPulling="2025-12-03 06:54:28.579625586 +0000 UTC m=+329.840580022" observedRunningTime="2025-12-03 06:54:29.214976893 +0000 UTC m=+330.475931359" watchObservedRunningTime="2025-12-03 06:54:29.217590894 +0000 UTC m=+330.478545320" Dec 03 06:54:29 crc kubenswrapper[4947]: I1203 06:54:29.231164 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j45sc" podStartSLOduration=2.769873387 podStartE2EDuration="5.231141541s" podCreationTimestamp="2025-12-03 06:54:24 +0000 UTC" firstStartedPulling="2025-12-03 06:54:26.15046193 +0000 UTC m=+327.411416356" lastFinishedPulling="2025-12-03 06:54:28.611730084 +0000 UTC m=+329.872684510" observedRunningTime="2025-12-03 06:54:29.228477049 +0000 UTC m=+330.489431485" watchObservedRunningTime="2025-12-03 06:54:29.231141541 +0000 UTC m=+330.492095957" Dec 03 06:54:30 crc kubenswrapper[4947]: I1203 06:54:30.087019 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:54:30 crc kubenswrapper[4947]: I1203 06:54:30.087516 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:54:30 crc kubenswrapper[4947]: I1203 06:54:30.194202 4947 generic.go:334] "Generic (PLEG): container finished" podID="db662847-42e5-472b-b370-1d537a258211" 
containerID="7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446" exitCode=0 Dec 03 06:54:30 crc kubenswrapper[4947]: I1203 06:54:30.194289 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp9x2" event={"ID":"db662847-42e5-472b-b370-1d537a258211","Type":"ContainerDied","Data":"7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446"} Dec 03 06:54:30 crc kubenswrapper[4947]: I1203 06:54:30.202681 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqplp" event={"ID":"be1f365b-f0e8-413f-ac93-6be2fc6282a8","Type":"ContainerStarted","Data":"e38bd3f1fe6049d3622ace19445979e3ef2446e14f2908ecd8f836a736bb8536"} Dec 03 06:54:30 crc kubenswrapper[4947]: I1203 06:54:30.246158 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hqplp" podStartSLOduration=2.802427629 podStartE2EDuration="4.246141573s" podCreationTimestamp="2025-12-03 06:54:26 +0000 UTC" firstStartedPulling="2025-12-03 06:54:28.162461707 +0000 UTC m=+329.423416133" lastFinishedPulling="2025-12-03 06:54:29.606175651 +0000 UTC m=+330.867130077" observedRunningTime="2025-12-03 06:54:30.242822262 +0000 UTC m=+331.503776688" watchObservedRunningTime="2025-12-03 06:54:30.246141573 +0000 UTC m=+331.507096009" Dec 03 06:54:31 crc kubenswrapper[4947]: I1203 06:54:31.209096 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp9x2" event={"ID":"db662847-42e5-472b-b370-1d537a258211","Type":"ContainerStarted","Data":"4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0"} Dec 03 06:54:31 crc kubenswrapper[4947]: I1203 06:54:31.228448 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sp9x2" podStartSLOduration=2.603212814 podStartE2EDuration="5.22843067s" podCreationTimestamp="2025-12-03 06:54:26 +0000 UTC" 
firstStartedPulling="2025-12-03 06:54:28.168394318 +0000 UTC m=+329.429348744" lastFinishedPulling="2025-12-03 06:54:30.793612174 +0000 UTC m=+332.054566600" observedRunningTime="2025-12-03 06:54:31.226541209 +0000 UTC m=+332.487495635" watchObservedRunningTime="2025-12-03 06:54:31.22843067 +0000 UTC m=+332.489385106" Dec 03 06:54:34 crc kubenswrapper[4947]: I1203 06:54:34.518304 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:34 crc kubenswrapper[4947]: I1203 06:54:34.518679 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:34 crc kubenswrapper[4947]: I1203 06:54:34.567272 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:34 crc kubenswrapper[4947]: I1203 06:54:34.732575 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:34 crc kubenswrapper[4947]: I1203 06:54:34.732641 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:34 crc kubenswrapper[4947]: I1203 06:54:34.810826 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:35 crc kubenswrapper[4947]: I1203 06:54:35.274625 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 06:54:35 crc kubenswrapper[4947]: I1203 06:54:35.281562 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j45sc" Dec 03 06:54:35 crc kubenswrapper[4947]: I1203 06:54:35.649257 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn"] Dec 03 06:54:35 crc kubenswrapper[4947]: I1203 06:54:35.649691 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" podUID="877c3f1f-add4-48d2-8413-1b08e18ccd9c" containerName="route-controller-manager" containerID="cri-o://b18b2d8338dca766f6a4cb4a127fec5a4b209de18059ff8957b205f7238db92e" gracePeriod=30 Dec 03 06:54:36 crc kubenswrapper[4947]: I1203 06:54:36.917530 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:36 crc kubenswrapper[4947]: I1203 06:54:36.918385 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:36 crc kubenswrapper[4947]: I1203 06:54:36.959809 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:37 crc kubenswrapper[4947]: I1203 06:54:37.125118 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:37 crc kubenswrapper[4947]: I1203 06:54:37.125176 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:37 crc kubenswrapper[4947]: I1203 06:54:37.191739 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:37 crc kubenswrapper[4947]: I1203 06:54:37.293867 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hqplp" Dec 03 06:54:37 crc kubenswrapper[4947]: I1203 06:54:37.300349 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-sp9x2" Dec 03 06:54:38 crc kubenswrapper[4947]: I1203 06:54:38.247130 4947 patch_prober.go:28] interesting pod/route-controller-manager-5464fdff99-c8pzn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Dec 03 06:54:38 crc kubenswrapper[4947]: I1203 06:54:38.248637 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" podUID="877c3f1f-add4-48d2-8413-1b08e18ccd9c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Dec 03 06:54:39 crc kubenswrapper[4947]: I1203 06:54:39.261608 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" event={"ID":"877c3f1f-add4-48d2-8413-1b08e18ccd9c","Type":"ContainerDied","Data":"b18b2d8338dca766f6a4cb4a127fec5a4b209de18059ff8957b205f7238db92e"} Dec 03 06:54:39 crc kubenswrapper[4947]: I1203 06:54:39.261634 4947 generic.go:334] "Generic (PLEG): container finished" podID="877c3f1f-add4-48d2-8413-1b08e18ccd9c" containerID="b18b2d8338dca766f6a4cb4a127fec5a4b209de18059ff8957b205f7238db92e" exitCode=0 Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.226289 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.258109 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl"] Dec 03 06:54:40 crc kubenswrapper[4947]: E1203 06:54:40.258399 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877c3f1f-add4-48d2-8413-1b08e18ccd9c" containerName="route-controller-manager" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.258415 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="877c3f1f-add4-48d2-8413-1b08e18ccd9c" containerName="route-controller-manager" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.258566 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="877c3f1f-add4-48d2-8413-1b08e18ccd9c" containerName="route-controller-manager" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.259016 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.263160 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl"] Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.271886 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" event={"ID":"877c3f1f-add4-48d2-8413-1b08e18ccd9c","Type":"ContainerDied","Data":"603f207d376c02d4befbab066a663436b77a9dbbe195ff621f3cd472733bd1e5"} Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.271920 4947 scope.go:117] "RemoveContainer" containerID="b18b2d8338dca766f6a4cb4a127fec5a4b209de18059ff8957b205f7238db92e" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.272023 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.375745 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-client-ca\") pod \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.376220 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/877c3f1f-add4-48d2-8413-1b08e18ccd9c-serving-cert\") pod \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.376299 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-config\") 
pod \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.376339 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h64r\" (UniqueName: \"kubernetes.io/projected/877c3f1f-add4-48d2-8413-1b08e18ccd9c-kube-api-access-9h64r\") pod \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\" (UID: \"877c3f1f-add4-48d2-8413-1b08e18ccd9c\") " Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.376546 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-config\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.376583 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-client-ca\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.376780 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpqxk\" (UniqueName: \"kubernetes.io/projected/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-kube-api-access-dpqxk\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.376868 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-client-ca" (OuterVolumeSpecName: "client-ca") pod "877c3f1f-add4-48d2-8413-1b08e18ccd9c" (UID: "877c3f1f-add4-48d2-8413-1b08e18ccd9c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.376880 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-serving-cert\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.376958 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-config" (OuterVolumeSpecName: "config") pod "877c3f1f-add4-48d2-8413-1b08e18ccd9c" (UID: "877c3f1f-add4-48d2-8413-1b08e18ccd9c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.377153 4947 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.377175 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/877c3f1f-add4-48d2-8413-1b08e18ccd9c-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.381136 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877c3f1f-add4-48d2-8413-1b08e18ccd9c-kube-api-access-9h64r" (OuterVolumeSpecName: "kube-api-access-9h64r") pod "877c3f1f-add4-48d2-8413-1b08e18ccd9c" (UID: "877c3f1f-add4-48d2-8413-1b08e18ccd9c"). InnerVolumeSpecName "kube-api-access-9h64r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.381229 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877c3f1f-add4-48d2-8413-1b08e18ccd9c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "877c3f1f-add4-48d2-8413-1b08e18ccd9c" (UID: "877c3f1f-add4-48d2-8413-1b08e18ccd9c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.478690 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-config\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.478779 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-client-ca\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.478878 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpqxk\" (UniqueName: \"kubernetes.io/projected/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-kube-api-access-dpqxk\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.478950 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-serving-cert\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.479037 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/877c3f1f-add4-48d2-8413-1b08e18ccd9c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.479066 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h64r\" (UniqueName: \"kubernetes.io/projected/877c3f1f-add4-48d2-8413-1b08e18ccd9c-kube-api-access-9h64r\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.480458 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-config\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.480570 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-client-ca\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.486115 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-serving-cert\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: \"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.513321 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpqxk\" (UniqueName: \"kubernetes.io/projected/f009dd7c-40cd-4318-a0d6-62ba6996dbc2-kube-api-access-dpqxk\") pod \"route-controller-manager-5b4689d77-qqrgl\" (UID: 
\"f009dd7c-40cd-4318-a0d6-62ba6996dbc2\") " pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.578271 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.613600 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn"] Dec 03 06:54:40 crc kubenswrapper[4947]: I1203 06:54:40.620916 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5464fdff99-c8pzn"] Dec 03 06:54:41 crc kubenswrapper[4947]: I1203 06:54:41.038194 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl"] Dec 03 06:54:41 crc kubenswrapper[4947]: W1203 06:54:41.049912 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf009dd7c_40cd_4318_a0d6_62ba6996dbc2.slice/crio-4b421a41220774c24bb861b6594523dab57879ed0fc252d0bc38aaa511ca67aa WatchSource:0}: Error finding container 4b421a41220774c24bb861b6594523dab57879ed0fc252d0bc38aaa511ca67aa: Status 404 returned error can't find the container with id 4b421a41220774c24bb861b6594523dab57879ed0fc252d0bc38aaa511ca67aa Dec 03 06:54:41 crc kubenswrapper[4947]: I1203 06:54:41.091842 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877c3f1f-add4-48d2-8413-1b08e18ccd9c" path="/var/lib/kubelet/pods/877c3f1f-add4-48d2-8413-1b08e18ccd9c/volumes" Dec 03 06:54:41 crc kubenswrapper[4947]: I1203 06:54:41.285613 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" 
event={"ID":"f009dd7c-40cd-4318-a0d6-62ba6996dbc2","Type":"ContainerStarted","Data":"4b421a41220774c24bb861b6594523dab57879ed0fc252d0bc38aaa511ca67aa"} Dec 03 06:54:42 crc kubenswrapper[4947]: I1203 06:54:42.295149 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" event={"ID":"f009dd7c-40cd-4318-a0d6-62ba6996dbc2","Type":"ContainerStarted","Data":"0d6594811b56998352f8a10cf618a1840c5c215d8e951874385675d08e0945a2"} Dec 03 06:54:42 crc kubenswrapper[4947]: I1203 06:54:42.296039 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:42 crc kubenswrapper[4947]: I1203 06:54:42.301061 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" Dec 03 06:54:42 crc kubenswrapper[4947]: I1203 06:54:42.314705 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b4689d77-qqrgl" podStartSLOduration=7.314671788 podStartE2EDuration="7.314671788s" podCreationTimestamp="2025-12-03 06:54:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:42.311765279 +0000 UTC m=+343.572719705" watchObservedRunningTime="2025-12-03 06:54:42.314671788 +0000 UTC m=+343.575626244" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.258144 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gj2qb"] Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.260842 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.283338 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gj2qb"] Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.386244 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e123b9f-9ad0-421e-89a1-65e9de33787b-registry-tls\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.386318 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.386359 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7z6n\" (UniqueName: \"kubernetes.io/projected/8e123b9f-9ad0-421e-89a1-65e9de33787b-kube-api-access-p7z6n\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.386389 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e123b9f-9ad0-421e-89a1-65e9de33787b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.386424 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e123b9f-9ad0-421e-89a1-65e9de33787b-registry-certificates\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.386754 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e123b9f-9ad0-421e-89a1-65e9de33787b-trusted-ca\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.387038 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e123b9f-9ad0-421e-89a1-65e9de33787b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.387102 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e123b9f-9ad0-421e-89a1-65e9de33787b-bound-sa-token\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.417659 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.488787 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7z6n\" (UniqueName: \"kubernetes.io/projected/8e123b9f-9ad0-421e-89a1-65e9de33787b-kube-api-access-p7z6n\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.488853 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e123b9f-9ad0-421e-89a1-65e9de33787b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.488882 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e123b9f-9ad0-421e-89a1-65e9de33787b-registry-certificates\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.488908 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e123b9f-9ad0-421e-89a1-65e9de33787b-trusted-ca\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.488962 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e123b9f-9ad0-421e-89a1-65e9de33787b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.488985 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e123b9f-9ad0-421e-89a1-65e9de33787b-bound-sa-token\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.489038 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e123b9f-9ad0-421e-89a1-65e9de33787b-registry-tls\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.489858 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e123b9f-9ad0-421e-89a1-65e9de33787b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.490923 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e123b9f-9ad0-421e-89a1-65e9de33787b-trusted-ca\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc 
kubenswrapper[4947]: I1203 06:54:54.491152 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e123b9f-9ad0-421e-89a1-65e9de33787b-registry-certificates\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.499454 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e123b9f-9ad0-421e-89a1-65e9de33787b-registry-tls\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.509605 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e123b9f-9ad0-421e-89a1-65e9de33787b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.513848 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7z6n\" (UniqueName: \"kubernetes.io/projected/8e123b9f-9ad0-421e-89a1-65e9de33787b-kube-api-access-p7z6n\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.516888 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e123b9f-9ad0-421e-89a1-65e9de33787b-bound-sa-token\") pod \"image-registry-66df7c8f76-gj2qb\" (UID: \"8e123b9f-9ad0-421e-89a1-65e9de33787b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:54 crc kubenswrapper[4947]: I1203 06:54:54.592917 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:55 crc kubenswrapper[4947]: I1203 06:54:55.018850 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-gj2qb"] Dec 03 06:54:55 crc kubenswrapper[4947]: W1203 06:54:55.022854 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e123b9f_9ad0_421e_89a1_65e9de33787b.slice/crio-1327f3c3e0c4f53f35f7605adaa2616e2e36f4bdea425fe2a94b38eaea21de27 WatchSource:0}: Error finding container 1327f3c3e0c4f53f35f7605adaa2616e2e36f4bdea425fe2a94b38eaea21de27: Status 404 returned error can't find the container with id 1327f3c3e0c4f53f35f7605adaa2616e2e36f4bdea425fe2a94b38eaea21de27 Dec 03 06:54:55 crc kubenswrapper[4947]: I1203 06:54:55.366567 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" event={"ID":"8e123b9f-9ad0-421e-89a1-65e9de33787b","Type":"ContainerStarted","Data":"67ec6f515fed8efba2ca026085af1668273d9c08d9e6cf712a47ef39ac013fc4"} Dec 03 06:54:55 crc kubenswrapper[4947]: I1203 06:54:55.366614 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" event={"ID":"8e123b9f-9ad0-421e-89a1-65e9de33787b","Type":"ContainerStarted","Data":"1327f3c3e0c4f53f35f7605adaa2616e2e36f4bdea425fe2a94b38eaea21de27"} Dec 03 06:54:55 crc kubenswrapper[4947]: I1203 06:54:55.367749 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:54:55 crc kubenswrapper[4947]: I1203 06:54:55.391448 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" podStartSLOduration=1.39142545 podStartE2EDuration="1.39142545s" podCreationTimestamp="2025-12-03 06:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:55.388419879 +0000 UTC m=+356.649374325" watchObservedRunningTime="2025-12-03 06:54:55.39142545 +0000 UTC m=+356.652379876" Dec 03 06:55:00 crc kubenswrapper[4947]: I1203 06:55:00.086181 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:55:00 crc kubenswrapper[4947]: I1203 06:55:00.086671 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:55:14 crc kubenswrapper[4947]: I1203 06:55:14.598419 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-gj2qb" Dec 03 06:55:14 crc kubenswrapper[4947]: I1203 06:55:14.662208 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mh92n"] Dec 03 06:55:15 crc kubenswrapper[4947]: I1203 06:55:15.642732 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-xsclr"] Dec 03 06:55:15 crc kubenswrapper[4947]: I1203 06:55:15.643091 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" 
podUID="f56290ea-486d-4693-920d-a300f35249df" containerName="controller-manager" containerID="cri-o://a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef" gracePeriod=30 Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.079641 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.199804 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-proxy-ca-bundles\") pod \"f56290ea-486d-4693-920d-a300f35249df\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.199923 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56290ea-486d-4693-920d-a300f35249df-serving-cert\") pod \"f56290ea-486d-4693-920d-a300f35249df\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.199986 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhp65\" (UniqueName: \"kubernetes.io/projected/f56290ea-486d-4693-920d-a300f35249df-kube-api-access-mhp65\") pod \"f56290ea-486d-4693-920d-a300f35249df\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.200071 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-config\") pod \"f56290ea-486d-4693-920d-a300f35249df\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.200100 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-client-ca\") pod \"f56290ea-486d-4693-920d-a300f35249df\" (UID: \"f56290ea-486d-4693-920d-a300f35249df\") " Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.201191 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-client-ca" (OuterVolumeSpecName: "client-ca") pod "f56290ea-486d-4693-920d-a300f35249df" (UID: "f56290ea-486d-4693-920d-a300f35249df"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.201243 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f56290ea-486d-4693-920d-a300f35249df" (UID: "f56290ea-486d-4693-920d-a300f35249df"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.201705 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-config" (OuterVolumeSpecName: "config") pod "f56290ea-486d-4693-920d-a300f35249df" (UID: "f56290ea-486d-4693-920d-a300f35249df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.207754 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f56290ea-486d-4693-920d-a300f35249df-kube-api-access-mhp65" (OuterVolumeSpecName: "kube-api-access-mhp65") pod "f56290ea-486d-4693-920d-a300f35249df" (UID: "f56290ea-486d-4693-920d-a300f35249df"). InnerVolumeSpecName "kube-api-access-mhp65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.211685 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f56290ea-486d-4693-920d-a300f35249df-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f56290ea-486d-4693-920d-a300f35249df" (UID: "f56290ea-486d-4693-920d-a300f35249df"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.302086 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.302163 4947 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.302176 4947 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f56290ea-486d-4693-920d-a300f35249df-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.302189 4947 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f56290ea-486d-4693-920d-a300f35249df-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.302204 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhp65\" (UniqueName: \"kubernetes.io/projected/f56290ea-486d-4693-920d-a300f35249df-kube-api-access-mhp65\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.502870 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.502890 4947 generic.go:334] "Generic (PLEG): container finished" podID="f56290ea-486d-4693-920d-a300f35249df" containerID="a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef" exitCode=0 Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.502962 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" event={"ID":"f56290ea-486d-4693-920d-a300f35249df","Type":"ContainerDied","Data":"a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef"} Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.503041 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d9f96d886-xsclr" event={"ID":"f56290ea-486d-4693-920d-a300f35249df","Type":"ContainerDied","Data":"53dab9ca6d486595412ebec7b249da892a40b13e7a24553067a415d5cb23b63f"} Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.503083 4947 scope.go:117] "RemoveContainer" containerID="a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.534003 4947 scope.go:117] "RemoveContainer" containerID="a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef" Dec 03 06:55:16 crc kubenswrapper[4947]: E1203 06:55:16.534477 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef\": container with ID starting with a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef not found: ID does not exist" containerID="a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.534533 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef"} err="failed to get container status \"a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef\": rpc error: code = NotFound desc = could not find container \"a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef\": container with ID starting with a6f0a84d16fe104fc9be4201e07d1405201f1b6b92badccfd292b407d1a200ef not found: ID does not exist" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.549112 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-xsclr"] Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.554907 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d9f96d886-xsclr"] Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.945018 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-854b4dfc55-vdh26"] Dec 03 06:55:16 crc kubenswrapper[4947]: E1203 06:55:16.945672 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56290ea-486d-4693-920d-a300f35249df" containerName="controller-manager" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.945703 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56290ea-486d-4693-920d-a300f35249df" containerName="controller-manager" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.946402 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f56290ea-486d-4693-920d-a300f35249df" containerName="controller-manager" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.947076 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.953350 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.954473 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.954541 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.960723 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.960753 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.960973 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.971713 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-854b4dfc55-vdh26"] Dec 03 06:55:16 crc kubenswrapper[4947]: I1203 06:55:16.985300 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.091893 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f56290ea-486d-4693-920d-a300f35249df" path="/var/lib/kubelet/pods/f56290ea-486d-4693-920d-a300f35249df/volumes" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.112754 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-proxy-ca-bundles\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.112804 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-client-ca\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.112836 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-config\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.112872 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmmls\" (UniqueName: \"kubernetes.io/projected/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-kube-api-access-zmmls\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.113056 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-serving-cert\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " 
pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.214326 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-client-ca\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.214363 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-proxy-ca-bundles\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.214393 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-config\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.214422 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmmls\" (UniqueName: \"kubernetes.io/projected/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-kube-api-access-zmmls\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.214458 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-serving-cert\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.215829 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-client-ca\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.215847 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-proxy-ca-bundles\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.217812 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-serving-cert\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.217994 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-config\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.245692 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-zmmls\" (UniqueName: \"kubernetes.io/projected/ea576e5c-87c5-455c-bdfe-d07ebcbd197d-kube-api-access-zmmls\") pod \"controller-manager-854b4dfc55-vdh26\" (UID: \"ea576e5c-87c5-455c-bdfe-d07ebcbd197d\") " pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.290142 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.488949 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-854b4dfc55-vdh26"] Dec 03 06:55:17 crc kubenswrapper[4947]: I1203 06:55:17.515270 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" event={"ID":"ea576e5c-87c5-455c-bdfe-d07ebcbd197d","Type":"ContainerStarted","Data":"66eb7676f75f6d77cc68a7807fa29b8e58f492cbdbee2541071ea78e54ef08fe"} Dec 03 06:55:18 crc kubenswrapper[4947]: I1203 06:55:18.522167 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" event={"ID":"ea576e5c-87c5-455c-bdfe-d07ebcbd197d","Type":"ContainerStarted","Data":"4cc475ecff871a0bf43118d9c3cf3edf7235ac1325b58137f1af080e39723b43"} Dec 03 06:55:18 crc kubenswrapper[4947]: I1203 06:55:18.522654 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:18 crc kubenswrapper[4947]: I1203 06:55:18.527339 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" Dec 03 06:55:18 crc kubenswrapper[4947]: I1203 06:55:18.546120 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-854b4dfc55-vdh26" 
podStartSLOduration=3.546101601 podStartE2EDuration="3.546101601s" podCreationTimestamp="2025-12-03 06:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:18.541940529 +0000 UTC m=+379.802894945" watchObservedRunningTime="2025-12-03 06:55:18.546101601 +0000 UTC m=+379.807056027" Dec 03 06:55:30 crc kubenswrapper[4947]: I1203 06:55:30.087300 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:55:30 crc kubenswrapper[4947]: I1203 06:55:30.088009 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:55:30 crc kubenswrapper[4947]: I1203 06:55:30.088073 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:55:30 crc kubenswrapper[4947]: I1203 06:55:30.088927 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f17900e42556091147e4ba38edd46b7b9702c592fdecfb16bbf2c0cd119c24e"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:55:30 crc kubenswrapper[4947]: I1203 06:55:30.089040 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://5f17900e42556091147e4ba38edd46b7b9702c592fdecfb16bbf2c0cd119c24e" gracePeriod=600 Dec 03 06:55:30 crc kubenswrapper[4947]: I1203 06:55:30.609155 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="5f17900e42556091147e4ba38edd46b7b9702c592fdecfb16bbf2c0cd119c24e" exitCode=0 Dec 03 06:55:30 crc kubenswrapper[4947]: I1203 06:55:30.609283 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"5f17900e42556091147e4ba38edd46b7b9702c592fdecfb16bbf2c0cd119c24e"} Dec 03 06:55:30 crc kubenswrapper[4947]: I1203 06:55:30.609644 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"b7e10701f788e3907b17dd8d9a143170799e7cae0c946a386313adec1343a6dd"} Dec 03 06:55:30 crc kubenswrapper[4947]: I1203 06:55:30.609677 4947 scope.go:117] "RemoveContainer" containerID="276d3c5c26aa5e94e3bf56cff764b5b6768bce54060a242f9f2d3ad7d8e66c7e" Dec 03 06:55:39 crc kubenswrapper[4947]: I1203 06:55:39.704656 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" podUID="9b82a6de-d75a-462f-9b68-105a28a52e28" containerName="registry" containerID="cri-o://6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa" gracePeriod=30 Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.313125 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.476093 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-certificates\") pod \"9b82a6de-d75a-462f-9b68-105a28a52e28\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.476254 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-trusted-ca\") pod \"9b82a6de-d75a-462f-9b68-105a28a52e28\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.476298 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxww8\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-kube-api-access-xxww8\") pod \"9b82a6de-d75a-462f-9b68-105a28a52e28\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.476350 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-bound-sa-token\") pod \"9b82a6de-d75a-462f-9b68-105a28a52e28\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.476459 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b82a6de-d75a-462f-9b68-105a28a52e28-installation-pull-secrets\") pod \"9b82a6de-d75a-462f-9b68-105a28a52e28\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.476572 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b82a6de-d75a-462f-9b68-105a28a52e28-ca-trust-extracted\") pod \"9b82a6de-d75a-462f-9b68-105a28a52e28\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.476604 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-tls\") pod \"9b82a6de-d75a-462f-9b68-105a28a52e28\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.476832 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9b82a6de-d75a-462f-9b68-105a28a52e28\" (UID: \"9b82a6de-d75a-462f-9b68-105a28a52e28\") " Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.478570 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9b82a6de-d75a-462f-9b68-105a28a52e28" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.478724 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9b82a6de-d75a-462f-9b68-105a28a52e28" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.486786 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9b82a6de-d75a-462f-9b68-105a28a52e28" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.487001 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9b82a6de-d75a-462f-9b68-105a28a52e28" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.497245 4947 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.497312 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b82a6de-d75a-462f-9b68-105a28a52e28-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.501389 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-kube-api-access-xxww8" (OuterVolumeSpecName: "kube-api-access-xxww8") pod "9b82a6de-d75a-462f-9b68-105a28a52e28" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28"). InnerVolumeSpecName "kube-api-access-xxww8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.502653 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b82a6de-d75a-462f-9b68-105a28a52e28-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9b82a6de-d75a-462f-9b68-105a28a52e28" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.504545 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9b82a6de-d75a-462f-9b68-105a28a52e28" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.518828 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b82a6de-d75a-462f-9b68-105a28a52e28-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9b82a6de-d75a-462f-9b68-105a28a52e28" (UID: "9b82a6de-d75a-462f-9b68-105a28a52e28"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.598789 4947 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b82a6de-d75a-462f-9b68-105a28a52e28-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.598838 4947 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.598857 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxww8\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-kube-api-access-xxww8\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.598877 4947 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b82a6de-d75a-462f-9b68-105a28a52e28-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.598894 4947 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b82a6de-d75a-462f-9b68-105a28a52e28-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.689462 4947 generic.go:334] "Generic (PLEG): container finished" podID="9b82a6de-d75a-462f-9b68-105a28a52e28" containerID="6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa" exitCode=0 Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.689620 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.689585 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" event={"ID":"9b82a6de-d75a-462f-9b68-105a28a52e28","Type":"ContainerDied","Data":"6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa"} Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.689747 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mh92n" event={"ID":"9b82a6de-d75a-462f-9b68-105a28a52e28","Type":"ContainerDied","Data":"90575b3c1727a2dac265e2518074a65abe9fe7f16bfb7412d9d7db255b939bb9"} Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.689817 4947 scope.go:117] "RemoveContainer" containerID="6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.718754 4947 scope.go:117] "RemoveContainer" containerID="6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa" Dec 03 06:55:40 crc kubenswrapper[4947]: E1203 06:55:40.720782 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa\": container with ID starting with 6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa not found: ID does not exist" containerID="6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.720833 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa"} err="failed to get container status \"6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa\": rpc error: code = NotFound desc = could not find container 
\"6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa\": container with ID starting with 6f4f3de5653b29224d272d228083f2e24f104b08673df553368ad648dc539daa not found: ID does not exist" Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.739434 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mh92n"] Dec 03 06:55:40 crc kubenswrapper[4947]: I1203 06:55:40.746633 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mh92n"] Dec 03 06:55:41 crc kubenswrapper[4947]: I1203 06:55:41.106478 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b82a6de-d75a-462f-9b68-105a28a52e28" path="/var/lib/kubelet/pods/9b82a6de-d75a-462f-9b68-105a28a52e28/volumes" Dec 03 06:57:30 crc kubenswrapper[4947]: I1203 06:57:30.085886 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:57:30 crc kubenswrapper[4947]: I1203 06:57:30.087424 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:58:00 crc kubenswrapper[4947]: I1203 06:58:00.086320 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:58:00 crc kubenswrapper[4947]: I1203 06:58:00.086983 4947 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:58:30 crc kubenswrapper[4947]: I1203 06:58:30.086772 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:58:30 crc kubenswrapper[4947]: I1203 06:58:30.087389 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:58:30 crc kubenswrapper[4947]: I1203 06:58:30.087453 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 06:58:30 crc kubenswrapper[4947]: I1203 06:58:30.088300 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7e10701f788e3907b17dd8d9a143170799e7cae0c946a386313adec1343a6dd"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:58:30 crc kubenswrapper[4947]: I1203 06:58:30.088395 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" 
containerID="cri-o://b7e10701f788e3907b17dd8d9a143170799e7cae0c946a386313adec1343a6dd" gracePeriod=600 Dec 03 06:58:30 crc kubenswrapper[4947]: I1203 06:58:30.836192 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="b7e10701f788e3907b17dd8d9a143170799e7cae0c946a386313adec1343a6dd" exitCode=0 Dec 03 06:58:30 crc kubenswrapper[4947]: I1203 06:58:30.836271 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"b7e10701f788e3907b17dd8d9a143170799e7cae0c946a386313adec1343a6dd"} Dec 03 06:58:30 crc kubenswrapper[4947]: I1203 06:58:30.836633 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"5422fc464d66825e41f13e058e49aa601ef8b656e45633815d2d71ca8056f807"} Dec 03 06:58:30 crc kubenswrapper[4947]: I1203 06:58:30.836664 4947 scope.go:117] "RemoveContainer" containerID="5f17900e42556091147e4ba38edd46b7b9702c592fdecfb16bbf2c0cd119c24e" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.181833 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44"] Dec 03 07:00:00 crc kubenswrapper[4947]: E1203 07:00:00.183029 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b82a6de-d75a-462f-9b68-105a28a52e28" containerName="registry" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.183048 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b82a6de-d75a-462f-9b68-105a28a52e28" containerName="registry" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.183158 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b82a6de-d75a-462f-9b68-105a28a52e28" containerName="registry" Dec 03 07:00:00 
crc kubenswrapper[4947]: I1203 07:00:00.184379 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.187392 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.187860 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.192477 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44"] Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.286770 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwjb9\" (UniqueName: \"kubernetes.io/projected/42c748f8-81ed-4cac-ad68-33b0a1d7218d-kube-api-access-lwjb9\") pod \"collect-profiles-29412420-lxv44\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.286886 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42c748f8-81ed-4cac-ad68-33b0a1d7218d-config-volume\") pod \"collect-profiles-29412420-lxv44\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.287000 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42c748f8-81ed-4cac-ad68-33b0a1d7218d-secret-volume\") pod \"collect-profiles-29412420-lxv44\" 
(UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.388609 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwjb9\" (UniqueName: \"kubernetes.io/projected/42c748f8-81ed-4cac-ad68-33b0a1d7218d-kube-api-access-lwjb9\") pod \"collect-profiles-29412420-lxv44\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.388662 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42c748f8-81ed-4cac-ad68-33b0a1d7218d-config-volume\") pod \"collect-profiles-29412420-lxv44\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.388715 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42c748f8-81ed-4cac-ad68-33b0a1d7218d-secret-volume\") pod \"collect-profiles-29412420-lxv44\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.390239 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42c748f8-81ed-4cac-ad68-33b0a1d7218d-config-volume\") pod \"collect-profiles-29412420-lxv44\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.404819 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/42c748f8-81ed-4cac-ad68-33b0a1d7218d-secret-volume\") pod \"collect-profiles-29412420-lxv44\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.411742 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwjb9\" (UniqueName: \"kubernetes.io/projected/42c748f8-81ed-4cac-ad68-33b0a1d7218d-kube-api-access-lwjb9\") pod \"collect-profiles-29412420-lxv44\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.503111 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:00 crc kubenswrapper[4947]: I1203 07:00:00.763674 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44"] Dec 03 07:00:01 crc kubenswrapper[4947]: I1203 07:00:01.454477 4947 generic.go:334] "Generic (PLEG): container finished" podID="42c748f8-81ed-4cac-ad68-33b0a1d7218d" containerID="10cf80903e41763a6942116d3c10552ad3a1f46e555a333db975487d6c490fbb" exitCode=0 Dec 03 07:00:01 crc kubenswrapper[4947]: I1203 07:00:01.455524 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" event={"ID":"42c748f8-81ed-4cac-ad68-33b0a1d7218d","Type":"ContainerDied","Data":"10cf80903e41763a6942116d3c10552ad3a1f46e555a333db975487d6c490fbb"} Dec 03 07:00:01 crc kubenswrapper[4947]: I1203 07:00:01.456139 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" 
event={"ID":"42c748f8-81ed-4cac-ad68-33b0a1d7218d","Type":"ContainerStarted","Data":"8c0add53467a9b0d598034be57a1fbf87109b5b8d940d32cd0995bfa7da9010f"} Dec 03 07:00:02 crc kubenswrapper[4947]: I1203 07:00:02.789285 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:02 crc kubenswrapper[4947]: I1203 07:00:02.926519 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwjb9\" (UniqueName: \"kubernetes.io/projected/42c748f8-81ed-4cac-ad68-33b0a1d7218d-kube-api-access-lwjb9\") pod \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " Dec 03 07:00:02 crc kubenswrapper[4947]: I1203 07:00:02.926701 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42c748f8-81ed-4cac-ad68-33b0a1d7218d-config-volume\") pod \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " Dec 03 07:00:02 crc kubenswrapper[4947]: I1203 07:00:02.926755 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42c748f8-81ed-4cac-ad68-33b0a1d7218d-secret-volume\") pod \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\" (UID: \"42c748f8-81ed-4cac-ad68-33b0a1d7218d\") " Dec 03 07:00:02 crc kubenswrapper[4947]: I1203 07:00:02.928135 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42c748f8-81ed-4cac-ad68-33b0a1d7218d-config-volume" (OuterVolumeSpecName: "config-volume") pod "42c748f8-81ed-4cac-ad68-33b0a1d7218d" (UID: "42c748f8-81ed-4cac-ad68-33b0a1d7218d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:02 crc kubenswrapper[4947]: I1203 07:00:02.934778 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c748f8-81ed-4cac-ad68-33b0a1d7218d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "42c748f8-81ed-4cac-ad68-33b0a1d7218d" (UID: "42c748f8-81ed-4cac-ad68-33b0a1d7218d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:02 crc kubenswrapper[4947]: I1203 07:00:02.934935 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c748f8-81ed-4cac-ad68-33b0a1d7218d-kube-api-access-lwjb9" (OuterVolumeSpecName: "kube-api-access-lwjb9") pod "42c748f8-81ed-4cac-ad68-33b0a1d7218d" (UID: "42c748f8-81ed-4cac-ad68-33b0a1d7218d"). InnerVolumeSpecName "kube-api-access-lwjb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:03 crc kubenswrapper[4947]: I1203 07:00:03.028097 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42c748f8-81ed-4cac-ad68-33b0a1d7218d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:03 crc kubenswrapper[4947]: I1203 07:00:03.028149 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/42c748f8-81ed-4cac-ad68-33b0a1d7218d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:03 crc kubenswrapper[4947]: I1203 07:00:03.028168 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwjb9\" (UniqueName: \"kubernetes.io/projected/42c748f8-81ed-4cac-ad68-33b0a1d7218d-kube-api-access-lwjb9\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:03 crc kubenswrapper[4947]: I1203 07:00:03.473919 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" 
event={"ID":"42c748f8-81ed-4cac-ad68-33b0a1d7218d","Type":"ContainerDied","Data":"8c0add53467a9b0d598034be57a1fbf87109b5b8d940d32cd0995bfa7da9010f"} Dec 03 07:00:03 crc kubenswrapper[4947]: I1203 07:00:03.473984 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c0add53467a9b0d598034be57a1fbf87109b5b8d940d32cd0995bfa7da9010f" Dec 03 07:00:03 crc kubenswrapper[4947]: I1203 07:00:03.474017 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44" Dec 03 07:00:30 crc kubenswrapper[4947]: I1203 07:00:30.086661 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:00:30 crc kubenswrapper[4947]: I1203 07:00:30.087317 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.754222 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pt9n6"] Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.764265 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovn-controller" containerID="cri-o://4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a" gracePeriod=30 Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.764329 4947 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="nbdb" containerID="cri-o://3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea" gracePeriod=30 Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.764515 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="northd" containerID="cri-o://bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18" gracePeriod=30 Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.764525 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="sbdb" containerID="cri-o://dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825" gracePeriod=30 Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.764632 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovn-acl-logging" containerID="cri-o://2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996" gracePeriod=30 Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.764620 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kube-rbac-proxy-node" containerID="cri-o://af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1" gracePeriod=30 Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.764563 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16" gracePeriod=30 Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.797109 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" containerID="cri-o://a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d" gracePeriod=30 Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.866898 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/2.log" Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.867575 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/1.log" Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.867686 4947 generic.go:334] "Generic (PLEG): container finished" podID="1c90ac94-365a-4c82-b72a-41129d95a39e" containerID="f383d35aa20681f32ce2b9b1f63026fc2f6fc5da5a6cf993c059b3e42727cb59" exitCode=2 Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.867768 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tnc" event={"ID":"1c90ac94-365a-4c82-b72a-41129d95a39e","Type":"ContainerDied","Data":"f383d35aa20681f32ce2b9b1f63026fc2f6fc5da5a6cf993c059b3e42727cb59"} Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.867845 4947 scope.go:117] "RemoveContainer" containerID="5d1d6a820530cf13f5904d39860daa4f63b1b1037a7f863db881d1fbd9799441" Dec 03 07:00:59 crc kubenswrapper[4947]: I1203 07:00:59.868357 4947 scope.go:117] "RemoveContainer" containerID="f383d35aa20681f32ce2b9b1f63026fc2f6fc5da5a6cf993c059b3e42727cb59" Dec 03 07:00:59 crc kubenswrapper[4947]: E1203 07:00:59.868689 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-97tnc_openshift-multus(1c90ac94-365a-4c82-b72a-41129d95a39e)\"" pod="openshift-multus/multus-97tnc" podUID="1c90ac94-365a-4c82-b72a-41129d95a39e" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.086367 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.086421 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.113337 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/3.log" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.115995 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovn-acl-logging/0.log" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.116387 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovn-controller/0.log" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.117102 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.174950 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vlb9x"] Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175141 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175152 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175161 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="northd" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175166 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="northd" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175177 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="sbdb" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175182 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="sbdb" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175192 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kubecfg-setup" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175197 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kubecfg-setup" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175203 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" 
Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175208 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175214 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175221 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175231 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kube-rbac-proxy-node" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175236 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kube-rbac-proxy-node" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175243 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175248 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175256 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovn-acl-logging" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175262 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovn-acl-logging" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175269 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="nbdb" 
Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175274 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="nbdb" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175282 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c748f8-81ed-4cac-ad68-33b0a1d7218d" containerName="collect-profiles" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175287 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c748f8-81ed-4cac-ad68-33b0a1d7218d" containerName="collect-profiles" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175295 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovn-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175302 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovn-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175310 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175315 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175386 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175393 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175401 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c748f8-81ed-4cac-ad68-33b0a1d7218d" 
containerName="collect-profiles" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175407 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="nbdb" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175415 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175421 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kube-rbac-proxy-node" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175427 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175434 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="sbdb" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175439 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovn-acl-logging" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175446 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175454 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovn-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175462 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="northd" Dec 03 07:01:00 crc kubenswrapper[4947]: E1203 07:01:00.175555 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175561 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.175644 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" containerName="ovnkube-controller" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.176950 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.267560 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-config\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.267816 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-kubelet\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.267843 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-var-lib-openvswitch\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.267862 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqnqr\" (UniqueName: 
\"kubernetes.io/projected/19542618-7a4e-44bc-9297-9931dcc41eea-kube-api-access-fqnqr\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.267879 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-openvswitch\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.267899 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-systemd-units\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.267917 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-netd\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.267954 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-etc-openvswitch\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.267986 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-slash\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 
07:01:00.267999 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-netns\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268018 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-env-overrides\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268032 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-ovn\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268049 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268065 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-systemd\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268085 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-log-socket\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268101 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-ovn-kubernetes\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268118 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-node-log\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268135 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19542618-7a4e-44bc-9297-9931dcc41eea-ovn-node-metrics-cert\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268153 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-bin\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268168 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-script-lib\") pod \"19542618-7a4e-44bc-9297-9931dcc41eea\" (UID: \"19542618-7a4e-44bc-9297-9931dcc41eea\") " Dec 03 07:01:00 crc 
kubenswrapper[4947]: I1203 07:01:00.268279 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-kubelet\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268306 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268327 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c140b03-4a97-4b25-9822-52c35ff4ca86-env-overrides\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268344 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c140b03-4a97-4b25-9822-52c35ff4ca86-ovnkube-script-lib\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268361 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c140b03-4a97-4b25-9822-52c35ff4ca86-ovnkube-config\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268378 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-run-ovn-kubernetes\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268393 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-run-systemd\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268410 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-slash\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268430 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-systemd-units\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268449 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-run-openvswitch\") pod \"ovnkube-node-vlb9x\" (UID: 
\"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268471 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-run-ovn\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268521 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-log-socket\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268537 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6smlz\" (UniqueName: \"kubernetes.io/projected/2c140b03-4a97-4b25-9822-52c35ff4ca86-kube-api-access-6smlz\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268557 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-etc-openvswitch\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268588 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c140b03-4a97-4b25-9822-52c35ff4ca86-ovn-node-metrics-cert\") pod 
\"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268609 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-cni-bin\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268624 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-var-lib-openvswitch\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268648 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-cni-netd\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268669 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-node-log\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.268687 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-run-netns\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.269576 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.269612 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.269630 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270329 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270356 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270380 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-log-socket" (OuterVolumeSpecName: "log-socket") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270399 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270416 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-node-log" (OuterVolumeSpecName: "node-log") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270505 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270526 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270553 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-slash" (OuterVolumeSpecName: "host-slash") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270535 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270580 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270591 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270621 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270793 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.270918 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.275564 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19542618-7a4e-44bc-9297-9931dcc41eea-kube-api-access-fqnqr" (OuterVolumeSpecName: "kube-api-access-fqnqr") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "kube-api-access-fqnqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.278484 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19542618-7a4e-44bc-9297-9931dcc41eea-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.286573 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "19542618-7a4e-44bc-9297-9931dcc41eea" (UID: "19542618-7a4e-44bc-9297-9931dcc41eea"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369654 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-node-log\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369702 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-run-netns\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369719 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-kubelet\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369746 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369764 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c140b03-4a97-4b25-9822-52c35ff4ca86-env-overrides\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369778 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c140b03-4a97-4b25-9822-52c35ff4ca86-ovnkube-script-lib\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369795 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2c140b03-4a97-4b25-9822-52c35ff4ca86-ovnkube-config\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369790 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-node-log\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369847 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-run-systemd\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369810 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-run-systemd\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 
07:01:00.369885 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-kubelet\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369862 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-run-netns\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369905 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-run-ovn-kubernetes\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.369905 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370578 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2c140b03-4a97-4b25-9822-52c35ff4ca86-ovnkube-script-lib\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370615 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-run-ovn-kubernetes\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370661 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-slash\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370688 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-systemd-units\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370707 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-run-openvswitch\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370729 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-run-ovn\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370751 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-log-socket\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370768 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6smlz\" (UniqueName: \"kubernetes.io/projected/2c140b03-4a97-4b25-9822-52c35ff4ca86-kube-api-access-6smlz\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370795 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-etc-openvswitch\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370821 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c140b03-4a97-4b25-9822-52c35ff4ca86-ovn-node-metrics-cert\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370838 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-cni-bin\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370838 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-run-openvswitch\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370857 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-var-lib-openvswitch\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370880 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-var-lib-openvswitch\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370902 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-run-ovn\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370922 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-log-socket\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370959 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-cni-netd\") pod 
\"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371051 4947 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-slash\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371078 4947 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371106 4947 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371133 4947 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371156 4947 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371174 4947 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371177 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/2c140b03-4a97-4b25-9822-52c35ff4ca86-ovnkube-config\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371195 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-etc-openvswitch\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371220 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-cni-netd\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.370791 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2c140b03-4a97-4b25-9822-52c35ff4ca86-env-overrides\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371244 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-slash\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371190 4947 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc 
kubenswrapper[4947]: I1203 07:01:00.371294 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-host-cni-bin\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371298 4947 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371338 4947 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371356 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19542618-7a4e-44bc-9297-9931dcc41eea-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371373 4947 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371392 4947 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371408 4947 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19542618-7a4e-44bc-9297-9931dcc41eea-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 
crc kubenswrapper[4947]: I1203 07:01:00.371425 4947 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371442 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqnqr\" (UniqueName: \"kubernetes.io/projected/19542618-7a4e-44bc-9297-9931dcc41eea-kube-api-access-fqnqr\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371459 4947 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371475 4947 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371520 4947 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371539 4947 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371556 4947 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19542618-7a4e-44bc-9297-9931dcc41eea-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.371262 4947 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2c140b03-4a97-4b25-9822-52c35ff4ca86-systemd-units\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.374252 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2c140b03-4a97-4b25-9822-52c35ff4ca86-ovn-node-metrics-cert\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.389455 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6smlz\" (UniqueName: \"kubernetes.io/projected/2c140b03-4a97-4b25-9822-52c35ff4ca86-kube-api-access-6smlz\") pod \"ovnkube-node-vlb9x\" (UID: \"2c140b03-4a97-4b25-9822-52c35ff4ca86\") " pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.489171 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.882941 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovnkube-controller/3.log" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.888175 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovn-acl-logging/0.log" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.889409 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pt9n6_19542618-7a4e-44bc-9297-9931dcc41eea/ovn-controller/0.log" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890046 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d" exitCode=0 Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890073 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825" exitCode=0 Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890084 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea" exitCode=0 Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890093 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18" exitCode=0 Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890102 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" 
containerID="e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16" exitCode=0 Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890111 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1" exitCode=0 Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890119 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996" exitCode=143 Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890128 4947 generic.go:334] "Generic (PLEG): container finished" podID="19542618-7a4e-44bc-9297-9931dcc41eea" containerID="4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a" exitCode=143 Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890177 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890205 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890220 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890232 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" 
event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890246 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890258 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890271 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890283 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890290 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890298 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890306 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890307 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890323 4947 scope.go:117] "RemoveContainer" containerID="a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890313 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890468 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890478 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890500 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890513 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890527 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890534 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890540 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890547 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890553 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890560 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890566 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890573 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890580 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890586 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890596 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890607 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890615 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890622 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890629 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890636 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890643 4947 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890650 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890657 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890664 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890670 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890679 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pt9n6" event={"ID":"19542618-7a4e-44bc-9297-9931dcc41eea","Type":"ContainerDied","Data":"dabca9ad190807bc920c390111b266c4671349adb1266225a129315e8ef3bc5c"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890691 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890700 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} Dec 03 
07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890706 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890713 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890719 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890726 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890732 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890739 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890745 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.890751 4947 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab"} Dec 03 
07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.894030 4947 generic.go:334] "Generic (PLEG): container finished" podID="2c140b03-4a97-4b25-9822-52c35ff4ca86" containerID="eb859cd08751b053b43ab1541be740762ed28a8450b36d182125da5823a4c3fb" exitCode=0 Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.894095 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerDied","Data":"eb859cd08751b053b43ab1541be740762ed28a8450b36d182125da5823a4c3fb"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.894118 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerStarted","Data":"d60b917d2b7b9adc69239078856753d02ec29b72fe0612f93c7f55f484330eea"} Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.898474 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/2.log" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.955251 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.978282 4947 scope.go:117] "RemoveContainer" containerID="dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825" Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.992428 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pt9n6"] Dec 03 07:01:00 crc kubenswrapper[4947]: I1203 07:01:00.999402 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-pt9n6"] Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.009574 4947 scope.go:117] "RemoveContainer" containerID="3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea" Dec 03 07:01:01 crc 
kubenswrapper[4947]: I1203 07:01:01.027386 4947 scope.go:117] "RemoveContainer" containerID="bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.045262 4947 scope.go:117] "RemoveContainer" containerID="e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.062687 4947 scope.go:117] "RemoveContainer" containerID="af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.079857 4947 scope.go:117] "RemoveContainer" containerID="2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.093899 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19542618-7a4e-44bc-9297-9931dcc41eea" path="/var/lib/kubelet/pods/19542618-7a4e-44bc-9297-9931dcc41eea/volumes" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.111418 4947 scope.go:117] "RemoveContainer" containerID="4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.170212 4947 scope.go:117] "RemoveContainer" containerID="1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.200383 4947 scope.go:117] "RemoveContainer" containerID="a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d" Dec 03 07:01:01 crc kubenswrapper[4947]: E1203 07:01:01.201110 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": container with ID starting with a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d not found: ID does not exist" containerID="a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 
07:01:01.201165 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} err="failed to get container status \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": rpc error: code = NotFound desc = could not find container \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": container with ID starting with a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.201204 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 07:01:01 crc kubenswrapper[4947]: E1203 07:01:01.201798 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\": container with ID starting with 2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6 not found: ID does not exist" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.201834 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} err="failed to get container status \"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\": rpc error: code = NotFound desc = could not find container \"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\": container with ID starting with 2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.201858 4947 scope.go:117] "RemoveContainer" containerID="dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825" Dec 03 07:01:01 crc 
kubenswrapper[4947]: E1203 07:01:01.202246 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\": container with ID starting with dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825 not found: ID does not exist" containerID="dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.202283 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} err="failed to get container status \"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\": rpc error: code = NotFound desc = could not find container \"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\": container with ID starting with dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.202302 4947 scope.go:117] "RemoveContainer" containerID="3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea" Dec 03 07:01:01 crc kubenswrapper[4947]: E1203 07:01:01.202714 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\": container with ID starting with 3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea not found: ID does not exist" containerID="3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.202749 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} err="failed to get container status 
\"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\": rpc error: code = NotFound desc = could not find container \"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\": container with ID starting with 3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.202773 4947 scope.go:117] "RemoveContainer" containerID="bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18" Dec 03 07:01:01 crc kubenswrapper[4947]: E1203 07:01:01.203140 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\": container with ID starting with bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18 not found: ID does not exist" containerID="bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.203169 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} err="failed to get container status \"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\": rpc error: code = NotFound desc = could not find container \"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\": container with ID starting with bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.203189 4947 scope.go:117] "RemoveContainer" containerID="e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16" Dec 03 07:01:01 crc kubenswrapper[4947]: E1203 07:01:01.203634 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\": container with ID starting with e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16 not found: ID does not exist" containerID="e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.203661 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} err="failed to get container status \"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\": rpc error: code = NotFound desc = could not find container \"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\": container with ID starting with e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.203680 4947 scope.go:117] "RemoveContainer" containerID="af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1" Dec 03 07:01:01 crc kubenswrapper[4947]: E1203 07:01:01.204152 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\": container with ID starting with af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1 not found: ID does not exist" containerID="af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.204190 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} err="failed to get container status \"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\": rpc error: code = NotFound desc = could not find container \"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\": container with ID 
starting with af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.204214 4947 scope.go:117] "RemoveContainer" containerID="2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996" Dec 03 07:01:01 crc kubenswrapper[4947]: E1203 07:01:01.204821 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\": container with ID starting with 2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996 not found: ID does not exist" containerID="2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.204926 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} err="failed to get container status \"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\": rpc error: code = NotFound desc = could not find container \"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\": container with ID starting with 2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.204954 4947 scope.go:117] "RemoveContainer" containerID="4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a" Dec 03 07:01:01 crc kubenswrapper[4947]: E1203 07:01:01.205521 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\": container with ID starting with 4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a not found: ID does not exist" containerID="4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a" Dec 03 
07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.205584 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} err="failed to get container status \"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\": rpc error: code = NotFound desc = could not find container \"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\": container with ID starting with 4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.205609 4947 scope.go:117] "RemoveContainer" containerID="1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab" Dec 03 07:01:01 crc kubenswrapper[4947]: E1203 07:01:01.206083 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\": container with ID starting with 1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab not found: ID does not exist" containerID="1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.206122 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab"} err="failed to get container status \"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\": rpc error: code = NotFound desc = could not find container \"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\": container with ID starting with 1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.206149 4947 scope.go:117] "RemoveContainer" 
containerID="a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.206782 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} err="failed to get container status \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": rpc error: code = NotFound desc = could not find container \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": container with ID starting with a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.206809 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.207286 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} err="failed to get container status \"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\": rpc error: code = NotFound desc = could not find container \"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\": container with ID starting with 2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.207318 4947 scope.go:117] "RemoveContainer" containerID="dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.207923 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} err="failed to get container status \"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\": rpc error: code = NotFound desc = could 
not find container \"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\": container with ID starting with dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.207950 4947 scope.go:117] "RemoveContainer" containerID="3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.208343 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} err="failed to get container status \"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\": rpc error: code = NotFound desc = could not find container \"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\": container with ID starting with 3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.208378 4947 scope.go:117] "RemoveContainer" containerID="bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.208737 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} err="failed to get container status \"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\": rpc error: code = NotFound desc = could not find container \"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\": container with ID starting with bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.208763 4947 scope.go:117] "RemoveContainer" containerID="e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 
07:01:01.209252 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} err="failed to get container status \"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\": rpc error: code = NotFound desc = could not find container \"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\": container with ID starting with e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.209300 4947 scope.go:117] "RemoveContainer" containerID="af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.209790 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} err="failed to get container status \"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\": rpc error: code = NotFound desc = could not find container \"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\": container with ID starting with af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.209834 4947 scope.go:117] "RemoveContainer" containerID="2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.210386 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} err="failed to get container status \"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\": rpc error: code = NotFound desc = could not find container \"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\": container with ID starting with 
2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.210446 4947 scope.go:117] "RemoveContainer" containerID="4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.210856 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} err="failed to get container status \"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\": rpc error: code = NotFound desc = could not find container \"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\": container with ID starting with 4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.210895 4947 scope.go:117] "RemoveContainer" containerID="1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.211379 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab"} err="failed to get container status \"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\": rpc error: code = NotFound desc = could not find container \"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\": container with ID starting with 1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.211414 4947 scope.go:117] "RemoveContainer" containerID="a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.211839 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} err="failed to get container status \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": rpc error: code = NotFound desc = could not find container \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": container with ID starting with a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.211874 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.212303 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} err="failed to get container status \"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\": rpc error: code = NotFound desc = could not find container \"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\": container with ID starting with 2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.212335 4947 scope.go:117] "RemoveContainer" containerID="dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.212685 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} err="failed to get container status \"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\": rpc error: code = NotFound desc = could not find container \"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\": container with ID starting with dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825 not found: ID does not 
exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.212719 4947 scope.go:117] "RemoveContainer" containerID="3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.213097 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} err="failed to get container status \"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\": rpc error: code = NotFound desc = could not find container \"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\": container with ID starting with 3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.213130 4947 scope.go:117] "RemoveContainer" containerID="bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.213549 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} err="failed to get container status \"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\": rpc error: code = NotFound desc = could not find container \"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\": container with ID starting with bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.213582 4947 scope.go:117] "RemoveContainer" containerID="e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.214087 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} err="failed to get container status 
\"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\": rpc error: code = NotFound desc = could not find container \"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\": container with ID starting with e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.214119 4947 scope.go:117] "RemoveContainer" containerID="af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.214405 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} err="failed to get container status \"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\": rpc error: code = NotFound desc = could not find container \"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\": container with ID starting with af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.214432 4947 scope.go:117] "RemoveContainer" containerID="2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.214888 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} err="failed to get container status \"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\": rpc error: code = NotFound desc = could not find container \"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\": container with ID starting with 2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.214919 4947 scope.go:117] "RemoveContainer" 
containerID="4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.215314 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} err="failed to get container status \"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\": rpc error: code = NotFound desc = could not find container \"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\": container with ID starting with 4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.215344 4947 scope.go:117] "RemoveContainer" containerID="1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.215696 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab"} err="failed to get container status \"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\": rpc error: code = NotFound desc = could not find container \"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\": container with ID starting with 1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.215730 4947 scope.go:117] "RemoveContainer" containerID="a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.216134 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} err="failed to get container status \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": rpc error: code = NotFound desc = could 
not find container \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": container with ID starting with a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.216172 4947 scope.go:117] "RemoveContainer" containerID="2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.216517 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6"} err="failed to get container status \"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\": rpc error: code = NotFound desc = could not find container \"2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6\": container with ID starting with 2df74884a31bab7f500126a44c874d4ca39ce597b41fb518e93d1726e886d7d6 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.216542 4947 scope.go:117] "RemoveContainer" containerID="dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.216913 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825"} err="failed to get container status \"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\": rpc error: code = NotFound desc = could not find container \"dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825\": container with ID starting with dce097b9dfa3ab74d1705f74aeb8d927588cca770cb32b22e690ac7cc31f1825 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.216940 4947 scope.go:117] "RemoveContainer" containerID="3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 
07:01:01.218110 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea"} err="failed to get container status \"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\": rpc error: code = NotFound desc = could not find container \"3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea\": container with ID starting with 3d8b4b3ada0d6262828ebf46c13d0db06a9dfc235613aebffadd33ba357c77ea not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.218134 4947 scope.go:117] "RemoveContainer" containerID="bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.218732 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18"} err="failed to get container status \"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\": rpc error: code = NotFound desc = could not find container \"bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18\": container with ID starting with bd3901ca728fda778f28299964641e3e108b2ccfb484fd6c59a80745396fff18 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.218754 4947 scope.go:117] "RemoveContainer" containerID="e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.219327 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16"} err="failed to get container status \"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\": rpc error: code = NotFound desc = could not find container \"e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16\": container with ID starting with 
e213049f65cdfc80b027c2e610cb99b40c0e53ad2e4341e3158895617d7efb16 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.219359 4947 scope.go:117] "RemoveContainer" containerID="af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.219907 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1"} err="failed to get container status \"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\": rpc error: code = NotFound desc = could not find container \"af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1\": container with ID starting with af0c1fad8c588b043c55feaa1ff6714ece9b591bec1d8ae2739d44275b322fd1 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.219930 4947 scope.go:117] "RemoveContainer" containerID="2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.220328 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996"} err="failed to get container status \"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\": rpc error: code = NotFound desc = could not find container \"2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996\": container with ID starting with 2f1664e0b1904b2757c677ba100cf217bdda529d65e1a3c4d25f58cba74f4996 not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.220354 4947 scope.go:117] "RemoveContainer" containerID="4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.220747 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a"} err="failed to get container status \"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\": rpc error: code = NotFound desc = could not find container \"4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a\": container with ID starting with 4143950146dbcea333cc8db9f2459524767354d1b33f7009c0eab5bf129b738a not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.220773 4947 scope.go:117] "RemoveContainer" containerID="1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.221178 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab"} err="failed to get container status \"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\": rpc error: code = NotFound desc = could not find container \"1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab\": container with ID starting with 1d45b3e55b5826b695b1ab2312815ec4506a8485594d55feb03d9cd81d67e9ab not found: ID does not exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.221203 4947 scope.go:117] "RemoveContainer" containerID="a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.221696 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d"} err="failed to get container status \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": rpc error: code = NotFound desc = could not find container \"a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d\": container with ID starting with a1010b10ed188b86cfd359f0b51a0cd204b6876393c09cd80e8b087d45fa5b0d not found: ID does not 
exist" Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.915887 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerStarted","Data":"ca5c3b611a469ec1a7c47845fa271986bc7a291f6064ff2836be0aa846a32083"} Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.915945 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerStarted","Data":"f5dd631bc3b02beec9584f5f8a7609f11895c547e2ebf43c10b332466e946a87"} Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.915969 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerStarted","Data":"1301fc28c23b7d86d082762f1d35f17454a267cee80410eef31c2bf6a1845a9b"} Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.915986 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerStarted","Data":"0c68a4eb6ab8124a5ee06e2fc718c63db3440a04d52f86e7bcf2fbb1358bdc6b"} Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.916003 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerStarted","Data":"c06755857b7eae844e3e711a4e9277b0739f32159426c1d2dfea1c853ea42649"} Dec 03 07:01:01 crc kubenswrapper[4947]: I1203 07:01:01.916020 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerStarted","Data":"03c8697746ff8eced6d08e05c495c4f3c32822b24c9b541b954b06e61812dcaf"} Dec 03 07:01:04 crc kubenswrapper[4947]: I1203 07:01:04.945882 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerStarted","Data":"8c891de029d18fdf6332e2143444792badabd604195bbcd00fb55f2c8e25b0de"} Dec 03 07:01:06 crc kubenswrapper[4947]: I1203 07:01:06.968409 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" event={"ID":"2c140b03-4a97-4b25-9822-52c35ff4ca86","Type":"ContainerStarted","Data":"f52c8629a6a3642589920764e6ab879e27b9e8993d93a421dc53f428bb9ba557"} Dec 03 07:01:06 crc kubenswrapper[4947]: I1203 07:01:06.969152 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:06 crc kubenswrapper[4947]: I1203 07:01:06.969179 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:07 crc kubenswrapper[4947]: I1203 07:01:07.005717 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" podStartSLOduration=7.005696931 podStartE2EDuration="7.005696931s" podCreationTimestamp="2025-12-03 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:07.003229445 +0000 UTC m=+728.264183911" watchObservedRunningTime="2025-12-03 07:01:07.005696931 +0000 UTC m=+728.266651377" Dec 03 07:01:07 crc kubenswrapper[4947]: I1203 07:01:07.007436 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:07 crc kubenswrapper[4947]: I1203 07:01:07.974839 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:08 crc kubenswrapper[4947]: I1203 07:01:08.016547 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.319778 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-npzts"] Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.321461 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.324056 4947 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-26pm6" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.324851 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.325127 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.325227 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.340927 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-npzts"] Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.345895 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f803f9ab-0e25-403d-9a18-07a7ed80f160-crc-storage\") pod \"crc-storage-crc-npzts\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.346007 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nqs7\" (UniqueName: \"kubernetes.io/projected/f803f9ab-0e25-403d-9a18-07a7ed80f160-kube-api-access-6nqs7\") pod \"crc-storage-crc-npzts\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " 
pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.346142 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f803f9ab-0e25-403d-9a18-07a7ed80f160-node-mnt\") pod \"crc-storage-crc-npzts\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.447037 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nqs7\" (UniqueName: \"kubernetes.io/projected/f803f9ab-0e25-403d-9a18-07a7ed80f160-kube-api-access-6nqs7\") pod \"crc-storage-crc-npzts\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.447521 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f803f9ab-0e25-403d-9a18-07a7ed80f160-node-mnt\") pod \"crc-storage-crc-npzts\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.447691 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f803f9ab-0e25-403d-9a18-07a7ed80f160-crc-storage\") pod \"crc-storage-crc-npzts\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.448051 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f803f9ab-0e25-403d-9a18-07a7ed80f160-node-mnt\") pod \"crc-storage-crc-npzts\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.449087 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f803f9ab-0e25-403d-9a18-07a7ed80f160-crc-storage\") pod \"crc-storage-crc-npzts\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.482027 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nqs7\" (UniqueName: \"kubernetes.io/projected/f803f9ab-0e25-403d-9a18-07a7ed80f160-kube-api-access-6nqs7\") pod \"crc-storage-crc-npzts\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: I1203 07:01:11.657640 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: E1203 07:01:11.704550 4947 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-npzts_crc-storage_f803f9ab-0e25-403d-9a18-07a7ed80f160_0(47c271b3d07fa06a9d438331c347475e179713b0f4c38351d29bdfcd52ab0dbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 07:01:11 crc kubenswrapper[4947]: E1203 07:01:11.704653 4947 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-npzts_crc-storage_f803f9ab-0e25-403d-9a18-07a7ed80f160_0(47c271b3d07fa06a9d438331c347475e179713b0f4c38351d29bdfcd52ab0dbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: E1203 07:01:11.704692 4947 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-npzts_crc-storage_f803f9ab-0e25-403d-9a18-07a7ed80f160_0(47c271b3d07fa06a9d438331c347475e179713b0f4c38351d29bdfcd52ab0dbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:11 crc kubenswrapper[4947]: E1203 07:01:11.704770 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-npzts_crc-storage(f803f9ab-0e25-403d-9a18-07a7ed80f160)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-npzts_crc-storage(f803f9ab-0e25-403d-9a18-07a7ed80f160)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-npzts_crc-storage_f803f9ab-0e25-403d-9a18-07a7ed80f160_0(47c271b3d07fa06a9d438331c347475e179713b0f4c38351d29bdfcd52ab0dbd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-npzts" podUID="f803f9ab-0e25-403d-9a18-07a7ed80f160" Dec 03 07:01:12 crc kubenswrapper[4947]: I1203 07:01:12.001712 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:12 crc kubenswrapper[4947]: I1203 07:01:12.002344 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:12 crc kubenswrapper[4947]: E1203 07:01:12.045771 4947 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-npzts_crc-storage_f803f9ab-0e25-403d-9a18-07a7ed80f160_0(1d058f15cb7d0e0ae257bb8e1755c57287f0cd5553f5a07cb26113755abb838f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 07:01:12 crc kubenswrapper[4947]: E1203 07:01:12.045895 4947 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-npzts_crc-storage_f803f9ab-0e25-403d-9a18-07a7ed80f160_0(1d058f15cb7d0e0ae257bb8e1755c57287f0cd5553f5a07cb26113755abb838f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:12 crc kubenswrapper[4947]: E1203 07:01:12.045933 4947 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-npzts_crc-storage_f803f9ab-0e25-403d-9a18-07a7ed80f160_0(1d058f15cb7d0e0ae257bb8e1755c57287f0cd5553f5a07cb26113755abb838f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:12 crc kubenswrapper[4947]: E1203 07:01:12.046063 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-npzts_crc-storage(f803f9ab-0e25-403d-9a18-07a7ed80f160)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-npzts_crc-storage(f803f9ab-0e25-403d-9a18-07a7ed80f160)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-npzts_crc-storage_f803f9ab-0e25-403d-9a18-07a7ed80f160_0(1d058f15cb7d0e0ae257bb8e1755c57287f0cd5553f5a07cb26113755abb838f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-npzts" podUID="f803f9ab-0e25-403d-9a18-07a7ed80f160" Dec 03 07:01:15 crc kubenswrapper[4947]: I1203 07:01:15.083246 4947 scope.go:117] "RemoveContainer" containerID="f383d35aa20681f32ce2b9b1f63026fc2f6fc5da5a6cf993c059b3e42727cb59" Dec 03 07:01:16 crc kubenswrapper[4947]: I1203 07:01:16.025605 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-97tnc_1c90ac94-365a-4c82-b72a-41129d95a39e/kube-multus/2.log" Dec 03 07:01:16 crc kubenswrapper[4947]: I1203 07:01:16.025661 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-97tnc" event={"ID":"1c90ac94-365a-4c82-b72a-41129d95a39e","Type":"ContainerStarted","Data":"5da36185a1242fbc9af4917fba172a7318b187396803cd9b4487acaffec4256f"} Dec 03 07:01:23 crc kubenswrapper[4947]: I1203 07:01:23.082249 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:23 crc kubenswrapper[4947]: I1203 07:01:23.083208 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:23 crc kubenswrapper[4947]: I1203 07:01:23.366265 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-npzts"] Dec 03 07:01:23 crc kubenswrapper[4947]: I1203 07:01:23.380109 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:01:24 crc kubenswrapper[4947]: I1203 07:01:24.081672 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-npzts" event={"ID":"f803f9ab-0e25-403d-9a18-07a7ed80f160","Type":"ContainerStarted","Data":"d8ce51a54e9bef6f5e3a0154d55d3144c07305b46b55f3d77ae8093ffdcd0a7d"} Dec 03 07:01:25 crc kubenswrapper[4947]: I1203 07:01:25.090540 4947 generic.go:334] "Generic (PLEG): container finished" podID="f803f9ab-0e25-403d-9a18-07a7ed80f160" containerID="412c634436939ef4a8a4636d15751704c5c6f71a3607e5980bb469d235937256" exitCode=0 Dec 03 07:01:25 crc kubenswrapper[4947]: I1203 07:01:25.096929 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-npzts" event={"ID":"f803f9ab-0e25-403d-9a18-07a7ed80f160","Type":"ContainerDied","Data":"412c634436939ef4a8a4636d15751704c5c6f71a3607e5980bb469d235937256"} Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.387204 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.467040 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f803f9ab-0e25-403d-9a18-07a7ed80f160-node-mnt\") pod \"f803f9ab-0e25-403d-9a18-07a7ed80f160\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.467111 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nqs7\" (UniqueName: \"kubernetes.io/projected/f803f9ab-0e25-403d-9a18-07a7ed80f160-kube-api-access-6nqs7\") pod \"f803f9ab-0e25-403d-9a18-07a7ed80f160\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.467154 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f803f9ab-0e25-403d-9a18-07a7ed80f160-crc-storage\") pod \"f803f9ab-0e25-403d-9a18-07a7ed80f160\" (UID: \"f803f9ab-0e25-403d-9a18-07a7ed80f160\") " Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.467637 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f803f9ab-0e25-403d-9a18-07a7ed80f160-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f803f9ab-0e25-403d-9a18-07a7ed80f160" (UID: "f803f9ab-0e25-403d-9a18-07a7ed80f160"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.471637 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f803f9ab-0e25-403d-9a18-07a7ed80f160-kube-api-access-6nqs7" (OuterVolumeSpecName: "kube-api-access-6nqs7") pod "f803f9ab-0e25-403d-9a18-07a7ed80f160" (UID: "f803f9ab-0e25-403d-9a18-07a7ed80f160"). InnerVolumeSpecName "kube-api-access-6nqs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.479294 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f803f9ab-0e25-403d-9a18-07a7ed80f160-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f803f9ab-0e25-403d-9a18-07a7ed80f160" (UID: "f803f9ab-0e25-403d-9a18-07a7ed80f160"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.568175 4947 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f803f9ab-0e25-403d-9a18-07a7ed80f160-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.568247 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nqs7\" (UniqueName: \"kubernetes.io/projected/f803f9ab-0e25-403d-9a18-07a7ed80f160-kube-api-access-6nqs7\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:26 crc kubenswrapper[4947]: I1203 07:01:26.568262 4947 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f803f9ab-0e25-403d-9a18-07a7ed80f160-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:27 crc kubenswrapper[4947]: I1203 07:01:27.104998 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-npzts" event={"ID":"f803f9ab-0e25-403d-9a18-07a7ed80f160","Type":"ContainerDied","Data":"d8ce51a54e9bef6f5e3a0154d55d3144c07305b46b55f3d77ae8093ffdcd0a7d"} Dec 03 07:01:27 crc kubenswrapper[4947]: I1203 07:01:27.105033 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-npzts" Dec 03 07:01:27 crc kubenswrapper[4947]: I1203 07:01:27.105059 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8ce51a54e9bef6f5e3a0154d55d3144c07305b46b55f3d77ae8093ffdcd0a7d" Dec 03 07:01:27 crc kubenswrapper[4947]: I1203 07:01:27.819404 4947 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 07:01:30 crc kubenswrapper[4947]: I1203 07:01:30.087050 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:01:30 crc kubenswrapper[4947]: I1203 07:01:30.087135 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:01:30 crc kubenswrapper[4947]: I1203 07:01:30.087194 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:01:30 crc kubenswrapper[4947]: I1203 07:01:30.087938 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5422fc464d66825e41f13e058e49aa601ef8b656e45633815d2d71ca8056f807"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:01:30 crc kubenswrapper[4947]: I1203 07:01:30.088011 4947 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://5422fc464d66825e41f13e058e49aa601ef8b656e45633815d2d71ca8056f807" gracePeriod=600 Dec 03 07:01:30 crc kubenswrapper[4947]: I1203 07:01:30.521501 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vlb9x" Dec 03 07:01:31 crc kubenswrapper[4947]: I1203 07:01:31.125078 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="5422fc464d66825e41f13e058e49aa601ef8b656e45633815d2d71ca8056f807" exitCode=0 Dec 03 07:01:31 crc kubenswrapper[4947]: I1203 07:01:31.125124 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"5422fc464d66825e41f13e058e49aa601ef8b656e45633815d2d71ca8056f807"} Dec 03 07:01:31 crc kubenswrapper[4947]: I1203 07:01:31.126273 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"b942c1f8be6b00a1ec761600a3a1ec1dc91971dc8d9f088d0a1649ce3f26d690"} Dec 03 07:01:31 crc kubenswrapper[4947]: I1203 07:01:31.126322 4947 scope.go:117] "RemoveContainer" containerID="b7e10701f788e3907b17dd8d9a143170799e7cae0c946a386313adec1343a6dd" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.577671 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76"] Dec 03 07:01:34 crc kubenswrapper[4947]: E1203 07:01:34.578065 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f803f9ab-0e25-403d-9a18-07a7ed80f160" containerName="storage" Dec 03 07:01:34 crc 
kubenswrapper[4947]: I1203 07:01:34.578077 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f803f9ab-0e25-403d-9a18-07a7ed80f160" containerName="storage" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.578162 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f803f9ab-0e25-403d-9a18-07a7ed80f160" containerName="storage" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.578812 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.580353 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.588960 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76"] Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.770241 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgd76\" (UniqueName: \"kubernetes.io/projected/50f2761d-ae32-4e29-9404-a1d5eb5141ef-kube-api-access-jgd76\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.770312 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:34 crc kubenswrapper[4947]: 
I1203 07:01:34.770362 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.871840 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgd76\" (UniqueName: \"kubernetes.io/projected/50f2761d-ae32-4e29-9404-a1d5eb5141ef-kube-api-access-jgd76\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.871999 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.872064 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.873710 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.873877 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:34 crc kubenswrapper[4947]: I1203 07:01:34.901551 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgd76\" (UniqueName: \"kubernetes.io/projected/50f2761d-ae32-4e29-9404-a1d5eb5141ef-kube-api-access-jgd76\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:35 crc kubenswrapper[4947]: I1203 07:01:35.194358 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:35 crc kubenswrapper[4947]: I1203 07:01:35.441316 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76"] Dec 03 07:01:35 crc kubenswrapper[4947]: W1203 07:01:35.452013 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50f2761d_ae32_4e29_9404_a1d5eb5141ef.slice/crio-213fa132c3febe778747447bdd001541b9e79b95def534dd52c57e4371f47d41 WatchSource:0}: Error finding container 213fa132c3febe778747447bdd001541b9e79b95def534dd52c57e4371f47d41: Status 404 returned error can't find the container with id 213fa132c3febe778747447bdd001541b9e79b95def534dd52c57e4371f47d41 Dec 03 07:01:36 crc kubenswrapper[4947]: I1203 07:01:36.161116 4947 generic.go:334] "Generic (PLEG): container finished" podID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" containerID="410d77b1c98b71e9accce1ef94ea70200103d771f517a662b01fd052a8e0b10d" exitCode=0 Dec 03 07:01:36 crc kubenswrapper[4947]: I1203 07:01:36.161166 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" event={"ID":"50f2761d-ae32-4e29-9404-a1d5eb5141ef","Type":"ContainerDied","Data":"410d77b1c98b71e9accce1ef94ea70200103d771f517a662b01fd052a8e0b10d"} Dec 03 07:01:36 crc kubenswrapper[4947]: I1203 07:01:36.161198 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" event={"ID":"50f2761d-ae32-4e29-9404-a1d5eb5141ef","Type":"ContainerStarted","Data":"213fa132c3febe778747447bdd001541b9e79b95def534dd52c57e4371f47d41"} Dec 03 07:01:36 crc kubenswrapper[4947]: I1203 07:01:36.733014 4947 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-trprb"] Dec 03 07:01:36 crc kubenswrapper[4947]: I1203 07:01:36.737030 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:36 crc kubenswrapper[4947]: I1203 07:01:36.737961 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-trprb"] Dec 03 07:01:36 crc kubenswrapper[4947]: I1203 07:01:36.898017 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-catalog-content\") pod \"redhat-operators-trprb\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:36 crc kubenswrapper[4947]: I1203 07:01:36.898182 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx49l\" (UniqueName: \"kubernetes.io/projected/d6083fcf-6624-417b-9be2-93559bed3c31-kube-api-access-fx49l\") pod \"redhat-operators-trprb\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:36 crc kubenswrapper[4947]: I1203 07:01:36.898232 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-utilities\") pod \"redhat-operators-trprb\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:37 crc kubenswrapper[4947]: I1203 07:01:37.000119 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-utilities\") pod \"redhat-operators-trprb\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " 
pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:37 crc kubenswrapper[4947]: I1203 07:01:37.000218 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-catalog-content\") pod \"redhat-operators-trprb\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:37 crc kubenswrapper[4947]: I1203 07:01:37.000290 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx49l\" (UniqueName: \"kubernetes.io/projected/d6083fcf-6624-417b-9be2-93559bed3c31-kube-api-access-fx49l\") pod \"redhat-operators-trprb\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:37 crc kubenswrapper[4947]: I1203 07:01:37.000828 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-utilities\") pod \"redhat-operators-trprb\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:37 crc kubenswrapper[4947]: I1203 07:01:37.000854 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-catalog-content\") pod \"redhat-operators-trprb\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:37 crc kubenswrapper[4947]: I1203 07:01:37.036897 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx49l\" (UniqueName: \"kubernetes.io/projected/d6083fcf-6624-417b-9be2-93559bed3c31-kube-api-access-fx49l\") pod \"redhat-operators-trprb\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " pod="openshift-marketplace/redhat-operators-trprb" Dec 
03 07:01:37 crc kubenswrapper[4947]: I1203 07:01:37.061423 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:37 crc kubenswrapper[4947]: I1203 07:01:37.281753 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-trprb"] Dec 03 07:01:37 crc kubenswrapper[4947]: W1203 07:01:37.286189 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6083fcf_6624_417b_9be2_93559bed3c31.slice/crio-6a7222b9106170259b0235be9d7409dc637ad5a20d64f6c6c59cd275c10da21f WatchSource:0}: Error finding container 6a7222b9106170259b0235be9d7409dc637ad5a20d64f6c6c59cd275c10da21f: Status 404 returned error can't find the container with id 6a7222b9106170259b0235be9d7409dc637ad5a20d64f6c6c59cd275c10da21f Dec 03 07:01:38 crc kubenswrapper[4947]: I1203 07:01:38.174342 4947 generic.go:334] "Generic (PLEG): container finished" podID="d6083fcf-6624-417b-9be2-93559bed3c31" containerID="d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e" exitCode=0 Dec 03 07:01:38 crc kubenswrapper[4947]: I1203 07:01:38.174404 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trprb" event={"ID":"d6083fcf-6624-417b-9be2-93559bed3c31","Type":"ContainerDied","Data":"d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e"} Dec 03 07:01:38 crc kubenswrapper[4947]: I1203 07:01:38.175328 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trprb" event={"ID":"d6083fcf-6624-417b-9be2-93559bed3c31","Type":"ContainerStarted","Data":"6a7222b9106170259b0235be9d7409dc637ad5a20d64f6c6c59cd275c10da21f"} Dec 03 07:01:38 crc kubenswrapper[4947]: I1203 07:01:38.181287 4947 generic.go:334] "Generic (PLEG): container finished" podID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" 
containerID="0a1ef963ca6bfbe74ebb7a79fc8a587eb40bd96fcf0959db318ca866bdc1592e" exitCode=0 Dec 03 07:01:38 crc kubenswrapper[4947]: I1203 07:01:38.181335 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" event={"ID":"50f2761d-ae32-4e29-9404-a1d5eb5141ef","Type":"ContainerDied","Data":"0a1ef963ca6bfbe74ebb7a79fc8a587eb40bd96fcf0959db318ca866bdc1592e"} Dec 03 07:01:39 crc kubenswrapper[4947]: I1203 07:01:39.188803 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trprb" event={"ID":"d6083fcf-6624-417b-9be2-93559bed3c31","Type":"ContainerStarted","Data":"6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671"} Dec 03 07:01:39 crc kubenswrapper[4947]: I1203 07:01:39.193859 4947 generic.go:334] "Generic (PLEG): container finished" podID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" containerID="034d772be25ad35b29ae11f6765157c71d819f9608db2d926d015e20ffebdfe9" exitCode=0 Dec 03 07:01:39 crc kubenswrapper[4947]: I1203 07:01:39.193904 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" event={"ID":"50f2761d-ae32-4e29-9404-a1d5eb5141ef","Type":"ContainerDied","Data":"034d772be25ad35b29ae11f6765157c71d819f9608db2d926d015e20ffebdfe9"} Dec 03 07:01:40 crc kubenswrapper[4947]: I1203 07:01:40.588174 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:40 crc kubenswrapper[4947]: I1203 07:01:40.768271 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgd76\" (UniqueName: \"kubernetes.io/projected/50f2761d-ae32-4e29-9404-a1d5eb5141ef-kube-api-access-jgd76\") pod \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " Dec 03 07:01:40 crc kubenswrapper[4947]: I1203 07:01:40.768345 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-bundle\") pod \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " Dec 03 07:01:40 crc kubenswrapper[4947]: I1203 07:01:40.768367 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-util\") pod \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\" (UID: \"50f2761d-ae32-4e29-9404-a1d5eb5141ef\") " Dec 03 07:01:40 crc kubenswrapper[4947]: I1203 07:01:40.769274 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-bundle" (OuterVolumeSpecName: "bundle") pod "50f2761d-ae32-4e29-9404-a1d5eb5141ef" (UID: "50f2761d-ae32-4e29-9404-a1d5eb5141ef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:40 crc kubenswrapper[4947]: I1203 07:01:40.777023 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f2761d-ae32-4e29-9404-a1d5eb5141ef-kube-api-access-jgd76" (OuterVolumeSpecName: "kube-api-access-jgd76") pod "50f2761d-ae32-4e29-9404-a1d5eb5141ef" (UID: "50f2761d-ae32-4e29-9404-a1d5eb5141ef"). InnerVolumeSpecName "kube-api-access-jgd76". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:40 crc kubenswrapper[4947]: I1203 07:01:40.869576 4947 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:40 crc kubenswrapper[4947]: I1203 07:01:40.869877 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgd76\" (UniqueName: \"kubernetes.io/projected/50f2761d-ae32-4e29-9404-a1d5eb5141ef-kube-api-access-jgd76\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:41 crc kubenswrapper[4947]: I1203 07:01:41.071709 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-util" (OuterVolumeSpecName: "util") pod "50f2761d-ae32-4e29-9404-a1d5eb5141ef" (UID: "50f2761d-ae32-4e29-9404-a1d5eb5141ef"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:41 crc kubenswrapper[4947]: I1203 07:01:41.072081 4947 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/50f2761d-ae32-4e29-9404-a1d5eb5141ef-util\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:41 crc kubenswrapper[4947]: I1203 07:01:41.209705 4947 generic.go:334] "Generic (PLEG): container finished" podID="d6083fcf-6624-417b-9be2-93559bed3c31" containerID="6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671" exitCode=0 Dec 03 07:01:41 crc kubenswrapper[4947]: I1203 07:01:41.209793 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trprb" event={"ID":"d6083fcf-6624-417b-9be2-93559bed3c31","Type":"ContainerDied","Data":"6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671"} Dec 03 07:01:41 crc kubenswrapper[4947]: I1203 07:01:41.216162 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" event={"ID":"50f2761d-ae32-4e29-9404-a1d5eb5141ef","Type":"ContainerDied","Data":"213fa132c3febe778747447bdd001541b9e79b95def534dd52c57e4371f47d41"} Dec 03 07:01:41 crc kubenswrapper[4947]: I1203 07:01:41.216208 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="213fa132c3febe778747447bdd001541b9e79b95def534dd52c57e4371f47d41" Dec 03 07:01:41 crc kubenswrapper[4947]: I1203 07:01:41.216268 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76" Dec 03 07:01:42 crc kubenswrapper[4947]: I1203 07:01:42.225003 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trprb" event={"ID":"d6083fcf-6624-417b-9be2-93559bed3c31","Type":"ContainerStarted","Data":"ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128"} Dec 03 07:01:42 crc kubenswrapper[4947]: I1203 07:01:42.247234 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-trprb" podStartSLOduration=2.5233320519999998 podStartE2EDuration="6.247213112s" podCreationTimestamp="2025-12-03 07:01:36 +0000 UTC" firstStartedPulling="2025-12-03 07:01:38.176280464 +0000 UTC m=+759.437234930" lastFinishedPulling="2025-12-03 07:01:41.900161554 +0000 UTC m=+763.161115990" observedRunningTime="2025-12-03 07:01:42.245723292 +0000 UTC m=+763.506677748" watchObservedRunningTime="2025-12-03 07:01:42.247213112 +0000 UTC m=+763.508167538" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.250427 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2"] Dec 03 07:01:45 crc kubenswrapper[4947]: E1203 07:01:45.250995 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" 
containerName="extract" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.251009 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" containerName="extract" Dec 03 07:01:45 crc kubenswrapper[4947]: E1203 07:01:45.251025 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" containerName="pull" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.251033 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" containerName="pull" Dec 03 07:01:45 crc kubenswrapper[4947]: E1203 07:01:45.251048 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" containerName="util" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.251055 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" containerName="util" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.251180 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f2761d-ae32-4e29-9404-a1d5eb5141ef" containerName="extract" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.251671 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.253166 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-hcclz" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.254416 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.254912 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.261767 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2"] Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.432331 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd7xl\" (UniqueName: \"kubernetes.io/projected/81eb265d-a63b-40bb-be3e-9d7ed1012a86-kube-api-access-xd7xl\") pod \"nmstate-operator-5b5b58f5c8-nvjl2\" (UID: \"81eb265d-a63b-40bb-be3e-9d7ed1012a86\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.533339 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd7xl\" (UniqueName: \"kubernetes.io/projected/81eb265d-a63b-40bb-be3e-9d7ed1012a86-kube-api-access-xd7xl\") pod \"nmstate-operator-5b5b58f5c8-nvjl2\" (UID: \"81eb265d-a63b-40bb-be3e-9d7ed1012a86\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.555362 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd7xl\" (UniqueName: \"kubernetes.io/projected/81eb265d-a63b-40bb-be3e-9d7ed1012a86-kube-api-access-xd7xl\") pod \"nmstate-operator-5b5b58f5c8-nvjl2\" (UID: 
\"81eb265d-a63b-40bb-be3e-9d7ed1012a86\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.567257 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2" Dec 03 07:01:45 crc kubenswrapper[4947]: I1203 07:01:45.808743 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2"] Dec 03 07:01:45 crc kubenswrapper[4947]: W1203 07:01:45.814741 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81eb265d_a63b_40bb_be3e_9d7ed1012a86.slice/crio-f2c0adae5ac80a265ba439625ebca7597d443d18c0afb6f6b6c3181158fd8434 WatchSource:0}: Error finding container f2c0adae5ac80a265ba439625ebca7597d443d18c0afb6f6b6c3181158fd8434: Status 404 returned error can't find the container with id f2c0adae5ac80a265ba439625ebca7597d443d18c0afb6f6b6c3181158fd8434 Dec 03 07:01:46 crc kubenswrapper[4947]: I1203 07:01:46.250375 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2" event={"ID":"81eb265d-a63b-40bb-be3e-9d7ed1012a86","Type":"ContainerStarted","Data":"f2c0adae5ac80a265ba439625ebca7597d443d18c0afb6f6b6c3181158fd8434"} Dec 03 07:01:47 crc kubenswrapper[4947]: I1203 07:01:47.061752 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:47 crc kubenswrapper[4947]: I1203 07:01:47.061963 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:48 crc kubenswrapper[4947]: I1203 07:01:48.117539 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-trprb" podUID="d6083fcf-6624-417b-9be2-93559bed3c31" containerName="registry-server" probeResult="failure" output=< Dec 
03 07:01:48 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 07:01:48 crc kubenswrapper[4947]: > Dec 03 07:01:49 crc kubenswrapper[4947]: I1203 07:01:49.276388 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2" event={"ID":"81eb265d-a63b-40bb-be3e-9d7ed1012a86","Type":"ContainerStarted","Data":"d8ea15b1115f3fe82d6ac708502051365fb55654e6f42927e2391558a8062bae"} Dec 03 07:01:49 crc kubenswrapper[4947]: I1203 07:01:49.307807 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-nvjl2" podStartSLOduration=1.9598635340000001 podStartE2EDuration="4.307774438s" podCreationTimestamp="2025-12-03 07:01:45 +0000 UTC" firstStartedPulling="2025-12-03 07:01:45.816418859 +0000 UTC m=+767.077373285" lastFinishedPulling="2025-12-03 07:01:48.164329763 +0000 UTC m=+769.425284189" observedRunningTime="2025-12-03 07:01:49.300028138 +0000 UTC m=+770.560982624" watchObservedRunningTime="2025-12-03 07:01:49.307774438 +0000 UTC m=+770.568728904" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.117050 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8smps"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.118053 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8smps" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.123210 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4svzs" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.129554 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.130642 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:01:55 crc kubenswrapper[4947]: W1203 07:01:55.133543 4947 reflector.go:561] object-"openshift-nmstate"/"openshift-nmstate-webhook": failed to list *v1.Secret: secrets "openshift-nmstate-webhook" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Dec 03 07:01:55 crc kubenswrapper[4947]: E1203 07:01:55.133588 4947 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"openshift-nmstate-webhook\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-nmstate-webhook\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.142547 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-gnfbv"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.143557 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.146068 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8smps"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.165432 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.244856 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.245442 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.247185 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.247432 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4pqtq" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.252428 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.255865 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.263840 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ddd645aa-f1f2-4657-acde-beac87387ecc-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-vbwhn\" (UID: \"ddd645aa-f1f2-4657-acde-beac87387ecc\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.263882 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m7h7\" (UniqueName: \"kubernetes.io/projected/ddd645aa-f1f2-4657-acde-beac87387ecc-kube-api-access-4m7h7\") pod \"nmstate-webhook-5f6d4c5ccb-vbwhn\" (UID: \"ddd645aa-f1f2-4657-acde-beac87387ecc\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.263912 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knssz\" (UniqueName: \"kubernetes.io/projected/acc10835-1647-4727-9bda-15a99886aec1-kube-api-access-knssz\") pod \"nmstate-metrics-7f946cbc9-8smps\" (UID: 
\"acc10835-1647-4727-9bda-15a99886aec1\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8smps" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.263939 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9h2h\" (UniqueName: \"kubernetes.io/projected/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-kube-api-access-k9h2h\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.263972 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-nmstate-lock\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.264042 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-ovs-socket\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.264072 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-dbus-socket\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.364979 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b4zk\" (UniqueName: \"kubernetes.io/projected/05b1f704-be0b-42bd-bee0-533712c2fa3b-kube-api-access-4b4zk\") pod 
\"nmstate-console-plugin-7fbb5f6569-w2fjr\" (UID: \"05b1f704-be0b-42bd-bee0-533712c2fa3b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365083 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-ovs-socket\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365135 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-dbus-socket\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365201 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/05b1f704-be0b-42bd-bee0-533712c2fa3b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-w2fjr\" (UID: \"05b1f704-be0b-42bd-bee0-533712c2fa3b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365214 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-ovs-socket\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365239 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ddd645aa-f1f2-4657-acde-beac87387ecc-tls-key-pair\") pod 
\"nmstate-webhook-5f6d4c5ccb-vbwhn\" (UID: \"ddd645aa-f1f2-4657-acde-beac87387ecc\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365300 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m7h7\" (UniqueName: \"kubernetes.io/projected/ddd645aa-f1f2-4657-acde-beac87387ecc-kube-api-access-4m7h7\") pod \"nmstate-webhook-5f6d4c5ccb-vbwhn\" (UID: \"ddd645aa-f1f2-4657-acde-beac87387ecc\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365332 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knssz\" (UniqueName: \"kubernetes.io/projected/acc10835-1647-4727-9bda-15a99886aec1-kube-api-access-knssz\") pod \"nmstate-metrics-7f946cbc9-8smps\" (UID: \"acc10835-1647-4727-9bda-15a99886aec1\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8smps" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365358 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9h2h\" (UniqueName: \"kubernetes.io/projected/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-kube-api-access-k9h2h\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365406 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-nmstate-lock\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365469 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/05b1f704-be0b-42bd-bee0-533712c2fa3b-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-w2fjr\" (UID: \"05b1f704-be0b-42bd-bee0-533712c2fa3b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365656 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-nmstate-lock\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.365686 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-dbus-socket\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.398862 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m7h7\" (UniqueName: \"kubernetes.io/projected/ddd645aa-f1f2-4657-acde-beac87387ecc-kube-api-access-4m7h7\") pod \"nmstate-webhook-5f6d4c5ccb-vbwhn\" (UID: \"ddd645aa-f1f2-4657-acde-beac87387ecc\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.399287 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9h2h\" (UniqueName: \"kubernetes.io/projected/4eaa6896-fdad-4d14-bca7-a155e8ddfa63-kube-api-access-k9h2h\") pod \"nmstate-handler-gnfbv\" (UID: \"4eaa6896-fdad-4d14-bca7-a155e8ddfa63\") " pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.399817 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knssz\" (UniqueName: 
\"kubernetes.io/projected/acc10835-1647-4727-9bda-15a99886aec1-kube-api-access-knssz\") pod \"nmstate-metrics-7f946cbc9-8smps\" (UID: \"acc10835-1647-4727-9bda-15a99886aec1\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8smps" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.426697 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cd8b9f767-f7qlc"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.427617 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.435584 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8smps" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.475140 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/05b1f704-be0b-42bd-bee0-533712c2fa3b-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-w2fjr\" (UID: \"05b1f704-be0b-42bd-bee0-533712c2fa3b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.475330 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b4zk\" (UniqueName: \"kubernetes.io/projected/05b1f704-be0b-42bd-bee0-533712c2fa3b-kube-api-access-4b4zk\") pod \"nmstate-console-plugin-7fbb5f6569-w2fjr\" (UID: \"05b1f704-be0b-42bd-bee0-533712c2fa3b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.475440 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/05b1f704-be0b-42bd-bee0-533712c2fa3b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-w2fjr\" (UID: \"05b1f704-be0b-42bd-bee0-533712c2fa3b\") " 
pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.476634 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/05b1f704-be0b-42bd-bee0-533712c2fa3b-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-w2fjr\" (UID: \"05b1f704-be0b-42bd-bee0-533712c2fa3b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.477940 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.479752 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/05b1f704-be0b-42bd-bee0-533712c2fa3b-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-w2fjr\" (UID: \"05b1f704-be0b-42bd-bee0-533712c2fa3b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.489895 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd8b9f767-f7qlc"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.498263 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b4zk\" (UniqueName: \"kubernetes.io/projected/05b1f704-be0b-42bd-bee0-533712c2fa3b-kube-api-access-4b4zk\") pod \"nmstate-console-plugin-7fbb5f6569-w2fjr\" (UID: \"05b1f704-be0b-42bd-bee0-533712c2fa3b\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.563509 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.576333 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-oauth-serving-cert\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.576384 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9672d14b-2b38-41ab-8a1a-f785cc81dab9-console-serving-cert\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.576416 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-console-config\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.576579 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9672d14b-2b38-41ab-8a1a-f785cc81dab9-console-oauth-config\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.576666 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-service-ca\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.576682 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-trusted-ca-bundle\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.576816 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psk2d\" (UniqueName: \"kubernetes.io/projected/9672d14b-2b38-41ab-8a1a-f785cc81dab9-kube-api-access-psk2d\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.677649 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-trusted-ca-bundle\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.677708 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psk2d\" (UniqueName: \"kubernetes.io/projected/9672d14b-2b38-41ab-8a1a-f785cc81dab9-kube-api-access-psk2d\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.677760 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-oauth-serving-cert\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.677796 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9672d14b-2b38-41ab-8a1a-f785cc81dab9-console-serving-cert\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.677835 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-console-config\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.677862 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9672d14b-2b38-41ab-8a1a-f785cc81dab9-console-oauth-config\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.677896 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-service-ca\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.679048 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-service-ca\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.679266 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-oauth-serving-cert\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.680468 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-trusted-ca-bundle\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.680894 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9672d14b-2b38-41ab-8a1a-f785cc81dab9-console-config\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.682162 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9672d14b-2b38-41ab-8a1a-f785cc81dab9-console-serving-cert\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.685298 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9672d14b-2b38-41ab-8a1a-f785cc81dab9-console-oauth-config\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.698970 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psk2d\" (UniqueName: \"kubernetes.io/projected/9672d14b-2b38-41ab-8a1a-f785cc81dab9-kube-api-access-psk2d\") pod \"console-7cd8b9f767-f7qlc\" (UID: \"9672d14b-2b38-41ab-8a1a-f785cc81dab9\") " pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.745754 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.757117 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-8smps"] Dec 03 07:01:55 crc kubenswrapper[4947]: W1203 07:01:55.763197 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc10835_1647_4727_9bda_15a99886aec1.slice/crio-d8ef68539c826b4ea4e1f6a88ef72f2c958ece5f280db4e4aef2a59bc61576e0 WatchSource:0}: Error finding container d8ef68539c826b4ea4e1f6a88ef72f2c958ece5f280db4e4aef2a59bc61576e0: Status 404 returned error can't find the container with id d8ef68539c826b4ea4e1f6a88ef72f2c958ece5f280db4e4aef2a59bc61576e0 Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.830601 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr"] Dec 03 07:01:55 crc kubenswrapper[4947]: I1203 07:01:55.955461 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cd8b9f767-f7qlc"] Dec 03 07:01:55 crc kubenswrapper[4947]: W1203 07:01:55.965077 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9672d14b_2b38_41ab_8a1a_f785cc81dab9.slice/crio-324ebada11c2ef6cbaf395568d633d8fd08ebc2e77bfd36064f4918144cfa722 WatchSource:0}: Error finding container 324ebada11c2ef6cbaf395568d633d8fd08ebc2e77bfd36064f4918144cfa722: Status 404 returned error can't find the container with id 324ebada11c2ef6cbaf395568d633d8fd08ebc2e77bfd36064f4918144cfa722 Dec 03 07:01:56 crc kubenswrapper[4947]: I1203 07:01:56.075448 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 07:01:56 crc kubenswrapper[4947]: I1203 07:01:56.089519 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ddd645aa-f1f2-4657-acde-beac87387ecc-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-vbwhn\" (UID: \"ddd645aa-f1f2-4657-acde-beac87387ecc\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:01:56 crc kubenswrapper[4947]: I1203 07:01:56.321721 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8smps" event={"ID":"acc10835-1647-4727-9bda-15a99886aec1","Type":"ContainerStarted","Data":"d8ef68539c826b4ea4e1f6a88ef72f2c958ece5f280db4e4aef2a59bc61576e0"} Dec 03 07:01:56 crc kubenswrapper[4947]: I1203 07:01:56.322508 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" event={"ID":"05b1f704-be0b-42bd-bee0-533712c2fa3b","Type":"ContainerStarted","Data":"5edf0383081a40901fe339b8801fe2d6c1af51f238b5f257e86c7f0da6ceaae6"} Dec 03 07:01:56 crc kubenswrapper[4947]: I1203 07:01:56.323516 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd8b9f767-f7qlc" event={"ID":"9672d14b-2b38-41ab-8a1a-f785cc81dab9","Type":"ContainerStarted","Data":"aeee8b478c56304741dde6345645905fa16589171acc8de25fb27f873a5ca6cf"} Dec 03 07:01:56 crc kubenswrapper[4947]: 
I1203 07:01:56.323561 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cd8b9f767-f7qlc" event={"ID":"9672d14b-2b38-41ab-8a1a-f785cc81dab9","Type":"ContainerStarted","Data":"324ebada11c2ef6cbaf395568d633d8fd08ebc2e77bfd36064f4918144cfa722"} Dec 03 07:01:56 crc kubenswrapper[4947]: I1203 07:01:56.324167 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gnfbv" event={"ID":"4eaa6896-fdad-4d14-bca7-a155e8ddfa63","Type":"ContainerStarted","Data":"53abe6e2e169c8c56ee7cf245fc1dbf4dce8047525c34ca8ddd441ee74d1ee78"} Dec 03 07:01:56 crc kubenswrapper[4947]: I1203 07:01:56.346546 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cd8b9f767-f7qlc" podStartSLOduration=1.346522437 podStartE2EDuration="1.346522437s" podCreationTimestamp="2025-12-03 07:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:01:56.34107111 +0000 UTC m=+777.602025536" watchObservedRunningTime="2025-12-03 07:01:56.346522437 +0000 UTC m=+777.607476903" Dec 03 07:01:56 crc kubenswrapper[4947]: I1203 07:01:56.372164 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:01:56 crc kubenswrapper[4947]: I1203 07:01:56.971442 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn"] Dec 03 07:01:56 crc kubenswrapper[4947]: W1203 07:01:56.979980 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddd645aa_f1f2_4657_acde_beac87387ecc.slice/crio-7f1323562e08794751e87697460243a9682c20e32f68bc53f7817473a38a66dc WatchSource:0}: Error finding container 7f1323562e08794751e87697460243a9682c20e32f68bc53f7817473a38a66dc: Status 404 returned error can't find the container with id 7f1323562e08794751e87697460243a9682c20e32f68bc53f7817473a38a66dc Dec 03 07:01:57 crc kubenswrapper[4947]: I1203 07:01:57.109278 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:57 crc kubenswrapper[4947]: I1203 07:01:57.169257 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:57 crc kubenswrapper[4947]: I1203 07:01:57.350143 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-trprb"] Dec 03 07:01:57 crc kubenswrapper[4947]: I1203 07:01:57.362254 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" event={"ID":"ddd645aa-f1f2-4657-acde-beac87387ecc","Type":"ContainerStarted","Data":"7f1323562e08794751e87697460243a9682c20e32f68bc53f7817473a38a66dc"} Dec 03 07:01:58 crc kubenswrapper[4947]: I1203 07:01:58.367978 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-trprb" podUID="d6083fcf-6624-417b-9be2-93559bed3c31" containerName="registry-server" 
containerID="cri-o://ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128" gracePeriod=2 Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.000913 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.127125 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-utilities\") pod \"d6083fcf-6624-417b-9be2-93559bed3c31\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.127194 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx49l\" (UniqueName: \"kubernetes.io/projected/d6083fcf-6624-417b-9be2-93559bed3c31-kube-api-access-fx49l\") pod \"d6083fcf-6624-417b-9be2-93559bed3c31\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.127229 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-catalog-content\") pod \"d6083fcf-6624-417b-9be2-93559bed3c31\" (UID: \"d6083fcf-6624-417b-9be2-93559bed3c31\") " Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.128665 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-utilities" (OuterVolumeSpecName: "utilities") pod "d6083fcf-6624-417b-9be2-93559bed3c31" (UID: "d6083fcf-6624-417b-9be2-93559bed3c31"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.134095 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6083fcf-6624-417b-9be2-93559bed3c31-kube-api-access-fx49l" (OuterVolumeSpecName: "kube-api-access-fx49l") pod "d6083fcf-6624-417b-9be2-93559bed3c31" (UID: "d6083fcf-6624-417b-9be2-93559bed3c31"). InnerVolumeSpecName "kube-api-access-fx49l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.228245 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6083fcf-6624-417b-9be2-93559bed3c31" (UID: "d6083fcf-6624-417b-9be2-93559bed3c31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.229840 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.229867 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx49l\" (UniqueName: \"kubernetes.io/projected/d6083fcf-6624-417b-9be2-93559bed3c31-kube-api-access-fx49l\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.229881 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6083fcf-6624-417b-9be2-93559bed3c31-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.374134 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" 
event={"ID":"ddd645aa-f1f2-4657-acde-beac87387ecc","Type":"ContainerStarted","Data":"ed73c5b035eda2245399fcd04bece22d710aeb94e99fd46873f1728d1ba3669e"} Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.374212 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.375539 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gnfbv" event={"ID":"4eaa6896-fdad-4d14-bca7-a155e8ddfa63","Type":"ContainerStarted","Data":"4f37428eb12d3eb92f244c201d358c980f1a0d33d18602729a18a10ab4f54abe"} Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.375659 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.378176 4947 generic.go:334] "Generic (PLEG): container finished" podID="d6083fcf-6624-417b-9be2-93559bed3c31" containerID="ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128" exitCode=0 Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.378229 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trprb" event={"ID":"d6083fcf-6624-417b-9be2-93559bed3c31","Type":"ContainerDied","Data":"ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128"} Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.378247 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-trprb" event={"ID":"d6083fcf-6624-417b-9be2-93559bed3c31","Type":"ContainerDied","Data":"6a7222b9106170259b0235be9d7409dc637ad5a20d64f6c6c59cd275c10da21f"} Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.378263 4947 scope.go:117] "RemoveContainer" containerID="ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.378370 4947 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-trprb" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.380611 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8smps" event={"ID":"acc10835-1647-4727-9bda-15a99886aec1","Type":"ContainerStarted","Data":"f8f1caa65c33342b86acbc257e032237be618be9b718a1d8e2bcd10009b7d5ba"} Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.381640 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" event={"ID":"05b1f704-be0b-42bd-bee0-533712c2fa3b","Type":"ContainerStarted","Data":"dc58e84db1e2007c95017c2696008dd62df60a8d5206e3e344a24919c514df22"} Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.394426 4947 scope.go:117] "RemoveContainer" containerID="6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.398854 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" podStartSLOduration=2.519168548 podStartE2EDuration="4.398836196s" podCreationTimestamp="2025-12-03 07:01:55 +0000 UTC" firstStartedPulling="2025-12-03 07:01:56.982918057 +0000 UTC m=+778.243872483" lastFinishedPulling="2025-12-03 07:01:58.862585685 +0000 UTC m=+780.123540131" observedRunningTime="2025-12-03 07:01:59.395082934 +0000 UTC m=+780.656037360" watchObservedRunningTime="2025-12-03 07:01:59.398836196 +0000 UTC m=+780.659790622" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.408313 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-w2fjr" podStartSLOduration=1.465823955 podStartE2EDuration="4.408292761s" podCreationTimestamp="2025-12-03 07:01:55 +0000 UTC" firstStartedPulling="2025-12-03 07:01:55.846228886 +0000 UTC m=+777.107183312" 
lastFinishedPulling="2025-12-03 07:01:58.788697682 +0000 UTC m=+780.049652118" observedRunningTime="2025-12-03 07:01:59.407246132 +0000 UTC m=+780.668200558" watchObservedRunningTime="2025-12-03 07:01:59.408292761 +0000 UTC m=+780.669247187" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.425791 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-trprb"] Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.429659 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-trprb"] Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.436047 4947 scope.go:117] "RemoveContainer" containerID="d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.440409 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-gnfbv" podStartSLOduration=1.09340565 podStartE2EDuration="4.440388626s" podCreationTimestamp="2025-12-03 07:01:55 +0000 UTC" firstStartedPulling="2025-12-03 07:01:55.5011491 +0000 UTC m=+776.762103536" lastFinishedPulling="2025-12-03 07:01:58.848132076 +0000 UTC m=+780.109086512" observedRunningTime="2025-12-03 07:01:59.439142303 +0000 UTC m=+780.700096749" watchObservedRunningTime="2025-12-03 07:01:59.440388626 +0000 UTC m=+780.701343052" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.457975 4947 scope.go:117] "RemoveContainer" containerID="ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128" Dec 03 07:01:59 crc kubenswrapper[4947]: E1203 07:01:59.458258 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128\": container with ID starting with ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128 not found: ID does not exist" 
containerID="ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.458284 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128"} err="failed to get container status \"ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128\": rpc error: code = NotFound desc = could not find container \"ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128\": container with ID starting with ff3ef034b2c07f39e1fe839324bb4b00ccfc28924ef39f5dc7dda5230c22e128 not found: ID does not exist" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.458302 4947 scope.go:117] "RemoveContainer" containerID="6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671" Dec 03 07:01:59 crc kubenswrapper[4947]: E1203 07:01:59.458592 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671\": container with ID starting with 6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671 not found: ID does not exist" containerID="6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.458621 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671"} err="failed to get container status \"6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671\": rpc error: code = NotFound desc = could not find container \"6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671\": container with ID starting with 6d5d7948092da84041e53968d4fe9f8400cd0ebdd276289200b70ade4f08c671 not found: ID does not exist" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.458642 4947 scope.go:117] 
"RemoveContainer" containerID="d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e" Dec 03 07:01:59 crc kubenswrapper[4947]: E1203 07:01:59.459140 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e\": container with ID starting with d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e not found: ID does not exist" containerID="d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e" Dec 03 07:01:59 crc kubenswrapper[4947]: I1203 07:01:59.459155 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e"} err="failed to get container status \"d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e\": rpc error: code = NotFound desc = could not find container \"d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e\": container with ID starting with d075a2438ebb10a272713a16a8e71dcd51b222fca53d0b8c9a8468593f46bd5e not found: ID does not exist" Dec 03 07:02:01 crc kubenswrapper[4947]: I1203 07:02:01.091118 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6083fcf-6624-417b-9be2-93559bed3c31" path="/var/lib/kubelet/pods/d6083fcf-6624-417b-9be2-93559bed3c31/volumes" Dec 03 07:02:02 crc kubenswrapper[4947]: I1203 07:02:02.408214 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8smps" event={"ID":"acc10835-1647-4727-9bda-15a99886aec1","Type":"ContainerStarted","Data":"612c57b20fb818aced9061cb2875d4dfebec697ed7560a7df9374134ede5a121"} Dec 03 07:02:02 crc kubenswrapper[4947]: I1203 07:02:02.433862 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-8smps" podStartSLOduration=1.974386409 podStartE2EDuration="7.433834078s" 
podCreationTimestamp="2025-12-03 07:01:55 +0000 UTC" firstStartedPulling="2025-12-03 07:01:55.765806137 +0000 UTC m=+777.026760563" lastFinishedPulling="2025-12-03 07:02:01.225253806 +0000 UTC m=+782.486208232" observedRunningTime="2025-12-03 07:02:02.428955286 +0000 UTC m=+783.689909752" watchObservedRunningTime="2025-12-03 07:02:02.433834078 +0000 UTC m=+783.694788494" Dec 03 07:02:05 crc kubenswrapper[4947]: I1203 07:02:05.504279 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-gnfbv" Dec 03 07:02:05 crc kubenswrapper[4947]: I1203 07:02:05.745945 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:02:05 crc kubenswrapper[4947]: I1203 07:02:05.745997 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:02:05 crc kubenswrapper[4947]: I1203 07:02:05.751403 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:02:06 crc kubenswrapper[4947]: I1203 07:02:06.431621 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cd8b9f767-f7qlc" Dec 03 07:02:06 crc kubenswrapper[4947]: I1203 07:02:06.515277 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-t2gnl"] Dec 03 07:02:16 crc kubenswrapper[4947]: I1203 07:02:16.380678 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-vbwhn" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.218648 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l"] Dec 03 07:02:29 crc kubenswrapper[4947]: E1203 07:02:29.219509 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d6083fcf-6624-417b-9be2-93559bed3c31" containerName="registry-server" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.219524 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6083fcf-6624-417b-9be2-93559bed3c31" containerName="registry-server" Dec 03 07:02:29 crc kubenswrapper[4947]: E1203 07:02:29.219542 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6083fcf-6624-417b-9be2-93559bed3c31" containerName="extract-content" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.219551 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6083fcf-6624-417b-9be2-93559bed3c31" containerName="extract-content" Dec 03 07:02:29 crc kubenswrapper[4947]: E1203 07:02:29.219575 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6083fcf-6624-417b-9be2-93559bed3c31" containerName="extract-utilities" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.219584 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6083fcf-6624-417b-9be2-93559bed3c31" containerName="extract-utilities" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.219731 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6083fcf-6624-417b-9be2-93559bed3c31" containerName="registry-server" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.220665 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.223964 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.234472 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l"] Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.325309 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.325986 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm9sk\" (UniqueName: \"kubernetes.io/projected/2091a52a-0cd3-4b46-93d8-efacf220dd22-kube-api-access-dm9sk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.326161 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: 
I1203 07:02:29.427030 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.427132 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm9sk\" (UniqueName: \"kubernetes.io/projected/2091a52a-0cd3-4b46-93d8-efacf220dd22-kube-api-access-dm9sk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.427191 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.427709 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.427731 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.448193 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm9sk\" (UniqueName: \"kubernetes.io/projected/2091a52a-0cd3-4b46-93d8-efacf220dd22-kube-api-access-dm9sk\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.546364 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:29 crc kubenswrapper[4947]: I1203 07:02:29.726321 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l"] Dec 03 07:02:29 crc kubenswrapper[4947]: W1203 07:02:29.741695 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2091a52a_0cd3_4b46_93d8_efacf220dd22.slice/crio-44219264a6dac1d46e2c35f3769f9ab0a78836e2dc6b08a4bd7d71cc002e5a78 WatchSource:0}: Error finding container 44219264a6dac1d46e2c35f3769f9ab0a78836e2dc6b08a4bd7d71cc002e5a78: Status 404 returned error can't find the container with id 44219264a6dac1d46e2c35f3769f9ab0a78836e2dc6b08a4bd7d71cc002e5a78 Dec 03 07:02:30 crc kubenswrapper[4947]: I1203 07:02:30.585737 4947 generic.go:334] "Generic (PLEG): container finished" podID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerID="289cf758776a148e2524ffad5f01d5e0911a821afa2d9fb96e2bee9dab79f600" 
exitCode=0 Dec 03 07:02:30 crc kubenswrapper[4947]: I1203 07:02:30.585819 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" event={"ID":"2091a52a-0cd3-4b46-93d8-efacf220dd22","Type":"ContainerDied","Data":"289cf758776a148e2524ffad5f01d5e0911a821afa2d9fb96e2bee9dab79f600"} Dec 03 07:02:30 crc kubenswrapper[4947]: I1203 07:02:30.585871 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" event={"ID":"2091a52a-0cd3-4b46-93d8-efacf220dd22","Type":"ContainerStarted","Data":"44219264a6dac1d46e2c35f3769f9ab0a78836e2dc6b08a4bd7d71cc002e5a78"} Dec 03 07:02:31 crc kubenswrapper[4947]: I1203 07:02:31.589239 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-t2gnl" podUID="a274d4f7-f741-48cc-9c34-8e2805ad66e3" containerName="console" containerID="cri-o://45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603" gracePeriod=15 Dec 03 07:02:31 crc kubenswrapper[4947]: I1203 07:02:31.965022 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-t2gnl_a274d4f7-f741-48cc-9c34-8e2805ad66e3/console/0.log" Dec 03 07:02:31 crc kubenswrapper[4947]: I1203 07:02:31.965577 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.062218 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-config\") pod \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.062261 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mfnd\" (UniqueName: \"kubernetes.io/projected/a274d4f7-f741-48cc-9c34-8e2805ad66e3-kube-api-access-6mfnd\") pod \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.062298 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-oauth-config\") pod \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.062318 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-serving-cert\") pod \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.062374 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-service-ca\") pod \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.062414 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-oauth-serving-cert\") pod \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.062441 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-trusted-ca-bundle\") pod \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\" (UID: \"a274d4f7-f741-48cc-9c34-8e2805ad66e3\") " Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.063297 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a274d4f7-f741-48cc-9c34-8e2805ad66e3" (UID: "a274d4f7-f741-48cc-9c34-8e2805ad66e3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.063312 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a274d4f7-f741-48cc-9c34-8e2805ad66e3" (UID: "a274d4f7-f741-48cc-9c34-8e2805ad66e3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.063360 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "a274d4f7-f741-48cc-9c34-8e2805ad66e3" (UID: "a274d4f7-f741-48cc-9c34-8e2805ad66e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.063358 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-config" (OuterVolumeSpecName: "console-config") pod "a274d4f7-f741-48cc-9c34-8e2805ad66e3" (UID: "a274d4f7-f741-48cc-9c34-8e2805ad66e3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.067642 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a274d4f7-f741-48cc-9c34-8e2805ad66e3" (UID: "a274d4f7-f741-48cc-9c34-8e2805ad66e3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.067716 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a274d4f7-f741-48cc-9c34-8e2805ad66e3-kube-api-access-6mfnd" (OuterVolumeSpecName: "kube-api-access-6mfnd") pod "a274d4f7-f741-48cc-9c34-8e2805ad66e3" (UID: "a274d4f7-f741-48cc-9c34-8e2805ad66e3"). InnerVolumeSpecName "kube-api-access-6mfnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.067750 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a274d4f7-f741-48cc-9c34-8e2805ad66e3" (UID: "a274d4f7-f741-48cc-9c34-8e2805ad66e3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.164222 4947 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.164274 4947 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.164292 4947 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.164301 4947 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.164311 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mfnd\" (UniqueName: \"kubernetes.io/projected/a274d4f7-f741-48cc-9c34-8e2805ad66e3-kube-api-access-6mfnd\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.164320 4947 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.164329 4947 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a274d4f7-f741-48cc-9c34-8e2805ad66e3-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:32 crc 
kubenswrapper[4947]: I1203 07:02:32.616653 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-t2gnl_a274d4f7-f741-48cc-9c34-8e2805ad66e3/console/0.log" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.616715 4947 generic.go:334] "Generic (PLEG): container finished" podID="a274d4f7-f741-48cc-9c34-8e2805ad66e3" containerID="45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603" exitCode=2 Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.616825 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t2gnl" event={"ID":"a274d4f7-f741-48cc-9c34-8e2805ad66e3","Type":"ContainerDied","Data":"45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603"} Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.616852 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-t2gnl" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.616883 4947 scope.go:117] "RemoveContainer" containerID="45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.616866 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t2gnl" event={"ID":"a274d4f7-f741-48cc-9c34-8e2805ad66e3","Type":"ContainerDied","Data":"d0201c7480def2ead3d7aeea472f0d748595675d29208d5792b7d4c2c194ac19"} Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.620160 4947 generic.go:334] "Generic (PLEG): container finished" podID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerID="332cca26715f6d17434a43aceb60f4314fcee53cf969a952fb05dd2a3394a78a" exitCode=0 Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.620187 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" 
event={"ID":"2091a52a-0cd3-4b46-93d8-efacf220dd22","Type":"ContainerDied","Data":"332cca26715f6d17434a43aceb60f4314fcee53cf969a952fb05dd2a3394a78a"} Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.663289 4947 scope.go:117] "RemoveContainer" containerID="45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603" Dec 03 07:02:32 crc kubenswrapper[4947]: E1203 07:02:32.663961 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603\": container with ID starting with 45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603 not found: ID does not exist" containerID="45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.664012 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603"} err="failed to get container status \"45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603\": rpc error: code = NotFound desc = could not find container \"45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603\": container with ID starting with 45a32ada656de5faa8ca2abc0f130d56586a58450ba8e779b9ab6f245c309603 not found: ID does not exist" Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.680690 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-t2gnl"] Dec 03 07:02:32 crc kubenswrapper[4947]: I1203 07:02:32.690588 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-t2gnl"] Dec 03 07:02:33 crc kubenswrapper[4947]: I1203 07:02:33.096044 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a274d4f7-f741-48cc-9c34-8e2805ad66e3" path="/var/lib/kubelet/pods/a274d4f7-f741-48cc-9c34-8e2805ad66e3/volumes" Dec 03 07:02:33 crc 
kubenswrapper[4947]: I1203 07:02:33.632105 4947 generic.go:334] "Generic (PLEG): container finished" podID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerID="d15f008896a76058dd1d25c8073711170e6cd9d7dda09871e7d83736713b4b8c" exitCode=0 Dec 03 07:02:33 crc kubenswrapper[4947]: I1203 07:02:33.632236 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" event={"ID":"2091a52a-0cd3-4b46-93d8-efacf220dd22","Type":"ContainerDied","Data":"d15f008896a76058dd1d25c8073711170e6cd9d7dda09871e7d83736713b4b8c"} Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.427028 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.602043 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-bundle\") pod \"2091a52a-0cd3-4b46-93d8-efacf220dd22\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.602147 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-util\") pod \"2091a52a-0cd3-4b46-93d8-efacf220dd22\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.602350 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm9sk\" (UniqueName: \"kubernetes.io/projected/2091a52a-0cd3-4b46-93d8-efacf220dd22-kube-api-access-dm9sk\") pod \"2091a52a-0cd3-4b46-93d8-efacf220dd22\" (UID: \"2091a52a-0cd3-4b46-93d8-efacf220dd22\") " Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.604818 4947 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-bundle" (OuterVolumeSpecName: "bundle") pod "2091a52a-0cd3-4b46-93d8-efacf220dd22" (UID: "2091a52a-0cd3-4b46-93d8-efacf220dd22"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.607804 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2091a52a-0cd3-4b46-93d8-efacf220dd22-kube-api-access-dm9sk" (OuterVolumeSpecName: "kube-api-access-dm9sk") pod "2091a52a-0cd3-4b46-93d8-efacf220dd22" (UID: "2091a52a-0cd3-4b46-93d8-efacf220dd22"). InnerVolumeSpecName "kube-api-access-dm9sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.624129 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-util" (OuterVolumeSpecName: "util") pod "2091a52a-0cd3-4b46-93d8-efacf220dd22" (UID: "2091a52a-0cd3-4b46-93d8-efacf220dd22"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.650265 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" event={"ID":"2091a52a-0cd3-4b46-93d8-efacf220dd22","Type":"ContainerDied","Data":"44219264a6dac1d46e2c35f3769f9ab0a78836e2dc6b08a4bd7d71cc002e5a78"} Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.650304 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44219264a6dac1d46e2c35f3769f9ab0a78836e2dc6b08a4bd7d71cc002e5a78" Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.650359 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l" Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.703677 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm9sk\" (UniqueName: \"kubernetes.io/projected/2091a52a-0cd3-4b46-93d8-efacf220dd22-kube-api-access-dm9sk\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.703720 4947 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:35 crc kubenswrapper[4947]: I1203 07:02:35.703732 4947 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2091a52a-0cd3-4b46-93d8-efacf220dd22-util\") on node \"crc\" DevicePath \"\"" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.125607 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87"] Dec 03 07:02:44 crc kubenswrapper[4947]: E1203 07:02:44.126312 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerName="util" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.126324 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerName="util" Dec 03 07:02:44 crc kubenswrapper[4947]: E1203 07:02:44.126335 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerName="pull" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.126340 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerName="pull" Dec 03 07:02:44 crc kubenswrapper[4947]: E1203 07:02:44.126350 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerName="extract" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.126356 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerName="extract" Dec 03 07:02:44 crc kubenswrapper[4947]: E1203 07:02:44.126365 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a274d4f7-f741-48cc-9c34-8e2805ad66e3" containerName="console" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.126370 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a274d4f7-f741-48cc-9c34-8e2805ad66e3" containerName="console" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.126466 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a274d4f7-f741-48cc-9c34-8e2805ad66e3" containerName="console" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.126477 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2091a52a-0cd3-4b46-93d8-efacf220dd22" containerName="extract" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.126860 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.129081 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.129273 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.129328 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.130534 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.130566 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2k4xb" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.142048 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87"] Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.302299 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6cdefe4-13ac-46b3-8b10-dfdd75ece90a-apiservice-cert\") pod \"metallb-operator-controller-manager-6844cc4bd8-5bv87\" (UID: \"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a\") " pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.302343 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6cdefe4-13ac-46b3-8b10-dfdd75ece90a-webhook-cert\") pod \"metallb-operator-controller-manager-6844cc4bd8-5bv87\" (UID: 
\"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a\") " pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.302481 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f4wx\" (UniqueName: \"kubernetes.io/projected/e6cdefe4-13ac-46b3-8b10-dfdd75ece90a-kube-api-access-6f4wx\") pod \"metallb-operator-controller-manager-6844cc4bd8-5bv87\" (UID: \"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a\") " pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.370220 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-646746987c-7qg2b"] Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.371028 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.372591 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.372652 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6xd2f" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.373015 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.404020 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f4wx\" (UniqueName: \"kubernetes.io/projected/e6cdefe4-13ac-46b3-8b10-dfdd75ece90a-kube-api-access-6f4wx\") pod \"metallb-operator-controller-manager-6844cc4bd8-5bv87\" (UID: \"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a\") " pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 
crc kubenswrapper[4947]: I1203 07:02:44.404082 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6cdefe4-13ac-46b3-8b10-dfdd75ece90a-apiservice-cert\") pod \"metallb-operator-controller-manager-6844cc4bd8-5bv87\" (UID: \"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a\") " pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.404100 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6cdefe4-13ac-46b3-8b10-dfdd75ece90a-webhook-cert\") pod \"metallb-operator-controller-manager-6844cc4bd8-5bv87\" (UID: \"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a\") " pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.411408 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6cdefe4-13ac-46b3-8b10-dfdd75ece90a-webhook-cert\") pod \"metallb-operator-controller-manager-6844cc4bd8-5bv87\" (UID: \"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a\") " pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.413120 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6cdefe4-13ac-46b3-8b10-dfdd75ece90a-apiservice-cert\") pod \"metallb-operator-controller-manager-6844cc4bd8-5bv87\" (UID: \"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a\") " pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.420960 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-646746987c-7qg2b"] Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.426902 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f4wx\" (UniqueName: \"kubernetes.io/projected/e6cdefe4-13ac-46b3-8b10-dfdd75ece90a-kube-api-access-6f4wx\") pod \"metallb-operator-controller-manager-6844cc4bd8-5bv87\" (UID: \"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a\") " pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.441826 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.505620 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/502b6d02-4507-4965-b356-64edddc1b97b-webhook-cert\") pod \"metallb-operator-webhook-server-646746987c-7qg2b\" (UID: \"502b6d02-4507-4965-b356-64edddc1b97b\") " pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.505660 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/502b6d02-4507-4965-b356-64edddc1b97b-apiservice-cert\") pod \"metallb-operator-webhook-server-646746987c-7qg2b\" (UID: \"502b6d02-4507-4965-b356-64edddc1b97b\") " pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.506162 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnfmb\" (UniqueName: \"kubernetes.io/projected/502b6d02-4507-4965-b356-64edddc1b97b-kube-api-access-rnfmb\") pod \"metallb-operator-webhook-server-646746987c-7qg2b\" (UID: \"502b6d02-4507-4965-b356-64edddc1b97b\") " pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: 
I1203 07:02:44.607620 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnfmb\" (UniqueName: \"kubernetes.io/projected/502b6d02-4507-4965-b356-64edddc1b97b-kube-api-access-rnfmb\") pod \"metallb-operator-webhook-server-646746987c-7qg2b\" (UID: \"502b6d02-4507-4965-b356-64edddc1b97b\") " pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.607681 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/502b6d02-4507-4965-b356-64edddc1b97b-webhook-cert\") pod \"metallb-operator-webhook-server-646746987c-7qg2b\" (UID: \"502b6d02-4507-4965-b356-64edddc1b97b\") " pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.607699 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/502b6d02-4507-4965-b356-64edddc1b97b-apiservice-cert\") pod \"metallb-operator-webhook-server-646746987c-7qg2b\" (UID: \"502b6d02-4507-4965-b356-64edddc1b97b\") " pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.611974 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/502b6d02-4507-4965-b356-64edddc1b97b-apiservice-cert\") pod \"metallb-operator-webhook-server-646746987c-7qg2b\" (UID: \"502b6d02-4507-4965-b356-64edddc1b97b\") " pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.617315 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/502b6d02-4507-4965-b356-64edddc1b97b-webhook-cert\") pod \"metallb-operator-webhook-server-646746987c-7qg2b\" (UID: 
\"502b6d02-4507-4965-b356-64edddc1b97b\") " pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.638904 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnfmb\" (UniqueName: \"kubernetes.io/projected/502b6d02-4507-4965-b356-64edddc1b97b-kube-api-access-rnfmb\") pod \"metallb-operator-webhook-server-646746987c-7qg2b\" (UID: \"502b6d02-4507-4965-b356-64edddc1b97b\") " pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.684921 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.726523 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87"] Dec 03 07:02:44 crc kubenswrapper[4947]: I1203 07:02:44.927507 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-646746987c-7qg2b"] Dec 03 07:02:44 crc kubenswrapper[4947]: W1203 07:02:44.976897 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod502b6d02_4507_4965_b356_64edddc1b97b.slice/crio-d67cae90c956a461b7d88a399ccaac82c2a98e1ef05dc6ca2ce83693540ac1d0 WatchSource:0}: Error finding container d67cae90c956a461b7d88a399ccaac82c2a98e1ef05dc6ca2ce83693540ac1d0: Status 404 returned error can't find the container with id d67cae90c956a461b7d88a399ccaac82c2a98e1ef05dc6ca2ce83693540ac1d0 Dec 03 07:02:45 crc kubenswrapper[4947]: I1203 07:02:45.712443 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" 
event={"ID":"502b6d02-4507-4965-b356-64edddc1b97b","Type":"ContainerStarted","Data":"d67cae90c956a461b7d88a399ccaac82c2a98e1ef05dc6ca2ce83693540ac1d0"} Dec 03 07:02:45 crc kubenswrapper[4947]: I1203 07:02:45.713788 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" event={"ID":"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a","Type":"ContainerStarted","Data":"acc85f749d8d3a56f5623a780d51551e82777a59e21834f390937f8bc46ae986"} Dec 03 07:02:48 crc kubenswrapper[4947]: I1203 07:02:48.739174 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" event={"ID":"e6cdefe4-13ac-46b3-8b10-dfdd75ece90a","Type":"ContainerStarted","Data":"9a4796976d0bf5a3ad4ba4652cc2728d6c6a717839b01b3c7e027b158f35ed3e"} Dec 03 07:02:48 crc kubenswrapper[4947]: I1203 07:02:48.740426 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:02:48 crc kubenswrapper[4947]: I1203 07:02:48.761000 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" podStartSLOduration=1.7688467540000001 podStartE2EDuration="4.76098564s" podCreationTimestamp="2025-12-03 07:02:44 +0000 UTC" firstStartedPulling="2025-12-03 07:02:44.764012097 +0000 UTC m=+826.024966523" lastFinishedPulling="2025-12-03 07:02:47.756150983 +0000 UTC m=+829.017105409" observedRunningTime="2025-12-03 07:02:48.756939291 +0000 UTC m=+830.017893717" watchObservedRunningTime="2025-12-03 07:02:48.76098564 +0000 UTC m=+830.021940056" Dec 03 07:02:51 crc kubenswrapper[4947]: I1203 07:02:51.757674 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" 
event={"ID":"502b6d02-4507-4965-b356-64edddc1b97b","Type":"ContainerStarted","Data":"1ae040e706f6e5e474f0e48f80cc070894bec7930634db837a5fa1dd144a713d"} Dec 03 07:02:51 crc kubenswrapper[4947]: I1203 07:02:51.758391 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:02:51 crc kubenswrapper[4947]: I1203 07:02:51.775773 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" podStartSLOduration=1.7357689920000001 podStartE2EDuration="7.775751126s" podCreationTimestamp="2025-12-03 07:02:44 +0000 UTC" firstStartedPulling="2025-12-03 07:02:44.979368815 +0000 UTC m=+826.240323241" lastFinishedPulling="2025-12-03 07:02:51.019350939 +0000 UTC m=+832.280305375" observedRunningTime="2025-12-03 07:02:51.774085761 +0000 UTC m=+833.035040197" watchObservedRunningTime="2025-12-03 07:02:51.775751126 +0000 UTC m=+833.036705592" Dec 03 07:03:04 crc kubenswrapper[4947]: I1203 07:03:04.689569 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-646746987c-7qg2b" Dec 03 07:03:24 crc kubenswrapper[4947]: I1203 07:03:24.445969 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6844cc4bd8-5bv87" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.167671 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-4jbhl"] Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.171862 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.173598 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-whgrx" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.175201 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.180077 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz"] Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.180929 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.182542 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.185042 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.195431 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz"] Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.250100 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-metrics\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.250155 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-frr-conf\") pod \"frr-k8s-4jbhl\" (UID: 
\"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.250185 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3a50a88-0743-4bc2-831c-65de7fbf4bb5-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lhpmz\" (UID: \"b3a50a88-0743-4bc2-831c-65de7fbf4bb5\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.250400 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8704521a-a01e-42a0-a985-dcaa9756de9f-frr-startup\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.250537 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2rvd\" (UniqueName: \"kubernetes.io/projected/b3a50a88-0743-4bc2-831c-65de7fbf4bb5-kube-api-access-t2rvd\") pod \"frr-k8s-webhook-server-7fcb986d4-lhpmz\" (UID: \"b3a50a88-0743-4bc2-831c-65de7fbf4bb5\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.250706 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-reloader\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.250764 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8704521a-a01e-42a0-a985-dcaa9756de9f-metrics-certs\") pod \"frr-k8s-4jbhl\" (UID: 
\"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.250829 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-frr-sockets\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.250938 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqxtz\" (UniqueName: \"kubernetes.io/projected/8704521a-a01e-42a0-a985-dcaa9756de9f-kube-api-access-zqxtz\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.277753 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-swvdq"] Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.278867 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.281450 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.281510 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.281531 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-b4mjb" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.281742 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.299097 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-smdq8"] Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.300031 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.303311 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.311326 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-smdq8"] Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352130 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-frr-conf\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352191 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3a50a88-0743-4bc2-831c-65de7fbf4bb5-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lhpmz\" (UID: \"b3a50a88-0743-4bc2-831c-65de7fbf4bb5\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352225 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-memberlist\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352253 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8704521a-a01e-42a0-a985-dcaa9756de9f-frr-startup\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352281 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t2rvd\" (UniqueName: \"kubernetes.io/projected/b3a50a88-0743-4bc2-831c-65de7fbf4bb5-kube-api-access-t2rvd\") pod \"frr-k8s-webhook-server-7fcb986d4-lhpmz\" (UID: \"b3a50a88-0743-4bc2-831c-65de7fbf4bb5\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352312 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zgd2\" (UniqueName: \"kubernetes.io/projected/84e9b3ca-43fe-49fb-b14c-3837ce889acb-kube-api-access-9zgd2\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352340 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/84e9b3ca-43fe-49fb-b14c-3837ce889acb-metallb-excludel2\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352376 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-cert\") pod \"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352405 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-reloader\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352434 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8704521a-a01e-42a0-a985-dcaa9756de9f-metrics-certs\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352482 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqmjn\" (UniqueName: \"kubernetes.io/projected/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-kube-api-access-vqmjn\") pod \"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352528 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-metrics-certs\") pod \"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352562 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-frr-sockets\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352603 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqxtz\" (UniqueName: \"kubernetes.io/projected/8704521a-a01e-42a0-a985-dcaa9756de9f-kube-api-access-zqxtz\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352644 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-metrics-certs\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352648 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-frr-conf\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.352670 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-metrics\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.353064 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-metrics\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.353276 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8704521a-a01e-42a0-a985-dcaa9756de9f-frr-startup\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.353644 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-frr-sockets\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.353861 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8704521a-a01e-42a0-a985-dcaa9756de9f-reloader\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.359361 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3a50a88-0743-4bc2-831c-65de7fbf4bb5-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-lhpmz\" (UID: \"b3a50a88-0743-4bc2-831c-65de7fbf4bb5\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.359836 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8704521a-a01e-42a0-a985-dcaa9756de9f-metrics-certs\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.380719 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqxtz\" (UniqueName: \"kubernetes.io/projected/8704521a-a01e-42a0-a985-dcaa9756de9f-kube-api-access-zqxtz\") pod \"frr-k8s-4jbhl\" (UID: \"8704521a-a01e-42a0-a985-dcaa9756de9f\") " pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.381245 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2rvd\" (UniqueName: \"kubernetes.io/projected/b3a50a88-0743-4bc2-831c-65de7fbf4bb5-kube-api-access-t2rvd\") pod \"frr-k8s-webhook-server-7fcb986d4-lhpmz\" (UID: \"b3a50a88-0743-4bc2-831c-65de7fbf4bb5\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.453501 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zgd2\" (UniqueName: 
\"kubernetes.io/projected/84e9b3ca-43fe-49fb-b14c-3837ce889acb-kube-api-access-9zgd2\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.453544 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/84e9b3ca-43fe-49fb-b14c-3837ce889acb-metallb-excludel2\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.453585 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-cert\") pod \"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.453611 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqmjn\" (UniqueName: \"kubernetes.io/projected/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-kube-api-access-vqmjn\") pod \"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.453628 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-metrics-certs\") pod \"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.453667 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-metrics-certs\") pod \"speaker-swvdq\" 
(UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.453697 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-memberlist\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: E1203 07:03:25.453775 4947 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 07:03:25 crc kubenswrapper[4947]: E1203 07:03:25.453812 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-memberlist podName:84e9b3ca-43fe-49fb-b14c-3837ce889acb nodeName:}" failed. No retries permitted until 2025-12-03 07:03:25.953799285 +0000 UTC m=+867.214753711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-memberlist") pod "speaker-swvdq" (UID: "84e9b3ca-43fe-49fb-b14c-3837ce889acb") : secret "metallb-memberlist" not found Dec 03 07:03:25 crc kubenswrapper[4947]: E1203 07:03:25.453916 4947 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 03 07:03:25 crc kubenswrapper[4947]: E1203 07:03:25.453937 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-metrics-certs podName:dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74 nodeName:}" failed. No retries permitted until 2025-12-03 07:03:25.953930589 +0000 UTC m=+867.214885015 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-metrics-certs") pod "controller-f8648f98b-smdq8" (UID: "dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74") : secret "controller-certs-secret" not found Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.454475 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/84e9b3ca-43fe-49fb-b14c-3837ce889acb-metallb-excludel2\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.455173 4947 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.456812 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-metrics-certs\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.466917 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-cert\") pod \"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.468651 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zgd2\" (UniqueName: \"kubernetes.io/projected/84e9b3ca-43fe-49fb-b14c-3837ce889acb-kube-api-access-9zgd2\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.470536 4947 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vqmjn\" (UniqueName: \"kubernetes.io/projected/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-kube-api-access-vqmjn\") pod \"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.496052 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.508054 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.886514 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz"] Dec 03 07:03:25 crc kubenswrapper[4947]: W1203 07:03:25.890180 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3a50a88_0743_4bc2_831c_65de7fbf4bb5.slice/crio-8d7b2908dff76bcdf0a977bad900b4ea0c7654b407085fffe0301e11ec6ccfe4 WatchSource:0}: Error finding container 8d7b2908dff76bcdf0a977bad900b4ea0c7654b407085fffe0301e11ec6ccfe4: Status 404 returned error can't find the container with id 8d7b2908dff76bcdf0a977bad900b4ea0c7654b407085fffe0301e11ec6ccfe4 Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.963442 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-memberlist\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.963546 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-metrics-certs\") pod 
\"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:25 crc kubenswrapper[4947]: E1203 07:03:25.963595 4947 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 07:03:25 crc kubenswrapper[4947]: E1203 07:03:25.963653 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-memberlist podName:84e9b3ca-43fe-49fb-b14c-3837ce889acb nodeName:}" failed. No retries permitted until 2025-12-03 07:03:26.963636817 +0000 UTC m=+868.224591243 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-memberlist") pod "speaker-swvdq" (UID: "84e9b3ca-43fe-49fb-b14c-3837ce889acb") : secret "metallb-memberlist" not found Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.967949 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerStarted","Data":"2f1fb90054cfc9b47af5769793440a969b4f2582dc9bde23e6399cbdaff3e31f"} Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.969351 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" event={"ID":"b3a50a88-0743-4bc2-831c-65de7fbf4bb5","Type":"ContainerStarted","Data":"8d7b2908dff76bcdf0a977bad900b4ea0c7654b407085fffe0301e11ec6ccfe4"} Dec 03 07:03:25 crc kubenswrapper[4947]: I1203 07:03:25.974093 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74-metrics-certs\") pod \"controller-f8648f98b-smdq8\" (UID: \"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74\") " pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:26 crc kubenswrapper[4947]: I1203 07:03:26.252942 
4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:26 crc kubenswrapper[4947]: I1203 07:03:26.753040 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-smdq8"] Dec 03 07:03:26 crc kubenswrapper[4947]: I1203 07:03:26.977438 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-smdq8" event={"ID":"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74","Type":"ContainerStarted","Data":"2cb3f38c3a57ed277edc262ef2b52f9d3c81cc831d799e27ab1e71f755d23937"} Dec 03 07:03:26 crc kubenswrapper[4947]: I1203 07:03:26.977852 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-smdq8" event={"ID":"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74","Type":"ContainerStarted","Data":"84289b99474948f7a532d621376e4931ae80d0d999d345960cdf2022176d52dd"} Dec 03 07:03:26 crc kubenswrapper[4947]: I1203 07:03:26.983252 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-memberlist\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:26 crc kubenswrapper[4947]: I1203 07:03:26.992547 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/84e9b3ca-43fe-49fb-b14c-3837ce889acb-memberlist\") pod \"speaker-swvdq\" (UID: \"84e9b3ca-43fe-49fb-b14c-3837ce889acb\") " pod="metallb-system/speaker-swvdq" Dec 03 07:03:27 crc kubenswrapper[4947]: I1203 07:03:27.092675 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-swvdq" Dec 03 07:03:27 crc kubenswrapper[4947]: I1203 07:03:27.991483 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-smdq8" event={"ID":"dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74","Type":"ContainerStarted","Data":"92404d95f57730346decf8389a86b19ba149635b0d933657fc6c8c9b1caa8c01"} Dec 03 07:03:27 crc kubenswrapper[4947]: I1203 07:03:27.992911 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:27 crc kubenswrapper[4947]: I1203 07:03:27.994213 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-swvdq" event={"ID":"84e9b3ca-43fe-49fb-b14c-3837ce889acb","Type":"ContainerStarted","Data":"5565f44e67a9071639b96cfd19ec48b407b1baad44f4c0c4009817a1aed9469f"} Dec 03 07:03:27 crc kubenswrapper[4947]: I1203 07:03:27.994248 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-swvdq" event={"ID":"84e9b3ca-43fe-49fb-b14c-3837ce889acb","Type":"ContainerStarted","Data":"7c882e0042ec3a465b4fb2b370103ddd01f8e9b58a421a50246232b9693215bb"} Dec 03 07:03:27 crc kubenswrapper[4947]: I1203 07:03:27.994260 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-swvdq" event={"ID":"84e9b3ca-43fe-49fb-b14c-3837ce889acb","Type":"ContainerStarted","Data":"2301664ff32853aedc82aaf9aa235d378601d8f48d010354b8918822b0f6ebb9"} Dec 03 07:03:27 crc kubenswrapper[4947]: I1203 07:03:27.994443 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-swvdq" Dec 03 07:03:28 crc kubenswrapper[4947]: I1203 07:03:28.015887 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-smdq8" podStartSLOduration=3.015871958 podStartE2EDuration="3.015871958s" podCreationTimestamp="2025-12-03 07:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:03:28.014720567 +0000 UTC m=+869.275675013" watchObservedRunningTime="2025-12-03 07:03:28.015871958 +0000 UTC m=+869.276826384" Dec 03 07:03:28 crc kubenswrapper[4947]: I1203 07:03:28.038717 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-swvdq" podStartSLOduration=3.038702474 podStartE2EDuration="3.038702474s" podCreationTimestamp="2025-12-03 07:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:03:28.037197534 +0000 UTC m=+869.298151960" watchObservedRunningTime="2025-12-03 07:03:28.038702474 +0000 UTC m=+869.299656890" Dec 03 07:03:30 crc kubenswrapper[4947]: I1203 07:03:30.086370 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:03:30 crc kubenswrapper[4947]: I1203 07:03:30.086652 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:03:34 crc kubenswrapper[4947]: I1203 07:03:34.045632 4947 generic.go:334] "Generic (PLEG): container finished" podID="8704521a-a01e-42a0-a985-dcaa9756de9f" containerID="b260fcae6d23ad08a48f26372ebf16def1a54959914eb6ad4f4d57d7853e0e3c" exitCode=0 Dec 03 07:03:34 crc kubenswrapper[4947]: I1203 07:03:34.045694 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" 
event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerDied","Data":"b260fcae6d23ad08a48f26372ebf16def1a54959914eb6ad4f4d57d7853e0e3c"} Dec 03 07:03:34 crc kubenswrapper[4947]: I1203 07:03:34.048453 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" event={"ID":"b3a50a88-0743-4bc2-831c-65de7fbf4bb5","Type":"ContainerStarted","Data":"f815aef5ca2d6be748b7bf36de70a9bbd62aa74b3dc39a9c12191ee060c7e5d8"} Dec 03 07:03:34 crc kubenswrapper[4947]: I1203 07:03:34.048700 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:35 crc kubenswrapper[4947]: I1203 07:03:35.056240 4947 generic.go:334] "Generic (PLEG): container finished" podID="8704521a-a01e-42a0-a985-dcaa9756de9f" containerID="d2804ecf23b9eb4fa5f36311ce4cda814f4a7c386364e41ff26bb17f40bf0997" exitCode=0 Dec 03 07:03:35 crc kubenswrapper[4947]: I1203 07:03:35.058307 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerDied","Data":"d2804ecf23b9eb4fa5f36311ce4cda814f4a7c386364e41ff26bb17f40bf0997"} Dec 03 07:03:35 crc kubenswrapper[4947]: I1203 07:03:35.095991 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" podStartSLOduration=2.181198472 podStartE2EDuration="10.095970683s" podCreationTimestamp="2025-12-03 07:03:25 +0000 UTC" firstStartedPulling="2025-12-03 07:03:25.892826968 +0000 UTC m=+867.153781394" lastFinishedPulling="2025-12-03 07:03:33.807599139 +0000 UTC m=+875.068553605" observedRunningTime="2025-12-03 07:03:34.092935629 +0000 UTC m=+875.353890065" watchObservedRunningTime="2025-12-03 07:03:35.095970683 +0000 UTC m=+876.356925109" Dec 03 07:03:36 crc kubenswrapper[4947]: I1203 07:03:36.065661 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="8704521a-a01e-42a0-a985-dcaa9756de9f" containerID="c4e64b131d449355edaa7ffefe86f1495ad55c2147b1415e897c478607c6e70b" exitCode=0 Dec 03 07:03:36 crc kubenswrapper[4947]: I1203 07:03:36.065711 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerDied","Data":"c4e64b131d449355edaa7ffefe86f1495ad55c2147b1415e897c478607c6e70b"} Dec 03 07:03:36 crc kubenswrapper[4947]: I1203 07:03:36.258624 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-smdq8" Dec 03 07:03:37 crc kubenswrapper[4947]: I1203 07:03:37.077992 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerStarted","Data":"115589bb54f78b86435da0402b9f17e3c6e43c20950b3041ee2854697c1a3f64"} Dec 03 07:03:37 crc kubenswrapper[4947]: I1203 07:03:37.078296 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerStarted","Data":"2ce038e137328e84225152527650b8d30b491008b6c853cbbbf36c794ea1772a"} Dec 03 07:03:37 crc kubenswrapper[4947]: I1203 07:03:37.078313 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerStarted","Data":"8243c760fa17c85e5db7d825fe73a238aec4a1ac9ebdf376d3a6f32010579a60"} Dec 03 07:03:37 crc kubenswrapper[4947]: I1203 07:03:37.078325 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerStarted","Data":"09d156caa2c046d4b413d269df583b5fa23ddce025bd059171fc34d5d61ff8d0"} Dec 03 07:03:37 crc kubenswrapper[4947]: I1203 07:03:37.078338 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" 
event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerStarted","Data":"5ea652adbfe30f1e4a645b9a237443a236278ce226714d917add7dbe04b91740"} Dec 03 07:03:37 crc kubenswrapper[4947]: I1203 07:03:37.097508 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-swvdq" Dec 03 07:03:38 crc kubenswrapper[4947]: I1203 07:03:38.093715 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4jbhl" event={"ID":"8704521a-a01e-42a0-a985-dcaa9756de9f","Type":"ContainerStarted","Data":"4f0756d196acf9cf1cfbaf82ea4340206a053acabc19438f9064bf1958d00204"} Dec 03 07:03:38 crc kubenswrapper[4947]: I1203 07:03:38.094776 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:38 crc kubenswrapper[4947]: I1203 07:03:38.930540 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-4jbhl" podStartSLOduration=5.726535467 podStartE2EDuration="13.930523753s" podCreationTimestamp="2025-12-03 07:03:25 +0000 UTC" firstStartedPulling="2025-12-03 07:03:25.60610197 +0000 UTC m=+866.867056396" lastFinishedPulling="2025-12-03 07:03:33.810090216 +0000 UTC m=+875.071044682" observedRunningTime="2025-12-03 07:03:38.116190334 +0000 UTC m=+879.377144800" watchObservedRunningTime="2025-12-03 07:03:38.930523753 +0000 UTC m=+880.191478179" Dec 03 07:03:38 crc kubenswrapper[4947]: I1203 07:03:38.931025 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr"] Dec 03 07:03:38 crc kubenswrapper[4947]: I1203 07:03:38.932076 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:38 crc kubenswrapper[4947]: I1203 07:03:38.934660 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 07:03:38 crc kubenswrapper[4947]: I1203 07:03:38.953896 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr"] Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.107837 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.107923 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.108002 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wzgt\" (UniqueName: \"kubernetes.io/projected/f692755d-a958-4fc0-9908-0c088cb8b85a-kube-api-access-2wzgt\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: 
I1203 07:03:39.209073 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.209252 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wzgt\" (UniqueName: \"kubernetes.io/projected/f692755d-a958-4fc0-9908-0c088cb8b85a-kube-api-access-2wzgt\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.209322 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.210908 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.211529 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.239164 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wzgt\" (UniqueName: \"kubernetes.io/projected/f692755d-a958-4fc0-9908-0c088cb8b85a-kube-api-access-2wzgt\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.314907 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:39 crc kubenswrapper[4947]: I1203 07:03:39.571104 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr"] Dec 03 07:03:40 crc kubenswrapper[4947]: I1203 07:03:40.108359 4947 generic.go:334] "Generic (PLEG): container finished" podID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerID="f081ceb2455af939db651baf3495e8363f2f2664a29c270e57f03b5f406853c4" exitCode=0 Dec 03 07:03:40 crc kubenswrapper[4947]: I1203 07:03:40.108612 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" event={"ID":"f692755d-a958-4fc0-9908-0c088cb8b85a","Type":"ContainerDied","Data":"f081ceb2455af939db651baf3495e8363f2f2664a29c270e57f03b5f406853c4"} Dec 03 07:03:40 crc kubenswrapper[4947]: I1203 07:03:40.109687 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" event={"ID":"f692755d-a958-4fc0-9908-0c088cb8b85a","Type":"ContainerStarted","Data":"559e268951e27895cff7bac851fbdeb61eaded46bea51322b61f15ac4d839592"} Dec 03 07:03:40 crc kubenswrapper[4947]: I1203 07:03:40.496339 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:40 crc kubenswrapper[4947]: I1203 07:03:40.544847 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:44 crc kubenswrapper[4947]: I1203 07:03:44.137799 4947 generic.go:334] "Generic (PLEG): container finished" podID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerID="593f2302325648126045625137f99113cacfeeeac738068253917f976adcae82" exitCode=0 Dec 03 07:03:44 crc kubenswrapper[4947]: I1203 07:03:44.137954 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" event={"ID":"f692755d-a958-4fc0-9908-0c088cb8b85a","Type":"ContainerDied","Data":"593f2302325648126045625137f99113cacfeeeac738068253917f976adcae82"} Dec 03 07:03:45 crc kubenswrapper[4947]: I1203 07:03:45.150369 4947 generic.go:334] "Generic (PLEG): container finished" podID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerID="e7d8f7396dbdabc16c1c2be6b872ecd393fd638cf15d64fd1ab00e51df7c928d" exitCode=0 Dec 03 07:03:45 crc kubenswrapper[4947]: I1203 07:03:45.150428 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" event={"ID":"f692755d-a958-4fc0-9908-0c088cb8b85a","Type":"ContainerDied","Data":"e7d8f7396dbdabc16c1c2be6b872ecd393fd638cf15d64fd1ab00e51df7c928d"} Dec 03 07:03:45 crc kubenswrapper[4947]: I1203 07:03:45.514959 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-lhpmz" Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.453113 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.615033 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-bundle\") pod \"f692755d-a958-4fc0-9908-0c088cb8b85a\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.615179 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wzgt\" (UniqueName: \"kubernetes.io/projected/f692755d-a958-4fc0-9908-0c088cb8b85a-kube-api-access-2wzgt\") pod \"f692755d-a958-4fc0-9908-0c088cb8b85a\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.615243 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-util\") pod \"f692755d-a958-4fc0-9908-0c088cb8b85a\" (UID: \"f692755d-a958-4fc0-9908-0c088cb8b85a\") " Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.617067 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-bundle" (OuterVolumeSpecName: "bundle") pod "f692755d-a958-4fc0-9908-0c088cb8b85a" (UID: "f692755d-a958-4fc0-9908-0c088cb8b85a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.625046 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f692755d-a958-4fc0-9908-0c088cb8b85a-kube-api-access-2wzgt" (OuterVolumeSpecName: "kube-api-access-2wzgt") pod "f692755d-a958-4fc0-9908-0c088cb8b85a" (UID: "f692755d-a958-4fc0-9908-0c088cb8b85a"). InnerVolumeSpecName "kube-api-access-2wzgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.640049 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-util" (OuterVolumeSpecName: "util") pod "f692755d-a958-4fc0-9908-0c088cb8b85a" (UID: "f692755d-a958-4fc0-9908-0c088cb8b85a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.717451 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wzgt\" (UniqueName: \"kubernetes.io/projected/f692755d-a958-4fc0-9908-0c088cb8b85a-kube-api-access-2wzgt\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.717523 4947 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-util\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:46 crc kubenswrapper[4947]: I1203 07:03:46.717542 4947 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f692755d-a958-4fc0-9908-0c088cb8b85a-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.168188 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" 
event={"ID":"f692755d-a958-4fc0-9908-0c088cb8b85a","Type":"ContainerDied","Data":"559e268951e27895cff7bac851fbdeb61eaded46bea51322b61f15ac4d839592"} Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.168846 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="559e268951e27895cff7bac851fbdeb61eaded46bea51322b61f15ac4d839592" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.168733 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.766526 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-45245"] Dec 03 07:03:47 crc kubenswrapper[4947]: E1203 07:03:47.766777 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerName="pull" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.766792 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerName="pull" Dec 03 07:03:47 crc kubenswrapper[4947]: E1203 07:03:47.766817 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerName="util" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.766826 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerName="util" Dec 03 07:03:47 crc kubenswrapper[4947]: E1203 07:03:47.766839 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerName="extract" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.766848 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerName="extract" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.766999 4947 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f692755d-a958-4fc0-9908-0c088cb8b85a" containerName="extract" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.768045 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.788664 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-45245"] Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.932308 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-utilities\") pod \"certified-operators-45245\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.932399 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ntq5\" (UniqueName: \"kubernetes.io/projected/d2f058a7-63ac-45c7-bf91-d683d7010b48-kube-api-access-6ntq5\") pod \"certified-operators-45245\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:47 crc kubenswrapper[4947]: I1203 07:03:47.932530 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-catalog-content\") pod \"certified-operators-45245\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:48 crc kubenswrapper[4947]: I1203 07:03:48.033359 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-catalog-content\") pod 
\"certified-operators-45245\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:48 crc kubenswrapper[4947]: I1203 07:03:48.033432 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-utilities\") pod \"certified-operators-45245\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:48 crc kubenswrapper[4947]: I1203 07:03:48.033542 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ntq5\" (UniqueName: \"kubernetes.io/projected/d2f058a7-63ac-45c7-bf91-d683d7010b48-kube-api-access-6ntq5\") pod \"certified-operators-45245\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:48 crc kubenswrapper[4947]: I1203 07:03:48.033910 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-catalog-content\") pod \"certified-operators-45245\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:48 crc kubenswrapper[4947]: I1203 07:03:48.034015 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-utilities\") pod \"certified-operators-45245\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:48 crc kubenswrapper[4947]: I1203 07:03:48.049988 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ntq5\" (UniqueName: \"kubernetes.io/projected/d2f058a7-63ac-45c7-bf91-d683d7010b48-kube-api-access-6ntq5\") pod \"certified-operators-45245\" (UID: 
\"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:48 crc kubenswrapper[4947]: I1203 07:03:48.095139 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:48 crc kubenswrapper[4947]: I1203 07:03:48.548479 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-45245"] Dec 03 07:03:48 crc kubenswrapper[4947]: W1203 07:03:48.551771 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f058a7_63ac_45c7_bf91_d683d7010b48.slice/crio-680459d6730842c14224865bd42b185d61637d81752b4ed2c345f9cb898357ea WatchSource:0}: Error finding container 680459d6730842c14224865bd42b185d61637d81752b4ed2c345f9cb898357ea: Status 404 returned error can't find the container with id 680459d6730842c14224865bd42b185d61637d81752b4ed2c345f9cb898357ea Dec 03 07:03:49 crc kubenswrapper[4947]: I1203 07:03:49.192662 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45245" event={"ID":"d2f058a7-63ac-45c7-bf91-d683d7010b48","Type":"ContainerStarted","Data":"680459d6730842c14224865bd42b185d61637d81752b4ed2c345f9cb898357ea"} Dec 03 07:03:50 crc kubenswrapper[4947]: I1203 07:03:50.211572 4947 generic.go:334] "Generic (PLEG): container finished" podID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerID="3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f" exitCode=0 Dec 03 07:03:50 crc kubenswrapper[4947]: I1203 07:03:50.211662 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45245" event={"ID":"d2f058a7-63ac-45c7-bf91-d683d7010b48","Type":"ContainerDied","Data":"3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f"} Dec 03 07:03:51 crc kubenswrapper[4947]: I1203 07:03:51.219727 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-45245" event={"ID":"d2f058a7-63ac-45c7-bf91-d683d7010b48","Type":"ContainerStarted","Data":"018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6"} Dec 03 07:03:51 crc kubenswrapper[4947]: I1203 07:03:51.993629 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d"] Dec 03 07:03:51 crc kubenswrapper[4947]: I1203 07:03:51.994293 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" Dec 03 07:03:51 crc kubenswrapper[4947]: I1203 07:03:51.996457 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 03 07:03:51 crc kubenswrapper[4947]: I1203 07:03:51.998030 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 03 07:03:51 crc kubenswrapper[4947]: I1203 07:03:51.998280 4947 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-954vj" Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.058908 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d"] Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.084909 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4a477c26-da80-4f21-bd96-949fab49ac91-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-j956d\" (UID: \"4a477c26-da80-4f21-bd96-949fab49ac91\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.085002 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf847\" (UniqueName: \"kubernetes.io/projected/4a477c26-da80-4f21-bd96-949fab49ac91-kube-api-access-wf847\") pod \"cert-manager-operator-controller-manager-64cf6dff88-j956d\" (UID: \"4a477c26-da80-4f21-bd96-949fab49ac91\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.186092 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf847\" (UniqueName: \"kubernetes.io/projected/4a477c26-da80-4f21-bd96-949fab49ac91-kube-api-access-wf847\") pod \"cert-manager-operator-controller-manager-64cf6dff88-j956d\" (UID: \"4a477c26-da80-4f21-bd96-949fab49ac91\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.186182 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4a477c26-da80-4f21-bd96-949fab49ac91-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-j956d\" (UID: \"4a477c26-da80-4f21-bd96-949fab49ac91\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.186692 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4a477c26-da80-4f21-bd96-949fab49ac91-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-j956d\" (UID: \"4a477c26-da80-4f21-bd96-949fab49ac91\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.203082 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf847\" (UniqueName: \"kubernetes.io/projected/4a477c26-da80-4f21-bd96-949fab49ac91-kube-api-access-wf847\") 
pod \"cert-manager-operator-controller-manager-64cf6dff88-j956d\" (UID: \"4a477c26-da80-4f21-bd96-949fab49ac91\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.226837 4947 generic.go:334] "Generic (PLEG): container finished" podID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerID="018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6" exitCode=0 Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.226878 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45245" event={"ID":"d2f058a7-63ac-45c7-bf91-d683d7010b48","Type":"ContainerDied","Data":"018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6"} Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.306656 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" Dec 03 07:03:52 crc kubenswrapper[4947]: I1203 07:03:52.829238 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d"] Dec 03 07:03:52 crc kubenswrapper[4947]: W1203 07:03:52.843434 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a477c26_da80_4f21_bd96_949fab49ac91.slice/crio-857825493cd9707109c8f4d1d2d72569075b460253ad8d9ed0de88ad07febdde WatchSource:0}: Error finding container 857825493cd9707109c8f4d1d2d72569075b460253ad8d9ed0de88ad07febdde: Status 404 returned error can't find the container with id 857825493cd9707109c8f4d1d2d72569075b460253ad8d9ed0de88ad07febdde Dec 03 07:03:53 crc kubenswrapper[4947]: I1203 07:03:53.232655 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" 
event={"ID":"4a477c26-da80-4f21-bd96-949fab49ac91","Type":"ContainerStarted","Data":"857825493cd9707109c8f4d1d2d72569075b460253ad8d9ed0de88ad07febdde"} Dec 03 07:03:54 crc kubenswrapper[4947]: I1203 07:03:54.240269 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45245" event={"ID":"d2f058a7-63ac-45c7-bf91-d683d7010b48","Type":"ContainerStarted","Data":"7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979"} Dec 03 07:03:54 crc kubenswrapper[4947]: I1203 07:03:54.266974 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-45245" podStartSLOduration=3.884970761 podStartE2EDuration="7.266956272s" podCreationTimestamp="2025-12-03 07:03:47 +0000 UTC" firstStartedPulling="2025-12-03 07:03:50.212893517 +0000 UTC m=+891.473847943" lastFinishedPulling="2025-12-03 07:03:53.594879008 +0000 UTC m=+894.855833454" observedRunningTime="2025-12-03 07:03:54.265447241 +0000 UTC m=+895.526401667" watchObservedRunningTime="2025-12-03 07:03:54.266956272 +0000 UTC m=+895.527910698" Dec 03 07:03:55 crc kubenswrapper[4947]: I1203 07:03:55.503626 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-4jbhl" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.095376 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.095726 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.131985 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g76f7"] Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.133853 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.145082 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g76f7"] Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.146763 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.277376 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-catalog-content\") pod \"community-operators-g76f7\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.277531 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-utilities\") pod \"community-operators-g76f7\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.277606 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzr5v\" (UniqueName: \"kubernetes.io/projected/f350cbd5-9e7f-47da-be36-c0292934a34b-kube-api-access-fzr5v\") pod \"community-operators-g76f7\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.345244 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-45245" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.379570 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-utilities\") pod \"community-operators-g76f7\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.379669 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzr5v\" (UniqueName: \"kubernetes.io/projected/f350cbd5-9e7f-47da-be36-c0292934a34b-kube-api-access-fzr5v\") pod \"community-operators-g76f7\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.379722 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-catalog-content\") pod \"community-operators-g76f7\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.380145 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-utilities\") pod \"community-operators-g76f7\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.380244 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-catalog-content\") pod \"community-operators-g76f7\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.402344 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fzr5v\" (UniqueName: \"kubernetes.io/projected/f350cbd5-9e7f-47da-be36-c0292934a34b-kube-api-access-fzr5v\") pod \"community-operators-g76f7\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:03:58 crc kubenswrapper[4947]: I1203 07:03:58.460112 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:04:00 crc kubenswrapper[4947]: I1203 07:04:00.087020 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:04:00 crc kubenswrapper[4947]: I1203 07:04:00.087086 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:04:01 crc kubenswrapper[4947]: I1203 07:04:01.446445 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g76f7"] Dec 03 07:04:01 crc kubenswrapper[4947]: W1203 07:04:01.454055 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf350cbd5_9e7f_47da_be36_c0292934a34b.slice/crio-f3806524e37f0158c4549a01ef4b40bee262e01d7212fb09d2462202e7ff3171 WatchSource:0}: Error finding container f3806524e37f0158c4549a01ef4b40bee262e01d7212fb09d2462202e7ff3171: Status 404 returned error can't find the container with id f3806524e37f0158c4549a01ef4b40bee262e01d7212fb09d2462202e7ff3171 Dec 03 07:04:01 crc kubenswrapper[4947]: I1203 07:04:01.725367 4947 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-45245"] Dec 03 07:04:01 crc kubenswrapper[4947]: I1203 07:04:01.726094 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-45245" podUID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerName="registry-server" containerID="cri-o://7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979" gracePeriod=2 Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.101062 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45245" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.244915 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-catalog-content\") pod \"d2f058a7-63ac-45c7-bf91-d683d7010b48\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.244970 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-utilities\") pod \"d2f058a7-63ac-45c7-bf91-d683d7010b48\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.245006 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ntq5\" (UniqueName: \"kubernetes.io/projected/d2f058a7-63ac-45c7-bf91-d683d7010b48-kube-api-access-6ntq5\") pod \"d2f058a7-63ac-45c7-bf91-d683d7010b48\" (UID: \"d2f058a7-63ac-45c7-bf91-d683d7010b48\") " Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.247597 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-utilities" (OuterVolumeSpecName: "utilities") pod 
"d2f058a7-63ac-45c7-bf91-d683d7010b48" (UID: "d2f058a7-63ac-45c7-bf91-d683d7010b48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.262899 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2f058a7-63ac-45c7-bf91-d683d7010b48-kube-api-access-6ntq5" (OuterVolumeSpecName: "kube-api-access-6ntq5") pod "d2f058a7-63ac-45c7-bf91-d683d7010b48" (UID: "d2f058a7-63ac-45c7-bf91-d683d7010b48"). InnerVolumeSpecName "kube-api-access-6ntq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.305616 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2f058a7-63ac-45c7-bf91-d683d7010b48" (UID: "d2f058a7-63ac-45c7-bf91-d683d7010b48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.309584 4947 generic.go:334] "Generic (PLEG): container finished" podID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerID="49305d4b6af455ae0a9d12d69673b1e7f6d6bfabc5ce73be011b0fe905d82b71" exitCode=0 Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.309681 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g76f7" event={"ID":"f350cbd5-9e7f-47da-be36-c0292934a34b","Type":"ContainerDied","Data":"49305d4b6af455ae0a9d12d69673b1e7f6d6bfabc5ce73be011b0fe905d82b71"} Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.309720 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g76f7" event={"ID":"f350cbd5-9e7f-47da-be36-c0292934a34b","Type":"ContainerStarted","Data":"f3806524e37f0158c4549a01ef4b40bee262e01d7212fb09d2462202e7ff3171"} Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.318307 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" event={"ID":"4a477c26-da80-4f21-bd96-949fab49ac91","Type":"ContainerStarted","Data":"1df37dce8b09d22fab64c807f1c45ee401ab43dff88b17ef35b50e369d786cc7"} Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.322538 4947 generic.go:334] "Generic (PLEG): container finished" podID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerID="7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979" exitCode=0 Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.322609 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45245" event={"ID":"d2f058a7-63ac-45c7-bf91-d683d7010b48","Type":"ContainerDied","Data":"7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979"} Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.322644 4947 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-45245" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.322668 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-45245" event={"ID":"d2f058a7-63ac-45c7-bf91-d683d7010b48","Type":"ContainerDied","Data":"680459d6730842c14224865bd42b185d61637d81752b4ed2c345f9cb898357ea"} Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.322707 4947 scope.go:117] "RemoveContainer" containerID="7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.347045 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.347560 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2f058a7-63ac-45c7-bf91-d683d7010b48-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.347576 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ntq5\" (UniqueName: \"kubernetes.io/projected/d2f058a7-63ac-45c7-bf91-d683d7010b48-kube-api-access-6ntq5\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.352372 4947 scope.go:117] "RemoveContainer" containerID="018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.376321 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-j956d" podStartSLOduration=2.999491543 podStartE2EDuration="11.376296977s" podCreationTimestamp="2025-12-03 07:03:51 +0000 UTC" firstStartedPulling="2025-12-03 07:03:52.851732538 +0000 UTC m=+894.112686974" 
lastFinishedPulling="2025-12-03 07:04:01.228537982 +0000 UTC m=+902.489492408" observedRunningTime="2025-12-03 07:04:02.367945301 +0000 UTC m=+903.628899737" watchObservedRunningTime="2025-12-03 07:04:02.376296977 +0000 UTC m=+903.637251443" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.392509 4947 scope.go:117] "RemoveContainer" containerID="3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.403541 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-45245"] Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.408985 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-45245"] Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.412260 4947 scope.go:117] "RemoveContainer" containerID="7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979" Dec 03 07:04:02 crc kubenswrapper[4947]: E1203 07:04:02.412781 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979\": container with ID starting with 7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979 not found: ID does not exist" containerID="7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.412834 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979"} err="failed to get container status \"7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979\": rpc error: code = NotFound desc = could not find container \"7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979\": container with ID starting with 7738d74cbfe1756f18f99fd94b2e0904230d40dbea41045dcc5666ee3ba91979 not found: ID does not 
exist" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.412862 4947 scope.go:117] "RemoveContainer" containerID="018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6" Dec 03 07:04:02 crc kubenswrapper[4947]: E1203 07:04:02.413222 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6\": container with ID starting with 018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6 not found: ID does not exist" containerID="018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.413327 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6"} err="failed to get container status \"018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6\": rpc error: code = NotFound desc = could not find container \"018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6\": container with ID starting with 018394d2a85f56bfb8dd9bcc9ac93767d486f31ffb61017f1d44f9ec9056b8a6 not found: ID does not exist" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.413430 4947 scope.go:117] "RemoveContainer" containerID="3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f" Dec 03 07:04:02 crc kubenswrapper[4947]: E1203 07:04:02.413775 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f\": container with ID starting with 3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f not found: ID does not exist" containerID="3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f" Dec 03 07:04:02 crc kubenswrapper[4947]: I1203 07:04:02.413876 4947 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f"} err="failed to get container status \"3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f\": rpc error: code = NotFound desc = could not find container \"3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f\": container with ID starting with 3c89c9304eec52307490ea4e00bf638d0be3e118f87c5429ca650b76f26f619f not found: ID does not exist" Dec 03 07:04:03 crc kubenswrapper[4947]: I1203 07:04:03.098757 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2f058a7-63ac-45c7-bf91-d683d7010b48" path="/var/lib/kubelet/pods/d2f058a7-63ac-45c7-bf91-d683d7010b48/volumes" Dec 03 07:04:03 crc kubenswrapper[4947]: I1203 07:04:03.330346 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g76f7" event={"ID":"f350cbd5-9e7f-47da-be36-c0292934a34b","Type":"ContainerStarted","Data":"870be4922bb220de7f9715653065b5cdc8502c407dc411333db02a86cf1598e8"} Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.342271 4947 generic.go:334] "Generic (PLEG): container finished" podID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerID="870be4922bb220de7f9715653065b5cdc8502c407dc411333db02a86cf1598e8" exitCode=0 Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.342348 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g76f7" event={"ID":"f350cbd5-9e7f-47da-be36-c0292934a34b","Type":"ContainerDied","Data":"870be4922bb220de7f9715653065b5cdc8502c407dc411333db02a86cf1598e8"} Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.925833 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-zk967"] Dec 03 07:04:04 crc kubenswrapper[4947]: E1203 07:04:04.926044 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f058a7-63ac-45c7-bf91-d683d7010b48" 
containerName="extract-content" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.926055 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerName="extract-content" Dec 03 07:04:04 crc kubenswrapper[4947]: E1203 07:04:04.926063 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerName="registry-server" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.926070 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerName="registry-server" Dec 03 07:04:04 crc kubenswrapper[4947]: E1203 07:04:04.926095 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerName="extract-utilities" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.926101 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerName="extract-utilities" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.926196 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2f058a7-63ac-45c7-bf91-d683d7010b48" containerName="registry-server" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.926549 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.928327 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.928781 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.929596 4947 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-67nl9" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.956218 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-zk967"] Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.982926 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c04fa588-2ca2-4f54-abe9-167468be5bab-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-zk967\" (UID: \"c04fa588-2ca2-4f54-abe9-167468be5bab\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:04 crc kubenswrapper[4947]: I1203 07:04:04.982996 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxx5t\" (UniqueName: \"kubernetes.io/projected/c04fa588-2ca2-4f54-abe9-167468be5bab-kube-api-access-sxx5t\") pod \"cert-manager-webhook-f4fb5df64-zk967\" (UID: \"c04fa588-2ca2-4f54-abe9-167468be5bab\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:05 crc kubenswrapper[4947]: I1203 07:04:05.084342 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c04fa588-2ca2-4f54-abe9-167468be5bab-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-zk967\" (UID: \"c04fa588-2ca2-4f54-abe9-167468be5bab\") 
" pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:05 crc kubenswrapper[4947]: I1203 07:04:05.084403 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxx5t\" (UniqueName: \"kubernetes.io/projected/c04fa588-2ca2-4f54-abe9-167468be5bab-kube-api-access-sxx5t\") pod \"cert-manager-webhook-f4fb5df64-zk967\" (UID: \"c04fa588-2ca2-4f54-abe9-167468be5bab\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:05 crc kubenswrapper[4947]: I1203 07:04:05.110838 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c04fa588-2ca2-4f54-abe9-167468be5bab-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-zk967\" (UID: \"c04fa588-2ca2-4f54-abe9-167468be5bab\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:05 crc kubenswrapper[4947]: I1203 07:04:05.115538 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxx5t\" (UniqueName: \"kubernetes.io/projected/c04fa588-2ca2-4f54-abe9-167468be5bab-kube-api-access-sxx5t\") pod \"cert-manager-webhook-f4fb5df64-zk967\" (UID: \"c04fa588-2ca2-4f54-abe9-167468be5bab\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:05 crc kubenswrapper[4947]: I1203 07:04:05.242168 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:05 crc kubenswrapper[4947]: I1203 07:04:05.352854 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g76f7" event={"ID":"f350cbd5-9e7f-47da-be36-c0292934a34b","Type":"ContainerStarted","Data":"6cf0fea4e7048215e016c036740fa29a14a9b6d6cfc06eec0ce1b380ab7e02fe"} Dec 03 07:04:05 crc kubenswrapper[4947]: I1203 07:04:05.370400 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g76f7" podStartSLOduration=4.875776739 podStartE2EDuration="7.370385603s" podCreationTimestamp="2025-12-03 07:03:58 +0000 UTC" firstStartedPulling="2025-12-03 07:04:02.312929059 +0000 UTC m=+903.573883485" lastFinishedPulling="2025-12-03 07:04:04.807537923 +0000 UTC m=+906.068492349" observedRunningTime="2025-12-03 07:04:05.369239713 +0000 UTC m=+906.630194139" watchObservedRunningTime="2025-12-03 07:04:05.370385603 +0000 UTC m=+906.631340029" Dec 03 07:04:05 crc kubenswrapper[4947]: I1203 07:04:05.713966 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-zk967"] Dec 03 07:04:05 crc kubenswrapper[4947]: W1203 07:04:05.726442 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc04fa588_2ca2_4f54_abe9_167468be5bab.slice/crio-abb66ad431419bc15d6e5b3b7b0f3530dc9a20d51a0bf2db07b0ffc4935c1562 WatchSource:0}: Error finding container abb66ad431419bc15d6e5b3b7b0f3530dc9a20d51a0bf2db07b0ffc4935c1562: Status 404 returned error can't find the container with id abb66ad431419bc15d6e5b3b7b0f3530dc9a20d51a0bf2db07b0ffc4935c1562 Dec 03 07:04:06 crc kubenswrapper[4947]: I1203 07:04:06.360465 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" 
event={"ID":"c04fa588-2ca2-4f54-abe9-167468be5bab","Type":"ContainerStarted","Data":"abb66ad431419bc15d6e5b3b7b0f3530dc9a20d51a0bf2db07b0ffc4935c1562"} Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.283245 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-p55bx"] Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.284236 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.286542 4947 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dj244" Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.294837 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-p55bx"] Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.424177 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpck8\" (UniqueName: \"kubernetes.io/projected/4dcf4bb6-050a-4327-bff0-d1e10bf87782-kube-api-access-vpck8\") pod \"cert-manager-cainjector-855d9ccff4-p55bx\" (UID: \"4dcf4bb6-050a-4327-bff0-d1e10bf87782\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.424246 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4dcf4bb6-050a-4327-bff0-d1e10bf87782-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-p55bx\" (UID: \"4dcf4bb6-050a-4327-bff0-d1e10bf87782\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.525480 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpck8\" (UniqueName: 
\"kubernetes.io/projected/4dcf4bb6-050a-4327-bff0-d1e10bf87782-kube-api-access-vpck8\") pod \"cert-manager-cainjector-855d9ccff4-p55bx\" (UID: \"4dcf4bb6-050a-4327-bff0-d1e10bf87782\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.525914 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4dcf4bb6-050a-4327-bff0-d1e10bf87782-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-p55bx\" (UID: \"4dcf4bb6-050a-4327-bff0-d1e10bf87782\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.549928 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpck8\" (UniqueName: \"kubernetes.io/projected/4dcf4bb6-050a-4327-bff0-d1e10bf87782-kube-api-access-vpck8\") pod \"cert-manager-cainjector-855d9ccff4-p55bx\" (UID: \"4dcf4bb6-050a-4327-bff0-d1e10bf87782\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.552127 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4dcf4bb6-050a-4327-bff0-d1e10bf87782-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-p55bx\" (UID: \"4dcf4bb6-050a-4327-bff0-d1e10bf87782\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.621428 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" Dec 03 07:04:07 crc kubenswrapper[4947]: I1203 07:04:07.868586 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-p55bx"] Dec 03 07:04:07 crc kubenswrapper[4947]: W1203 07:04:07.874040 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4dcf4bb6_050a_4327_bff0_d1e10bf87782.slice/crio-12dc57fda0588e5f324f6b6beeef0ad2ec05270e39906091ec24f91535dffdb9 WatchSource:0}: Error finding container 12dc57fda0588e5f324f6b6beeef0ad2ec05270e39906091ec24f91535dffdb9: Status 404 returned error can't find the container with id 12dc57fda0588e5f324f6b6beeef0ad2ec05270e39906091ec24f91535dffdb9 Dec 03 07:04:08 crc kubenswrapper[4947]: I1203 07:04:08.372067 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" event={"ID":"4dcf4bb6-050a-4327-bff0-d1e10bf87782","Type":"ContainerStarted","Data":"12dc57fda0588e5f324f6b6beeef0ad2ec05270e39906091ec24f91535dffdb9"} Dec 03 07:04:08 crc kubenswrapper[4947]: I1203 07:04:08.460594 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:04:08 crc kubenswrapper[4947]: I1203 07:04:08.461814 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:04:08 crc kubenswrapper[4947]: I1203 07:04:08.519428 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:04:09 crc kubenswrapper[4947]: I1203 07:04:09.424635 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:04:09 crc kubenswrapper[4947]: I1203 07:04:09.928221 4947 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-g76f7"] Dec 03 07:04:11 crc kubenswrapper[4947]: I1203 07:04:11.391355 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g76f7" podUID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerName="registry-server" containerID="cri-o://6cf0fea4e7048215e016c036740fa29a14a9b6d6cfc06eec0ce1b380ab7e02fe" gracePeriod=2 Dec 03 07:04:12 crc kubenswrapper[4947]: I1203 07:04:12.403219 4947 generic.go:334] "Generic (PLEG): container finished" podID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerID="6cf0fea4e7048215e016c036740fa29a14a9b6d6cfc06eec0ce1b380ab7e02fe" exitCode=0 Dec 03 07:04:12 crc kubenswrapper[4947]: I1203 07:04:12.403269 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g76f7" event={"ID":"f350cbd5-9e7f-47da-be36-c0292934a34b","Type":"ContainerDied","Data":"6cf0fea4e7048215e016c036740fa29a14a9b6d6cfc06eec0ce1b380ab7e02fe"} Dec 03 07:04:15 crc kubenswrapper[4947]: I1203 07:04:15.912944 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:04:15 crc kubenswrapper[4947]: I1203 07:04:15.931208 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-utilities\") pod \"f350cbd5-9e7f-47da-be36-c0292934a34b\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " Dec 03 07:04:15 crc kubenswrapper[4947]: I1203 07:04:15.931253 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzr5v\" (UniqueName: \"kubernetes.io/projected/f350cbd5-9e7f-47da-be36-c0292934a34b-kube-api-access-fzr5v\") pod \"f350cbd5-9e7f-47da-be36-c0292934a34b\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " Dec 03 07:04:15 crc kubenswrapper[4947]: I1203 07:04:15.931361 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-catalog-content\") pod \"f350cbd5-9e7f-47da-be36-c0292934a34b\" (UID: \"f350cbd5-9e7f-47da-be36-c0292934a34b\") " Dec 03 07:04:15 crc kubenswrapper[4947]: I1203 07:04:15.935613 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-utilities" (OuterVolumeSpecName: "utilities") pod "f350cbd5-9e7f-47da-be36-c0292934a34b" (UID: "f350cbd5-9e7f-47da-be36-c0292934a34b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:15 crc kubenswrapper[4947]: I1203 07:04:15.978620 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f350cbd5-9e7f-47da-be36-c0292934a34b-kube-api-access-fzr5v" (OuterVolumeSpecName: "kube-api-access-fzr5v") pod "f350cbd5-9e7f-47da-be36-c0292934a34b" (UID: "f350cbd5-9e7f-47da-be36-c0292934a34b"). InnerVolumeSpecName "kube-api-access-fzr5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:15 crc kubenswrapper[4947]: I1203 07:04:15.989880 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f350cbd5-9e7f-47da-be36-c0292934a34b" (UID: "f350cbd5-9e7f-47da-be36-c0292934a34b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.033658 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.033728 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzr5v\" (UniqueName: \"kubernetes.io/projected/f350cbd5-9e7f-47da-be36-c0292934a34b-kube-api-access-fzr5v\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.033747 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f350cbd5-9e7f-47da-be36-c0292934a34b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.429687 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" event={"ID":"4dcf4bb6-050a-4327-bff0-d1e10bf87782","Type":"ContainerStarted","Data":"a7a1fe568c28c767c091f2268448f4ecc4fead05d7b8c85917ea3f22cf2c6597"} Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.432626 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g76f7" event={"ID":"f350cbd5-9e7f-47da-be36-c0292934a34b","Type":"ContainerDied","Data":"f3806524e37f0158c4549a01ef4b40bee262e01d7212fb09d2462202e7ff3171"} Dec 03 07:04:16 crc 
kubenswrapper[4947]: I1203 07:04:16.432675 4947 scope.go:117] "RemoveContainer" containerID="6cf0fea4e7048215e016c036740fa29a14a9b6d6cfc06eec0ce1b380ab7e02fe" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.432634 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g76f7" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.434193 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" event={"ID":"c04fa588-2ca2-4f54-abe9-167468be5bab","Type":"ContainerStarted","Data":"fd597bc3b6bfd03d6b36d6c7f9f04e9085d6c0d5849b5406454f21937d14cf90"} Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.434336 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.447223 4947 scope.go:117] "RemoveContainer" containerID="870be4922bb220de7f9715653065b5cdc8502c407dc411333db02a86cf1598e8" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.455007 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-p55bx" podStartSLOduration=1.271188626 podStartE2EDuration="9.454975079s" podCreationTimestamp="2025-12-03 07:04:07 +0000 UTC" firstStartedPulling="2025-12-03 07:04:07.876306923 +0000 UTC m=+909.137261349" lastFinishedPulling="2025-12-03 07:04:16.060093376 +0000 UTC m=+917.321047802" observedRunningTime="2025-12-03 07:04:16.452081191 +0000 UTC m=+917.713035657" watchObservedRunningTime="2025-12-03 07:04:16.454975079 +0000 UTC m=+917.715929545" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.482371 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" podStartSLOduration=2.149469473 podStartE2EDuration="12.482352677s" podCreationTimestamp="2025-12-03 07:04:04 +0000 
UTC" firstStartedPulling="2025-12-03 07:04:05.729870773 +0000 UTC m=+906.990825199" lastFinishedPulling="2025-12-03 07:04:16.062753987 +0000 UTC m=+917.323708403" observedRunningTime="2025-12-03 07:04:16.478059181 +0000 UTC m=+917.739013607" watchObservedRunningTime="2025-12-03 07:04:16.482352677 +0000 UTC m=+917.743307103" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.491690 4947 scope.go:117] "RemoveContainer" containerID="49305d4b6af455ae0a9d12d69673b1e7f6d6bfabc5ce73be011b0fe905d82b71" Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.492421 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g76f7"] Dec 03 07:04:16 crc kubenswrapper[4947]: I1203 07:04:16.499557 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g76f7"] Dec 03 07:04:17 crc kubenswrapper[4947]: I1203 07:04:17.091076 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f350cbd5-9e7f-47da-be36-c0292934a34b" path="/var/lib/kubelet/pods/f350cbd5-9e7f-47da-be36-c0292934a34b/volumes" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.181538 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-grghg"] Dec 03 07:04:24 crc kubenswrapper[4947]: E1203 07:04:24.182411 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerName="extract-content" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.182428 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerName="extract-content" Dec 03 07:04:24 crc kubenswrapper[4947]: E1203 07:04:24.182447 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerName="registry-server" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.182457 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerName="registry-server" Dec 03 07:04:24 crc kubenswrapper[4947]: E1203 07:04:24.182471 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerName="extract-utilities" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.182479 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerName="extract-utilities" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.182679 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f350cbd5-9e7f-47da-be36-c0292934a34b" containerName="registry-server" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.183190 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-grghg" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.187651 4947 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-j728s" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.202352 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-grghg"] Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.254044 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb84f\" (UniqueName: \"kubernetes.io/projected/b3404cf1-3794-4aeb-badb-55bede44e49e-kube-api-access-zb84f\") pod \"cert-manager-86cb77c54b-grghg\" (UID: \"b3404cf1-3794-4aeb-badb-55bede44e49e\") " pod="cert-manager/cert-manager-86cb77c54b-grghg" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.254117 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3404cf1-3794-4aeb-badb-55bede44e49e-bound-sa-token\") pod \"cert-manager-86cb77c54b-grghg\" (UID: 
\"b3404cf1-3794-4aeb-badb-55bede44e49e\") " pod="cert-manager/cert-manager-86cb77c54b-grghg" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.354950 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb84f\" (UniqueName: \"kubernetes.io/projected/b3404cf1-3794-4aeb-badb-55bede44e49e-kube-api-access-zb84f\") pod \"cert-manager-86cb77c54b-grghg\" (UID: \"b3404cf1-3794-4aeb-badb-55bede44e49e\") " pod="cert-manager/cert-manager-86cb77c54b-grghg" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.355023 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3404cf1-3794-4aeb-badb-55bede44e49e-bound-sa-token\") pod \"cert-manager-86cb77c54b-grghg\" (UID: \"b3404cf1-3794-4aeb-badb-55bede44e49e\") " pod="cert-manager/cert-manager-86cb77c54b-grghg" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.389899 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb84f\" (UniqueName: \"kubernetes.io/projected/b3404cf1-3794-4aeb-badb-55bede44e49e-kube-api-access-zb84f\") pod \"cert-manager-86cb77c54b-grghg\" (UID: \"b3404cf1-3794-4aeb-badb-55bede44e49e\") " pod="cert-manager/cert-manager-86cb77c54b-grghg" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.394712 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3404cf1-3794-4aeb-badb-55bede44e49e-bound-sa-token\") pod \"cert-manager-86cb77c54b-grghg\" (UID: \"b3404cf1-3794-4aeb-badb-55bede44e49e\") " pod="cert-manager/cert-manager-86cb77c54b-grghg" Dec 03 07:04:24 crc kubenswrapper[4947]: I1203 07:04:24.503191 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-grghg" Dec 03 07:04:25 crc kubenswrapper[4947]: I1203 07:04:25.067266 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-grghg"] Dec 03 07:04:25 crc kubenswrapper[4947]: I1203 07:04:25.247320 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-zk967" Dec 03 07:04:25 crc kubenswrapper[4947]: I1203 07:04:25.506989 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-grghg" event={"ID":"b3404cf1-3794-4aeb-badb-55bede44e49e","Type":"ContainerStarted","Data":"d3b28d6738becd7c7129f03b9521beef1ebaf453444361705dac88a2eeee0927"} Dec 03 07:04:25 crc kubenswrapper[4947]: I1203 07:04:25.507071 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-grghg" event={"ID":"b3404cf1-3794-4aeb-badb-55bede44e49e","Type":"ContainerStarted","Data":"e633ce1972dcbf9167c3e173f7a29aa006d6c468c3766adeea91a052032e665e"} Dec 03 07:04:25 crc kubenswrapper[4947]: I1203 07:04:25.527577 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-grghg" podStartSLOduration=1.5275542149999999 podStartE2EDuration="1.527554215s" podCreationTimestamp="2025-12-03 07:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:04:25.521589734 +0000 UTC m=+926.782544220" watchObservedRunningTime="2025-12-03 07:04:25.527554215 +0000 UTC m=+926.788508681" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.715648 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hgvgl"] Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.718034 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.734444 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgvgl"] Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.819030 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fqbb\" (UniqueName: \"kubernetes.io/projected/8eaecc6a-5644-4911-9a51-a07a1652c266-kube-api-access-2fqbb\") pod \"redhat-marketplace-hgvgl\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.819399 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-catalog-content\") pod \"redhat-marketplace-hgvgl\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.819563 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-utilities\") pod \"redhat-marketplace-hgvgl\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.920541 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fqbb\" (UniqueName: \"kubernetes.io/projected/8eaecc6a-5644-4911-9a51-a07a1652c266-kube-api-access-2fqbb\") pod \"redhat-marketplace-hgvgl\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.920609 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-catalog-content\") pod \"redhat-marketplace-hgvgl\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.920630 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-utilities\") pod \"redhat-marketplace-hgvgl\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.921059 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-utilities\") pod \"redhat-marketplace-hgvgl\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.921326 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-catalog-content\") pod \"redhat-marketplace-hgvgl\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:27 crc kubenswrapper[4947]: I1203 07:04:27.938347 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fqbb\" (UniqueName: \"kubernetes.io/projected/8eaecc6a-5644-4911-9a51-a07a1652c266-kube-api-access-2fqbb\") pod \"redhat-marketplace-hgvgl\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:28 crc kubenswrapper[4947]: I1203 07:04:28.049256 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:28 crc kubenswrapper[4947]: I1203 07:04:28.540859 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgvgl"] Dec 03 07:04:28 crc kubenswrapper[4947]: W1203 07:04:28.546764 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eaecc6a_5644_4911_9a51_a07a1652c266.slice/crio-2a345f0b7534e23910f2b7215cc2b961268bc6a769c237254b75258c3693cb7d WatchSource:0}: Error finding container 2a345f0b7534e23910f2b7215cc2b961268bc6a769c237254b75258c3693cb7d: Status 404 returned error can't find the container with id 2a345f0b7534e23910f2b7215cc2b961268bc6a769c237254b75258c3693cb7d Dec 03 07:04:29 crc kubenswrapper[4947]: I1203 07:04:29.534007 4947 generic.go:334] "Generic (PLEG): container finished" podID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerID="128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0" exitCode=0 Dec 03 07:04:29 crc kubenswrapper[4947]: I1203 07:04:29.534057 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgvgl" event={"ID":"8eaecc6a-5644-4911-9a51-a07a1652c266","Type":"ContainerDied","Data":"128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0"} Dec 03 07:04:29 crc kubenswrapper[4947]: I1203 07:04:29.534085 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgvgl" event={"ID":"8eaecc6a-5644-4911-9a51-a07a1652c266","Type":"ContainerStarted","Data":"2a345f0b7534e23910f2b7215cc2b961268bc6a769c237254b75258c3693cb7d"} Dec 03 07:04:30 crc kubenswrapper[4947]: I1203 07:04:30.086127 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 03 07:04:30 crc kubenswrapper[4947]: I1203 07:04:30.086537 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:04:30 crc kubenswrapper[4947]: I1203 07:04:30.086617 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:04:30 crc kubenswrapper[4947]: I1203 07:04:30.087426 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b942c1f8be6b00a1ec761600a3a1ec1dc91971dc8d9f088d0a1649ce3f26d690"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:04:30 crc kubenswrapper[4947]: I1203 07:04:30.087624 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://b942c1f8be6b00a1ec761600a3a1ec1dc91971dc8d9f088d0a1649ce3f26d690" gracePeriod=600 Dec 03 07:04:30 crc kubenswrapper[4947]: I1203 07:04:30.551561 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="b942c1f8be6b00a1ec761600a3a1ec1dc91971dc8d9f088d0a1649ce3f26d690" exitCode=0 Dec 03 07:04:30 crc kubenswrapper[4947]: I1203 07:04:30.551606 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"b942c1f8be6b00a1ec761600a3a1ec1dc91971dc8d9f088d0a1649ce3f26d690"} Dec 03 07:04:30 crc kubenswrapper[4947]: I1203 07:04:30.551870 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"766a670181f7035ab6014723203ba9e8cefa098d94e396af08b21929329a9713"} Dec 03 07:04:30 crc kubenswrapper[4947]: I1203 07:04:30.551891 4947 scope.go:117] "RemoveContainer" containerID="5422fc464d66825e41f13e058e49aa601ef8b656e45633815d2d71ca8056f807" Dec 03 07:04:31 crc kubenswrapper[4947]: I1203 07:04:31.562912 4947 generic.go:334] "Generic (PLEG): container finished" podID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerID="aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d" exitCode=0 Dec 03 07:04:31 crc kubenswrapper[4947]: I1203 07:04:31.562988 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgvgl" event={"ID":"8eaecc6a-5644-4911-9a51-a07a1652c266","Type":"ContainerDied","Data":"aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d"} Dec 03 07:04:32 crc kubenswrapper[4947]: I1203 07:04:32.574430 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgvgl" event={"ID":"8eaecc6a-5644-4911-9a51-a07a1652c266","Type":"ContainerStarted","Data":"09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743"} Dec 03 07:04:32 crc kubenswrapper[4947]: I1203 07:04:32.599837 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hgvgl" podStartSLOduration=2.834554618 podStartE2EDuration="5.599823208s" podCreationTimestamp="2025-12-03 07:04:27 +0000 UTC" firstStartedPulling="2025-12-03 07:04:29.535699093 +0000 UTC m=+930.796653519" lastFinishedPulling="2025-12-03 
07:04:32.300967683 +0000 UTC m=+933.561922109" observedRunningTime="2025-12-03 07:04:32.59917687 +0000 UTC m=+933.860131366" watchObservedRunningTime="2025-12-03 07:04:32.599823208 +0000 UTC m=+933.860777634" Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.501664 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dsc9b"] Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.502440 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dsc9b" Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.505751 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.506003 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.506163 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-jdw2b" Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.518998 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dsc9b"] Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.592243 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rstq9\" (UniqueName: \"kubernetes.io/projected/58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6-kube-api-access-rstq9\") pod \"openstack-operator-index-dsc9b\" (UID: \"58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6\") " pod="openstack-operators/openstack-operator-index-dsc9b" Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.693182 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rstq9\" (UniqueName: 
\"kubernetes.io/projected/58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6-kube-api-access-rstq9\") pod \"openstack-operator-index-dsc9b\" (UID: \"58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6\") " pod="openstack-operators/openstack-operator-index-dsc9b" Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.711750 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rstq9\" (UniqueName: \"kubernetes.io/projected/58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6-kube-api-access-rstq9\") pod \"openstack-operator-index-dsc9b\" (UID: \"58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6\") " pod="openstack-operators/openstack-operator-index-dsc9b" Dec 03 07:04:33 crc kubenswrapper[4947]: I1203 07:04:33.824101 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dsc9b" Dec 03 07:04:34 crc kubenswrapper[4947]: I1203 07:04:34.159916 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dsc9b"] Dec 03 07:04:34 crc kubenswrapper[4947]: I1203 07:04:34.587070 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsc9b" event={"ID":"58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6","Type":"ContainerStarted","Data":"1e227e591329649c77fe6309a983cde7e113bbad9812d11c112d8a9c91284b99"} Dec 03 07:04:35 crc kubenswrapper[4947]: I1203 07:04:35.595942 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsc9b" event={"ID":"58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6","Type":"ContainerStarted","Data":"6d1ae273deab69573819e3463a75f7b48e22099a3792e81f7ee39408a1c57423"} Dec 03 07:04:35 crc kubenswrapper[4947]: I1203 07:04:35.617331 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dsc9b" podStartSLOduration=1.790724826 podStartE2EDuration="2.617316005s" podCreationTimestamp="2025-12-03 07:04:33 +0000 UTC" 
firstStartedPulling="2025-12-03 07:04:34.175570327 +0000 UTC m=+935.436524763" lastFinishedPulling="2025-12-03 07:04:35.002161516 +0000 UTC m=+936.263115942" observedRunningTime="2025-12-03 07:04:35.615787304 +0000 UTC m=+936.876741730" watchObservedRunningTime="2025-12-03 07:04:35.617316005 +0000 UTC m=+936.878270431" Dec 03 07:04:38 crc kubenswrapper[4947]: I1203 07:04:38.050134 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:38 crc kubenswrapper[4947]: I1203 07:04:38.050586 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:38 crc kubenswrapper[4947]: I1203 07:04:38.118184 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:38 crc kubenswrapper[4947]: I1203 07:04:38.499146 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dsc9b"] Dec 03 07:04:38 crc kubenswrapper[4947]: I1203 07:04:38.499825 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dsc9b" podUID="58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6" containerName="registry-server" containerID="cri-o://6d1ae273deab69573819e3463a75f7b48e22099a3792e81f7ee39408a1c57423" gracePeriod=2 Dec 03 07:04:38 crc kubenswrapper[4947]: I1203 07:04:38.657702 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.107189 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cxbp6"] Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.109736 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cxbp6" Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.122130 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cxbp6"] Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.269193 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf996\" (UniqueName: \"kubernetes.io/projected/8309e8b6-d88f-4bba-bdcf-d52ab10570aa-kube-api-access-cf996\") pod \"openstack-operator-index-cxbp6\" (UID: \"8309e8b6-d88f-4bba-bdcf-d52ab10570aa\") " pod="openstack-operators/openstack-operator-index-cxbp6" Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.370934 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf996\" (UniqueName: \"kubernetes.io/projected/8309e8b6-d88f-4bba-bdcf-d52ab10570aa-kube-api-access-cf996\") pod \"openstack-operator-index-cxbp6\" (UID: \"8309e8b6-d88f-4bba-bdcf-d52ab10570aa\") " pod="openstack-operators/openstack-operator-index-cxbp6" Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.391161 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf996\" (UniqueName: \"kubernetes.io/projected/8309e8b6-d88f-4bba-bdcf-d52ab10570aa-kube-api-access-cf996\") pod \"openstack-operator-index-cxbp6\" (UID: \"8309e8b6-d88f-4bba-bdcf-d52ab10570aa\") " pod="openstack-operators/openstack-operator-index-cxbp6" Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.443547 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-cxbp6" Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.630627 4947 generic.go:334] "Generic (PLEG): container finished" podID="58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6" containerID="6d1ae273deab69573819e3463a75f7b48e22099a3792e81f7ee39408a1c57423" exitCode=0 Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.630713 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsc9b" event={"ID":"58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6","Type":"ContainerDied","Data":"6d1ae273deab69573819e3463a75f7b48e22099a3792e81f7ee39408a1c57423"} Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.949018 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dsc9b" Dec 03 07:04:39 crc kubenswrapper[4947]: I1203 07:04:39.949050 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cxbp6"] Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.081929 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rstq9\" (UniqueName: \"kubernetes.io/projected/58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6-kube-api-access-rstq9\") pod \"58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6\" (UID: \"58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6\") " Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.086317 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6-kube-api-access-rstq9" (OuterVolumeSpecName: "kube-api-access-rstq9") pod "58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6" (UID: "58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6"). InnerVolumeSpecName "kube-api-access-rstq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.183730 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rstq9\" (UniqueName: \"kubernetes.io/projected/58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6-kube-api-access-rstq9\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.638739 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dsc9b" event={"ID":"58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6","Type":"ContainerDied","Data":"1e227e591329649c77fe6309a983cde7e113bbad9812d11c112d8a9c91284b99"} Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.639044 4947 scope.go:117] "RemoveContainer" containerID="6d1ae273deab69573819e3463a75f7b48e22099a3792e81f7ee39408a1c57423" Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.639202 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dsc9b" Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.640272 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cxbp6" event={"ID":"8309e8b6-d88f-4bba-bdcf-d52ab10570aa","Type":"ContainerStarted","Data":"d7db1e28cce2c6fde384ce6daf0e08d55a4e3e5c47b47a7e3e5334bc19b3729d"} Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.640293 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cxbp6" event={"ID":"8309e8b6-d88f-4bba-bdcf-d52ab10570aa","Type":"ContainerStarted","Data":"7d5210c7c4aa867f1bc152c7933f4edf280188b7206a6990b08c29213708ece8"} Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.658531 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cxbp6" podStartSLOduration=1.1972497340000001 podStartE2EDuration="1.658475306s" podCreationTimestamp="2025-12-03 
07:04:39 +0000 UTC" firstStartedPulling="2025-12-03 07:04:39.966523635 +0000 UTC m=+941.227478061" lastFinishedPulling="2025-12-03 07:04:40.427749207 +0000 UTC m=+941.688703633" observedRunningTime="2025-12-03 07:04:40.653701056 +0000 UTC m=+941.914655502" watchObservedRunningTime="2025-12-03 07:04:40.658475306 +0000 UTC m=+941.919429732" Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.678575 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dsc9b"] Dec 03 07:04:40 crc kubenswrapper[4947]: I1203 07:04:40.682133 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dsc9b"] Dec 03 07:04:41 crc kubenswrapper[4947]: I1203 07:04:41.096648 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6" path="/var/lib/kubelet/pods/58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6/volumes" Dec 03 07:04:41 crc kubenswrapper[4947]: I1203 07:04:41.894821 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgvgl"] Dec 03 07:04:41 crc kubenswrapper[4947]: I1203 07:04:41.895058 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hgvgl" podUID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerName="registry-server" containerID="cri-o://09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743" gracePeriod=2 Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.369456 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.511692 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-catalog-content\") pod \"8eaecc6a-5644-4911-9a51-a07a1652c266\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.511732 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-utilities\") pod \"8eaecc6a-5644-4911-9a51-a07a1652c266\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.511755 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fqbb\" (UniqueName: \"kubernetes.io/projected/8eaecc6a-5644-4911-9a51-a07a1652c266-kube-api-access-2fqbb\") pod \"8eaecc6a-5644-4911-9a51-a07a1652c266\" (UID: \"8eaecc6a-5644-4911-9a51-a07a1652c266\") " Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.512750 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-utilities" (OuterVolumeSpecName: "utilities") pod "8eaecc6a-5644-4911-9a51-a07a1652c266" (UID: "8eaecc6a-5644-4911-9a51-a07a1652c266"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.516554 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eaecc6a-5644-4911-9a51-a07a1652c266-kube-api-access-2fqbb" (OuterVolumeSpecName: "kube-api-access-2fqbb") pod "8eaecc6a-5644-4911-9a51-a07a1652c266" (UID: "8eaecc6a-5644-4911-9a51-a07a1652c266"). InnerVolumeSpecName "kube-api-access-2fqbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.529691 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8eaecc6a-5644-4911-9a51-a07a1652c266" (UID: "8eaecc6a-5644-4911-9a51-a07a1652c266"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.613729 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.613772 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eaecc6a-5644-4911-9a51-a07a1652c266-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.613785 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fqbb\" (UniqueName: \"kubernetes.io/projected/8eaecc6a-5644-4911-9a51-a07a1652c266-kube-api-access-2fqbb\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.656446 4947 generic.go:334] "Generic (PLEG): container finished" podID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerID="09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743" exitCode=0 Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.656523 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hgvgl" event={"ID":"8eaecc6a-5644-4911-9a51-a07a1652c266","Type":"ContainerDied","Data":"09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743"} Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.656557 4947 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-hgvgl" event={"ID":"8eaecc6a-5644-4911-9a51-a07a1652c266","Type":"ContainerDied","Data":"2a345f0b7534e23910f2b7215cc2b961268bc6a769c237254b75258c3693cb7d"} Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.656581 4947 scope.go:117] "RemoveContainer" containerID="09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.656526 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hgvgl" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.673579 4947 scope.go:117] "RemoveContainer" containerID="aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.690623 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgvgl"] Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.693340 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hgvgl"] Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.715414 4947 scope.go:117] "RemoveContainer" containerID="128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.734335 4947 scope.go:117] "RemoveContainer" containerID="09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743" Dec 03 07:04:42 crc kubenswrapper[4947]: E1203 07:04:42.734754 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743\": container with ID starting with 09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743 not found: ID does not exist" containerID="09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.734803 4947 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743"} err="failed to get container status \"09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743\": rpc error: code = NotFound desc = could not find container \"09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743\": container with ID starting with 09c9d9ab8fe7ae89ff887086d6f28d421237ff7768c2211b71290f1e8c3a5743 not found: ID does not exist" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.734834 4947 scope.go:117] "RemoveContainer" containerID="aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d" Dec 03 07:04:42 crc kubenswrapper[4947]: E1203 07:04:42.735096 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d\": container with ID starting with aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d not found: ID does not exist" containerID="aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.735125 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d"} err="failed to get container status \"aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d\": rpc error: code = NotFound desc = could not find container \"aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d\": container with ID starting with aab777d85b167aa47643130df70723aae373af837daef3e0940d430d702e3b9d not found: ID does not exist" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.735142 4947 scope.go:117] "RemoveContainer" containerID="128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0" Dec 03 07:04:42 crc kubenswrapper[4947]: E1203 
07:04:42.735438 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0\": container with ID starting with 128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0 not found: ID does not exist" containerID="128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0" Dec 03 07:04:42 crc kubenswrapper[4947]: I1203 07:04:42.735478 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0"} err="failed to get container status \"128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0\": rpc error: code = NotFound desc = could not find container \"128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0\": container with ID starting with 128ee84ee16b9dd04f326aa5c9b68f120b1d50bbf667cce3e405dbf58ac8cef0 not found: ID does not exist" Dec 03 07:04:43 crc kubenswrapper[4947]: I1203 07:04:43.090952 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eaecc6a-5644-4911-9a51-a07a1652c266" path="/var/lib/kubelet/pods/8eaecc6a-5644-4911-9a51-a07a1652c266/volumes" Dec 03 07:04:49 crc kubenswrapper[4947]: I1203 07:04:49.444036 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-cxbp6" Dec 03 07:04:49 crc kubenswrapper[4947]: I1203 07:04:49.445533 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cxbp6" Dec 03 07:04:49 crc kubenswrapper[4947]: I1203 07:04:49.502293 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-cxbp6" Dec 03 07:04:49 crc kubenswrapper[4947]: I1203 07:04:49.740457 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-index-cxbp6" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.143048 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz"] Dec 03 07:04:52 crc kubenswrapper[4947]: E1203 07:04:52.143735 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerName="extract-content" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.143758 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerName="extract-content" Dec 03 07:04:52 crc kubenswrapper[4947]: E1203 07:04:52.143777 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerName="extract-utilities" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.143789 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerName="extract-utilities" Dec 03 07:04:52 crc kubenswrapper[4947]: E1203 07:04:52.143803 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6" containerName="registry-server" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.143814 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6" containerName="registry-server" Dec 03 07:04:52 crc kubenswrapper[4947]: E1203 07:04:52.143826 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerName="registry-server" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.143836 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerName="registry-server" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.144024 4947 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="58ec37ed-05a0-4fdf-9482-c1bc4a1ce7b6" containerName="registry-server" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.144044 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eaecc6a-5644-4911-9a51-a07a1652c266" containerName="registry-server" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.145303 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.148057 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-589ck" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.155345 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz"] Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.161410 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.161508 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.161552 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-smjs8\" (UniqueName: \"kubernetes.io/projected/b7144393-2451-4a57-8755-5d18064bec1a-kube-api-access-smjs8\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.262680 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-bundle\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.262782 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.262812 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smjs8\" (UniqueName: \"kubernetes.io/projected/b7144393-2451-4a57-8755-5d18064bec1a-kube-api-access-smjs8\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.263302 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-bundle\") pod 
\"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.263401 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-util\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.288584 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smjs8\" (UniqueName: \"kubernetes.io/projected/b7144393-2451-4a57-8755-5d18064bec1a-kube-api-access-smjs8\") pod \"98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.469946 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:52 crc kubenswrapper[4947]: I1203 07:04:52.971507 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz"] Dec 03 07:04:53 crc kubenswrapper[4947]: I1203 07:04:53.736613 4947 generic.go:334] "Generic (PLEG): container finished" podID="b7144393-2451-4a57-8755-5d18064bec1a" containerID="b94c930fe2a0b8a3a71c255f05815baa1b0b8febe933a7e0dd0286b79aa7e434" exitCode=0 Dec 03 07:04:53 crc kubenswrapper[4947]: I1203 07:04:53.736756 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" event={"ID":"b7144393-2451-4a57-8755-5d18064bec1a","Type":"ContainerDied","Data":"b94c930fe2a0b8a3a71c255f05815baa1b0b8febe933a7e0dd0286b79aa7e434"} Dec 03 07:04:53 crc kubenswrapper[4947]: I1203 07:04:53.737110 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" event={"ID":"b7144393-2451-4a57-8755-5d18064bec1a","Type":"ContainerStarted","Data":"b59a88696ecdda37779cc155756c5c615515f283946de0d21d3d2fd2bb4c92e6"} Dec 03 07:04:54 crc kubenswrapper[4947]: I1203 07:04:54.748461 4947 generic.go:334] "Generic (PLEG): container finished" podID="b7144393-2451-4a57-8755-5d18064bec1a" containerID="2d79ec39cecee0c4e8f71636dc1e7cdc504b6906a7fede900df3606f8f2f69f5" exitCode=0 Dec 03 07:04:54 crc kubenswrapper[4947]: I1203 07:04:54.748532 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" event={"ID":"b7144393-2451-4a57-8755-5d18064bec1a","Type":"ContainerDied","Data":"2d79ec39cecee0c4e8f71636dc1e7cdc504b6906a7fede900df3606f8f2f69f5"} Dec 03 07:04:55 crc kubenswrapper[4947]: I1203 07:04:55.756576 4947 generic.go:334] 
"Generic (PLEG): container finished" podID="b7144393-2451-4a57-8755-5d18064bec1a" containerID="acb801ee229c7e30ba78d65bfd7d1219e7fa70a7907e3535b169de6b0381cefb" exitCode=0 Dec 03 07:04:55 crc kubenswrapper[4947]: I1203 07:04:55.756872 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" event={"ID":"b7144393-2451-4a57-8755-5d18064bec1a","Type":"ContainerDied","Data":"acb801ee229c7e30ba78d65bfd7d1219e7fa70a7907e3535b169de6b0381cefb"} Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.096565 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.133952 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smjs8\" (UniqueName: \"kubernetes.io/projected/b7144393-2451-4a57-8755-5d18064bec1a-kube-api-access-smjs8\") pod \"b7144393-2451-4a57-8755-5d18064bec1a\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.134263 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-bundle\") pod \"b7144393-2451-4a57-8755-5d18064bec1a\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.134372 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-util\") pod \"b7144393-2451-4a57-8755-5d18064bec1a\" (UID: \"b7144393-2451-4a57-8755-5d18064bec1a\") " Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.136857 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-bundle" (OuterVolumeSpecName: "bundle") pod "b7144393-2451-4a57-8755-5d18064bec1a" (UID: "b7144393-2451-4a57-8755-5d18064bec1a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.151657 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7144393-2451-4a57-8755-5d18064bec1a-kube-api-access-smjs8" (OuterVolumeSpecName: "kube-api-access-smjs8") pod "b7144393-2451-4a57-8755-5d18064bec1a" (UID: "b7144393-2451-4a57-8755-5d18064bec1a"). InnerVolumeSpecName "kube-api-access-smjs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.160948 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-util" (OuterVolumeSpecName: "util") pod "b7144393-2451-4a57-8755-5d18064bec1a" (UID: "b7144393-2451-4a57-8755-5d18064bec1a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.235126 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smjs8\" (UniqueName: \"kubernetes.io/projected/b7144393-2451-4a57-8755-5d18064bec1a-kube-api-access-smjs8\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.235156 4947 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.235166 4947 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7144393-2451-4a57-8755-5d18064bec1a-util\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.775450 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" event={"ID":"b7144393-2451-4a57-8755-5d18064bec1a","Type":"ContainerDied","Data":"b59a88696ecdda37779cc155756c5c615515f283946de0d21d3d2fd2bb4c92e6"} Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.775525 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b59a88696ecdda37779cc155756c5c615515f283946de0d21d3d2fd2bb4c92e6" Dec 03 07:04:57 crc kubenswrapper[4947]: I1203 07:04:57.775618 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.741155 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd"] Dec 03 07:04:59 crc kubenswrapper[4947]: E1203 07:04:59.741415 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7144393-2451-4a57-8755-5d18064bec1a" containerName="extract" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.741428 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7144393-2451-4a57-8755-5d18064bec1a" containerName="extract" Dec 03 07:04:59 crc kubenswrapper[4947]: E1203 07:04:59.741442 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7144393-2451-4a57-8755-5d18064bec1a" containerName="pull" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.741450 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7144393-2451-4a57-8755-5d18064bec1a" containerName="pull" Dec 03 07:04:59 crc kubenswrapper[4947]: E1203 07:04:59.741471 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7144393-2451-4a57-8755-5d18064bec1a" containerName="util" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.741478 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7144393-2451-4a57-8755-5d18064bec1a" containerName="util" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.741600 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7144393-2451-4a57-8755-5d18064bec1a" containerName="extract" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.741972 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.744971 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-v24jq" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.770414 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2dh\" (UniqueName: \"kubernetes.io/projected/b6f8f031-fa7a-4e17-88ae-3a27974fa5f1-kube-api-access-nw2dh\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-m52kd\" (UID: \"b6f8f031-fa7a-4e17-88ae-3a27974fa5f1\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.772267 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd"] Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.872121 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw2dh\" (UniqueName: \"kubernetes.io/projected/b6f8f031-fa7a-4e17-88ae-3a27974fa5f1-kube-api-access-nw2dh\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-m52kd\" (UID: \"b6f8f031-fa7a-4e17-88ae-3a27974fa5f1\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" Dec 03 07:04:59 crc kubenswrapper[4947]: I1203 07:04:59.904957 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw2dh\" (UniqueName: \"kubernetes.io/projected/b6f8f031-fa7a-4e17-88ae-3a27974fa5f1-kube-api-access-nw2dh\") pod \"openstack-operator-controller-operator-7dd5c7bb7c-m52kd\" (UID: \"b6f8f031-fa7a-4e17-88ae-3a27974fa5f1\") " pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" Dec 03 07:05:00 crc kubenswrapper[4947]: I1203 07:05:00.057662 4947 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" Dec 03 07:05:00 crc kubenswrapper[4947]: I1203 07:05:00.507681 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd"] Dec 03 07:05:00 crc kubenswrapper[4947]: I1203 07:05:00.797450 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" event={"ID":"b6f8f031-fa7a-4e17-88ae-3a27974fa5f1","Type":"ContainerStarted","Data":"3796ec6ccfae4a58bff06806e1ae4fb675a8d94f2a731f07379a596a6430bc87"} Dec 03 07:05:04 crc kubenswrapper[4947]: I1203 07:05:04.826379 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" event={"ID":"b6f8f031-fa7a-4e17-88ae-3a27974fa5f1","Type":"ContainerStarted","Data":"13261188b1f74bb77151930aaf0fe822caeb355de3fffc5feae04545e96362c8"} Dec 03 07:05:04 crc kubenswrapper[4947]: I1203 07:05:04.827017 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" Dec 03 07:05:10 crc kubenswrapper[4947]: I1203 07:05:10.060249 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" Dec 03 07:05:10 crc kubenswrapper[4947]: I1203 07:05:10.103459 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7dd5c7bb7c-m52kd" podStartSLOduration=7.107750159 podStartE2EDuration="11.103433921s" podCreationTimestamp="2025-12-03 07:04:59 +0000 UTC" firstStartedPulling="2025-12-03 07:05:00.53714379 +0000 UTC m=+961.798098216" lastFinishedPulling="2025-12-03 07:05:04.532827542 +0000 UTC m=+965.793781978" 
observedRunningTime="2025-12-03 07:05:04.912116785 +0000 UTC m=+966.173071221" watchObservedRunningTime="2025-12-03 07:05:10.103433921 +0000 UTC m=+971.364388387" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.570229 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.571935 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.573785 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-f25sq" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.575460 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.576700 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.579726 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-krbf7" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.587605 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.588659 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.590831 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-v2q6d" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.593304 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.602656 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.604095 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lttr4\" (UniqueName: \"kubernetes.io/projected/888543d1-0ade-4e16-8965-a0ecb6fd65a7-kube-api-access-lttr4\") pod \"barbican-operator-controller-manager-7d9dfd778-26vkh\" (UID: \"888543d1-0ade-4e16-8965-a0ecb6fd65a7\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.604150 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htzs9\" (UniqueName: \"kubernetes.io/projected/83997d19-6166-445a-a2bd-15acf15fa18d-kube-api-access-htzs9\") pod \"cinder-operator-controller-manager-859b6ccc6-g9h4f\" (UID: \"83997d19-6166-445a-a2bd-15acf15fa18d\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.604268 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wnp2\" (UniqueName: \"kubernetes.io/projected/e266debc-b844-4b60-bbbf-c038b61a7ab8-kube-api-access-2wnp2\") pod 
\"designate-operator-controller-manager-78b4bc895b-mg8v4\" (UID: \"e266debc-b844-4b60-bbbf-c038b61a7ab8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.629749 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.664748 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.665959 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.675902 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-kmg87" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.684551 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.686270 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.689150 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-snsds" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.706936 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgb7n\" (UniqueName: \"kubernetes.io/projected/1e069b5f-1572-4c42-b34e-74c49b4b6940-kube-api-access-sgb7n\") pod \"heat-operator-controller-manager-5f64f6f8bb-hfz6c\" (UID: \"1e069b5f-1572-4c42-b34e-74c49b4b6940\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.707003 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wnp2\" (UniqueName: \"kubernetes.io/projected/e266debc-b844-4b60-bbbf-c038b61a7ab8-kube-api-access-2wnp2\") pod \"designate-operator-controller-manager-78b4bc895b-mg8v4\" (UID: \"e266debc-b844-4b60-bbbf-c038b61a7ab8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.707079 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lttr4\" (UniqueName: \"kubernetes.io/projected/888543d1-0ade-4e16-8965-a0ecb6fd65a7-kube-api-access-lttr4\") pod \"barbican-operator-controller-manager-7d9dfd778-26vkh\" (UID: \"888543d1-0ade-4e16-8965-a0ecb6fd65a7\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.707115 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htzs9\" (UniqueName: \"kubernetes.io/projected/83997d19-6166-445a-a2bd-15acf15fa18d-kube-api-access-htzs9\") pod 
\"cinder-operator-controller-manager-859b6ccc6-g9h4f\" (UID: \"83997d19-6166-445a-a2bd-15acf15fa18d\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.707199 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7fqt\" (UniqueName: \"kubernetes.io/projected/528fb24c-b835-446a-84c8-fce6b4e4815c-kube-api-access-r7fqt\") pod \"glance-operator-controller-manager-77987cd8cd-nvsv7\" (UID: \"528fb24c-b835-446a-84c8-fce6b4e4815c\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.717719 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.742086 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wnp2\" (UniqueName: \"kubernetes.io/projected/e266debc-b844-4b60-bbbf-c038b61a7ab8-kube-api-access-2wnp2\") pod \"designate-operator-controller-manager-78b4bc895b-mg8v4\" (UID: \"e266debc-b844-4b60-bbbf-c038b61a7ab8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.742131 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htzs9\" (UniqueName: \"kubernetes.io/projected/83997d19-6166-445a-a2bd-15acf15fa18d-kube-api-access-htzs9\") pod \"cinder-operator-controller-manager-859b6ccc6-g9h4f\" (UID: \"83997d19-6166-445a-a2bd-15acf15fa18d\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.747541 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c"] Dec 03 07:05:28 crc kubenswrapper[4947]: 
I1203 07:05:28.749617 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-htpcw"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.750260 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lttr4\" (UniqueName: \"kubernetes.io/projected/888543d1-0ade-4e16-8965-a0ecb6fd65a7-kube-api-access-lttr4\") pod \"barbican-operator-controller-manager-7d9dfd778-26vkh\" (UID: \"888543d1-0ade-4e16-8965-a0ecb6fd65a7\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.750662 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.752271 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.752643 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-z5b89" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.758518 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.759538 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.763718 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-zxjfz" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.792227 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-htpcw"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.806781 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.807884 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.807929 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2tbr\" (UniqueName: \"kubernetes.io/projected/13d55077-1827-43a0-a985-85db61855cb3-kube-api-access-z2tbr\") pod \"horizon-operator-controller-manager-68c6d99b8f-gl8qx\" (UID: \"13d55077-1827-43a0-a985-85db61855cb3\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.807995 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.808012 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2bld6\" (UniqueName: \"kubernetes.io/projected/b30f0b12-f386-4968-98fd-e2272aa1b2f9-kube-api-access-2bld6\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.808047 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7fqt\" (UniqueName: \"kubernetes.io/projected/528fb24c-b835-446a-84c8-fce6b4e4815c-kube-api-access-r7fqt\") pod \"glance-operator-controller-manager-77987cd8cd-nvsv7\" (UID: \"528fb24c-b835-446a-84c8-fce6b4e4815c\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.808079 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgb7n\" (UniqueName: \"kubernetes.io/projected/1e069b5f-1572-4c42-b34e-74c49b4b6940-kube-api-access-sgb7n\") pod \"heat-operator-controller-manager-5f64f6f8bb-hfz6c\" (UID: \"1e069b5f-1572-4c42-b34e-74c49b4b6940\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.813913 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.815754 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mnc8j" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.819107 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.820291 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.824849 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tqdnw" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.830550 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.836383 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.840169 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.841539 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.847837 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.848790 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.858386 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xj5pq" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.859061 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-h8jd7" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.864284 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgb7n\" (UniqueName: \"kubernetes.io/projected/1e069b5f-1572-4c42-b34e-74c49b4b6940-kube-api-access-sgb7n\") pod \"heat-operator-controller-manager-5f64f6f8bb-hfz6c\" (UID: \"1e069b5f-1572-4c42-b34e-74c49b4b6940\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.876569 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.881434 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7fqt\" (UniqueName: \"kubernetes.io/projected/528fb24c-b835-446a-84c8-fce6b4e4815c-kube-api-access-r7fqt\") pod \"glance-operator-controller-manager-77987cd8cd-nvsv7\" (UID: \"528fb24c-b835-446a-84c8-fce6b4e4815c\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.892013 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.902581 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.903168 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.904155 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.909742 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdwm7\" (UniqueName: \"kubernetes.io/projected/fbf40e80-a5ef-41d4-ad63-b060d52be33f-kube-api-access-jdwm7\") pod \"manila-operator-controller-manager-7c79b5df47-bdgzf\" (UID: \"fbf40e80-a5ef-41d4-ad63-b060d52be33f\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.909782 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2tbr\" (UniqueName: \"kubernetes.io/projected/13d55077-1827-43a0-a985-85db61855cb3-kube-api-access-z2tbr\") pod \"horizon-operator-controller-manager-68c6d99b8f-gl8qx\" (UID: \"13d55077-1827-43a0-a985-85db61855cb3\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.909817 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2kl6\" (UniqueName: \"kubernetes.io/projected/b537173f-b7c8-426d-bf40-8bb6ece17177-kube-api-access-w2kl6\") pod \"mariadb-operator-controller-manager-56bbcc9d85-hflxr\" (UID: \"b537173f-b7c8-426d-bf40-8bb6ece17177\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.909848 
4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bld6\" (UniqueName: \"kubernetes.io/projected/b30f0b12-f386-4968-98fd-e2272aa1b2f9-kube-api-access-2bld6\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.909865 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.909886 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zssv\" (UniqueName: \"kubernetes.io/projected/fd59ba56-51b7-4260-9b1f-e3bee0916e06-kube-api-access-2zssv\") pod \"keystone-operator-controller-manager-7765d96ddf-qzhrh\" (UID: \"fd59ba56-51b7-4260-9b1f-e3bee0916e06\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.909913 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjlz6\" (UniqueName: \"kubernetes.io/projected/59be11f8-72a2-45cc-b690-951bda0d87be-kube-api-access-vjlz6\") pod \"ironic-operator-controller-manager-6c548fd776-lt6cp\" (UID: \"59be11f8-72a2-45cc-b690-951bda0d87be\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp" Dec 03 07:05:28 crc kubenswrapper[4947]: E1203 07:05:28.910291 4947 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 
07:05:28 crc kubenswrapper[4947]: E1203 07:05:28.910344 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert podName:b30f0b12-f386-4968-98fd-e2272aa1b2f9 nodeName:}" failed. No retries permitted until 2025-12-03 07:05:29.410327809 +0000 UTC m=+990.671282235 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert") pod "infra-operator-controller-manager-57548d458d-htpcw" (UID: "b30f0b12-f386-4968-98fd-e2272aa1b2f9") : secret "infra-operator-webhook-server-cert" not found Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.910546 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qgr5j" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.917655 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.923149 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.947553 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.960271 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bld6\" (UniqueName: \"kubernetes.io/projected/b30f0b12-f386-4968-98fd-e2272aa1b2f9-kube-api-access-2bld6\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.961164 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2tbr\" (UniqueName: \"kubernetes.io/projected/13d55077-1827-43a0-a985-85db61855cb3-kube-api-access-z2tbr\") pod \"horizon-operator-controller-manager-68c6d99b8f-gl8qx\" (UID: \"13d55077-1827-43a0-a985-85db61855cb3\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.985829 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn"] Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.987041 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn" Dec 03 07:05:28 crc kubenswrapper[4947]: I1203 07:05:28.997083 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.000566 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qv7wl" Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.009569 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn"] Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.010026 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.012966 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb"] Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.013965 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.016390 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lw6b\" (UniqueName: \"kubernetes.io/projected/0681e281-d288-430f-b175-1d5c36593c9a-kube-api-access-5lw6b\") pod \"nova-operator-controller-manager-697bc559fc-lwmcn\" (UID: \"0681e281-d288-430f-b175-1d5c36593c9a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.016424 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zssv\" (UniqueName: \"kubernetes.io/projected/fd59ba56-51b7-4260-9b1f-e3bee0916e06-kube-api-access-2zssv\") pod \"keystone-operator-controller-manager-7765d96ddf-qzhrh\" (UID: \"fd59ba56-51b7-4260-9b1f-e3bee0916e06\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.016456 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjlz6\" (UniqueName: \"kubernetes.io/projected/59be11f8-72a2-45cc-b690-951bda0d87be-kube-api-access-vjlz6\") pod \"ironic-operator-controller-manager-6c548fd776-lt6cp\" (UID: \"59be11f8-72a2-45cc-b690-951bda0d87be\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.016505 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdwm7\" (UniqueName: \"kubernetes.io/projected/fbf40e80-a5ef-41d4-ad63-b060d52be33f-kube-api-access-jdwm7\") pod \"manila-operator-controller-manager-7c79b5df47-bdgzf\" (UID: \"fbf40e80-a5ef-41d4-ad63-b060d52be33f\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.016527 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pd7p\" (UniqueName: \"kubernetes.io/projected/f67211f5-3446-471e-8124-52d1d18dadbe-kube-api-access-7pd7p\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-n9p7d\" (UID: \"f67211f5-3446-471e-8124-52d1d18dadbe\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.016556 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2kl6\" (UniqueName: \"kubernetes.io/projected/b537173f-b7c8-426d-bf40-8bb6ece17177-kube-api-access-w2kl6\") pod \"mariadb-operator-controller-manager-56bbcc9d85-hflxr\" (UID: \"b537173f-b7c8-426d-bf40-8bb6ece17177\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.020437 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-txjhf"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.041128 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2kl6\" (UniqueName: \"kubernetes.io/projected/b537173f-b7c8-426d-bf40-8bb6ece17177-kube-api-access-w2kl6\") pod \"mariadb-operator-controller-manager-56bbcc9d85-hflxr\" (UID: \"b537173f-b7c8-426d-bf40-8bb6ece17177\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.044726 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zssv\" (UniqueName: \"kubernetes.io/projected/fd59ba56-51b7-4260-9b1f-e3bee0916e06-kube-api-access-2zssv\") pod \"keystone-operator-controller-manager-7765d96ddf-qzhrh\" (UID: \"fd59ba56-51b7-4260-9b1f-e3bee0916e06\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.048105 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjlz6\" (UniqueName: \"kubernetes.io/projected/59be11f8-72a2-45cc-b690-951bda0d87be-kube-api-access-vjlz6\") pod \"ironic-operator-controller-manager-6c548fd776-lt6cp\" (UID: \"59be11f8-72a2-45cc-b690-951bda0d87be\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.051949 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdwm7\" (UniqueName: \"kubernetes.io/projected/fbf40e80-a5ef-41d4-ad63-b060d52be33f-kube-api-access-jdwm7\") pod \"manila-operator-controller-manager-7c79b5df47-bdgzf\" (UID: \"fbf40e80-a5ef-41d4-ad63-b060d52be33f\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.070098 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.121954 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pd7p\" (UniqueName: \"kubernetes.io/projected/f67211f5-3446-471e-8124-52d1d18dadbe-kube-api-access-7pd7p\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-n9p7d\" (UID: \"f67211f5-3446-471e-8124-52d1d18dadbe\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.122144 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8dh7\" (UniqueName: \"kubernetes.io/projected/0f8b1d0b-522a-413b-a689-39044ec47286-kube-api-access-t8dh7\") pod \"octavia-operator-controller-manager-998648c74-mh7hb\" (UID: \"0f8b1d0b-522a-413b-a689-39044ec47286\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.122220 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lw6b\" (UniqueName: \"kubernetes.io/projected/0681e281-d288-430f-b175-1d5c36593c9a-kube-api-access-5lw6b\") pod \"nova-operator-controller-manager-697bc559fc-lwmcn\" (UID: \"0681e281-d288-430f-b175-1d5c36593c9a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.140271 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.145550 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.173525 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.195619 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.197028 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.197280 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.200831 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-p7vnh"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.201046 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.201748 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.204044 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.208260 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pd7p\" (UniqueName: \"kubernetes.io/projected/f67211f5-3446-471e-8124-52d1d18dadbe-kube-api-access-7pd7p\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-n9p7d\" (UID: \"f67211f5-3446-471e-8124-52d1d18dadbe\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.208266 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ld68m"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.211622 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-llv5g"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.212958 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.214247 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-bqw7w"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.216205 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.224898 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.226112 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.227730 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jn4bh"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.230901 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lw6b\" (UniqueName: \"kubernetes.io/projected/0681e281-d288-430f-b175-1d5c36593c9a-kube-api-access-5lw6b\") pod \"nova-operator-controller-manager-697bc559fc-lwmcn\" (UID: \"0681e281-d288-430f-b175-1d5c36593c9a\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.230956 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-llv5g"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.235379 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82wmp\" (UniqueName: \"kubernetes.io/projected/de69b4ac-b81b-4074-8e69-2ec717ecd70b-kube-api-access-82wmp\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.235438 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8dh7\" (UniqueName: \"kubernetes.io/projected/0f8b1d0b-522a-413b-a689-39044ec47286-kube-api-access-t8dh7\") pod \"octavia-operator-controller-manager-998648c74-mh7hb\" (UID: \"0f8b1d0b-522a-413b-a689-39044ec47286\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.235466 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.235527 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr27k\" (UniqueName: \"kubernetes.io/projected/b548d942-a26b-47aa-b352-23afd3148288-kube-api-access-kr27k\") pod \"ovn-operator-controller-manager-b6456fdb6-b5bpz\" (UID: \"b548d942-a26b-47aa-b352-23afd3148288\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.236845 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.238657 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.243769 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.246650 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.248902 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-xqhxg"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.251572 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.284221 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.294387 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.297741 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.298943 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.304648 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-nw952"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.307374 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.316746 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.317881 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.330473 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.336359 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82wmp\" (UniqueName: \"kubernetes.io/projected/de69b4ac-b81b-4074-8e69-2ec717ecd70b-kube-api-access-82wmp\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.336406 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk2vt\" (UniqueName: \"kubernetes.io/projected/76116ad0-e325-41c1-a25e-9089331c52ba-kube-api-access-qk2vt\") pod \"test-operator-controller-manager-5854674fcc-9bpvr\" (UID: \"76116ad0-e325-41c1-a25e-9089331c52ba\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.336457 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.336520 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72r7\" (UniqueName: \"kubernetes.io/projected/6999c194-78a5-48db-9e56-59f65d9e11c1-kube-api-access-v72r7\") pod \"telemetry-operator-controller-manager-76cc84c6bb-gdq5s\" (UID: \"6999c194-78a5-48db-9e56-59f65d9e11c1\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.336546 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s45qf\" (UniqueName: \"kubernetes.io/projected/e8e59da3-af36-42c7-9c78-98608089eaea-kube-api-access-s45qf\") pod \"swift-operator-controller-manager-5f8c65bbfc-72sgg\" (UID: \"e8e59da3-af36-42c7-9c78-98608089eaea\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.336568 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnw2b\" (UniqueName: \"kubernetes.io/projected/0703a4f3-6732-44a1-b690-fcea6eb2228d-kube-api-access-lnw2b\") pod \"placement-operator-controller-manager-78f8948974-llv5g\" (UID: \"0703a4f3-6732-44a1-b690-fcea6eb2228d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.336600 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr27k\" (UniqueName: \"kubernetes.io/projected/b548d942-a26b-47aa-b352-23afd3148288-kube-api-access-kr27k\") pod \"ovn-operator-controller-manager-b6456fdb6-b5bpz\" (UID: \"b548d942-a26b-47aa-b352-23afd3148288\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz"
Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.337202 4947 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.337255 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert podName:de69b4ac-b81b-4074-8e69-2ec717ecd70b nodeName:}" failed. No retries permitted until 2025-12-03 07:05:29.837237504 +0000 UTC m=+991.098191930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" (UID: "de69b4ac-b81b-4074-8e69-2ec717ecd70b") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.372445 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-99vdw"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.373270 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.406298 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8dh7\" (UniqueName: \"kubernetes.io/projected/0f8b1d0b-522a-413b-a689-39044ec47286-kube-api-access-t8dh7\") pod \"octavia-operator-controller-manager-998648c74-mh7hb\" (UID: \"0f8b1d0b-522a-413b-a689-39044ec47286\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.428423 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82wmp\" (UniqueName: \"kubernetes.io/projected/de69b4ac-b81b-4074-8e69-2ec717ecd70b-kube-api-access-82wmp\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.429566 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr27k\" (UniqueName: \"kubernetes.io/projected/b548d942-a26b-47aa-b352-23afd3148288-kube-api-access-kr27k\") pod \"ovn-operator-controller-manager-b6456fdb6-b5bpz\" (UID: \"b548d942-a26b-47aa-b352-23afd3148288\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.439181 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4w9b\" (UniqueName: \"kubernetes.io/projected/bab921ef-a66f-48fe-87fc-fa040bc09b2e-kube-api-access-b4w9b\") pod \"watcher-operator-controller-manager-769dc69bc-vksgw\" (UID: \"bab921ef-a66f-48fe-87fc-fa040bc09b2e\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.439250 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk2vt\" (UniqueName: \"kubernetes.io/projected/76116ad0-e325-41c1-a25e-9089331c52ba-kube-api-access-qk2vt\") pod \"test-operator-controller-manager-5854674fcc-9bpvr\" (UID: \"76116ad0-e325-41c1-a25e-9089331c52ba\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.439297 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.439323 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72r7\" (UniqueName: \"kubernetes.io/projected/6999c194-78a5-48db-9e56-59f65d9e11c1-kube-api-access-v72r7\") pod \"telemetry-operator-controller-manager-76cc84c6bb-gdq5s\" (UID: \"6999c194-78a5-48db-9e56-59f65d9e11c1\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.439342 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnw2b\" (UniqueName: \"kubernetes.io/projected/0703a4f3-6732-44a1-b690-fcea6eb2228d-kube-api-access-lnw2b\") pod \"placement-operator-controller-manager-78f8948974-llv5g\" (UID: \"0703a4f3-6732-44a1-b690-fcea6eb2228d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.439357 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s45qf\" (UniqueName: \"kubernetes.io/projected/e8e59da3-af36-42c7-9c78-98608089eaea-kube-api-access-s45qf\") pod \"swift-operator-controller-manager-5f8c65bbfc-72sgg\" (UID: \"e8e59da3-af36-42c7-9c78-98608089eaea\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg"
Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.440046 4947 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.440092 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert podName:b30f0b12-f386-4968-98fd-e2272aa1b2f9 nodeName:}" failed. No retries permitted until 2025-12-03 07:05:30.440077786 +0000 UTC m=+991.701032212 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert") pod "infra-operator-controller-manager-57548d458d-htpcw" (UID: "b30f0b12-f386-4968-98fd-e2272aa1b2f9") : secret "infra-operator-webhook-server-cert" not found
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.486812 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.487894 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.490209 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.490477 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-sppd8"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.500833 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.515105 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s45qf\" (UniqueName: \"kubernetes.io/projected/e8e59da3-af36-42c7-9c78-98608089eaea-kube-api-access-s45qf\") pod \"swift-operator-controller-manager-5f8c65bbfc-72sgg\" (UID: \"e8e59da3-af36-42c7-9c78-98608089eaea\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.515896 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72r7\" (UniqueName: \"kubernetes.io/projected/6999c194-78a5-48db-9e56-59f65d9e11c1-kube-api-access-v72r7\") pod \"telemetry-operator-controller-manager-76cc84c6bb-gdq5s\" (UID: \"6999c194-78a5-48db-9e56-59f65d9e11c1\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.540241 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4w9b\" (UniqueName: \"kubernetes.io/projected/bab921ef-a66f-48fe-87fc-fa040bc09b2e-kube-api-access-b4w9b\") pod \"watcher-operator-controller-manager-769dc69bc-vksgw\" (UID: \"bab921ef-a66f-48fe-87fc-fa040bc09b2e\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.540302 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.540331 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.540373 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpdvj\" (UniqueName: \"kubernetes.io/projected/549a3cff-42c6-45ea-8e4c-36c4aa29457c-kube-api-access-vpdvj\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.542560 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnw2b\" (UniqueName: \"kubernetes.io/projected/0703a4f3-6732-44a1-b690-fcea6eb2228d-kube-api-access-lnw2b\") pod \"placement-operator-controller-manager-78f8948974-llv5g\" (UID: \"0703a4f3-6732-44a1-b690-fcea6eb2228d\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.548258 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.548734 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk2vt\" (UniqueName: \"kubernetes.io/projected/76116ad0-e325-41c1-a25e-9089331c52ba-kube-api-access-qk2vt\") pod \"test-operator-controller-manager-5854674fcc-9bpvr\" (UID: \"76116ad0-e325-41c1-a25e-9089331c52ba\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.599310 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4w9b\" (UniqueName: \"kubernetes.io/projected/bab921ef-a66f-48fe-87fc-fa040bc09b2e-kube-api-access-b4w9b\") pod \"watcher-operator-controller-manager-769dc69bc-vksgw\" (UID: \"bab921ef-a66f-48fe-87fc-fa040bc09b2e\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.644250 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.644306 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.644366 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpdvj\" (UniqueName: \"kubernetes.io/projected/549a3cff-42c6-45ea-8e4c-36c4aa29457c-kube-api-access-vpdvj\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"
Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.644791 4947 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.644835 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:30.144820874 +0000 UTC m=+991.405775300 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "metrics-server-cert" not found
Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.644986 4947 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.645012 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:30.145005339 +0000 UTC m=+991.405959765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "webhook-server-cert" not found
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.670513 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.685302 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.689312 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpdvj\" (UniqueName: \"kubernetes.io/projected/549a3cff-42c6-45ea-8e4c-36c4aa29457c-kube-api-access-vpdvj\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.696764 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.706221 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv"]
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.706347 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.710027 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-g6r27"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.728840 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.733407 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.746314 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmsm9\" (UniqueName: \"kubernetes.io/projected/ee4fc346-902a-4be8-9bf4-081b58b2c547-kube-api-access-qmsm9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sk2lv\" (UID: \"ee4fc346-902a-4be8-9bf4-081b58b2c547\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.848347 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmsm9\" (UniqueName: \"kubernetes.io/projected/ee4fc346-902a-4be8-9bf4-081b58b2c547-kube-api-access-qmsm9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sk2lv\" (UID: \"ee4fc346-902a-4be8-9bf4-081b58b2c547\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv"
Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.848886 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: 
\"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.849068 4947 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:29 crc kubenswrapper[4947]: E1203 07:05:29.849126 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert podName:de69b4ac-b81b-4074-8e69-2ec717ecd70b nodeName:}" failed. No retries permitted until 2025-12-03 07:05:30.849100501 +0000 UTC m=+992.110054927 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" (UID: "de69b4ac-b81b-4074-8e69-2ec717ecd70b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.885415 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmsm9\" (UniqueName: \"kubernetes.io/projected/ee4fc346-902a-4be8-9bf4-081b58b2c547-kube-api-access-qmsm9\") pod \"rabbitmq-cluster-operator-manager-668c99d594-sk2lv\" (UID: \"ee4fc346-902a-4be8-9bf4-081b58b2c547\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv" Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.945343 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr" Dec 03 07:05:29 crc kubenswrapper[4947]: I1203 07:05:29.959383 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.088773 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.126173 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv" Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.153616 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.153691 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.154193 4947 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.154252 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:31.154234305 +0000 UTC m=+992.415188731 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "metrics-server-cert" not found Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.154352 4947 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.154386 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:31.154376948 +0000 UTC m=+992.415331374 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "webhook-server-cert" not found Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.457331 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.457904 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7"] Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.458044 4947 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 07:05:30 crc 
kubenswrapper[4947]: E1203 07:05:30.458107 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert podName:b30f0b12-f386-4968-98fd-e2272aa1b2f9 nodeName:}" failed. No retries permitted until 2025-12-03 07:05:32.458088744 +0000 UTC m=+993.719043190 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert") pod "infra-operator-controller-manager-57548d458d-htpcw" (UID: "b30f0b12-f386-4968-98fd-e2272aa1b2f9") : secret "infra-operator-webhook-server-cert" not found Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.483523 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f"] Dec 03 07:05:30 crc kubenswrapper[4947]: W1203 07:05:30.493102 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e069b5f_1572_4c42_b34e_74c49b4b6940.slice/crio-9820a8507289f119ee4f8057d299b732cf14bff3a43675a98cd3c89e03331770 WatchSource:0}: Error finding container 9820a8507289f119ee4f8057d299b732cf14bff3a43675a98cd3c89e03331770: Status 404 returned error can't find the container with id 9820a8507289f119ee4f8057d299b732cf14bff3a43675a98cd3c89e03331770 Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.497810 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c"] Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.507914 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh"] Dec 03 07:05:30 crc kubenswrapper[4947]: W1203 07:05:30.513275 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod888543d1_0ade_4e16_8965_a0ecb6fd65a7.slice/crio-2a71c2aad3b565ad158164fe5a94bb14c5708e6b0a79577490ba0a1f4080c0f6 WatchSource:0}: Error finding container 2a71c2aad3b565ad158164fe5a94bb14c5708e6b0a79577490ba0a1f4080c0f6: Status 404 returned error can't find the container with id 2a71c2aad3b565ad158164fe5a94bb14c5708e6b0a79577490ba0a1f4080c0f6 Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.868278 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.868465 4947 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.868542 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert podName:de69b4ac-b81b-4074-8e69-2ec717ecd70b nodeName:}" failed. No retries permitted until 2025-12-03 07:05:32.868523376 +0000 UTC m=+994.129477802 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" (UID: "de69b4ac-b81b-4074-8e69-2ec717ecd70b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.887384 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf"] Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.893593 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz"] Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.897580 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh"] Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.901779 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4"] Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.905164 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp"] Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.914327 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb"] Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.920006 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d"] Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.924813 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s"] Dec 03 07:05:30 crc kubenswrapper[4947]: I1203 07:05:30.942445 4947 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr"] Dec 03 07:05:30 crc kubenswrapper[4947]: W1203 07:05:30.955736 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6999c194_78a5_48db_9e56_59f65d9e11c1.slice/crio-914092b8039aebb36271ec5b616b30ca99909a9457ec0b85b4d641a2d9d8c6ff WatchSource:0}: Error finding container 914092b8039aebb36271ec5b616b30ca99909a9457ec0b85b4d641a2d9d8c6ff: Status 404 returned error can't find the container with id 914092b8039aebb36271ec5b616b30ca99909a9457ec0b85b4d641a2d9d8c6ff Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.972662 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v72r7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-gdq5s_openstack-operators(6999c194-78a5-48db-9e56-59f65d9e11c1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.975638 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v72r7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-76cc84c6bb-gdq5s_openstack-operators(6999c194-78a5-48db-9e56-59f65d9e11c1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:30 crc kubenswrapper[4947]: E1203 07:05:30.982839 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" podUID="6999c194-78a5-48db-9e56-59f65d9e11c1" Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.018066 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr" event={"ID":"76116ad0-e325-41c1-a25e-9089331c52ba","Type":"ContainerStarted","Data":"884c6972832f8d3806f30a6222af848c81c0f3da020adcb3a8282d1e4ee81756"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.020174 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" event={"ID":"83997d19-6166-445a-a2bd-15acf15fa18d","Type":"ContainerStarted","Data":"677c71d2d10861d68dcadb8f8e6f716c487e7437737a7890c10c7eeeb3eb491a"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.022984 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh" event={"ID":"fd59ba56-51b7-4260-9b1f-e3bee0916e06","Type":"ContainerStarted","Data":"4ac240a6a7af4443af25b7b2305714d6999137183d9ebae6b8153ffcd2763f7b"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.024719 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb" event={"ID":"0f8b1d0b-522a-413b-a689-39044ec47286","Type":"ContainerStarted","Data":"a3d6d92228acc06af0b0dc069ed75b3eea67ba0b74233ec7e861183ef7249d90"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.026390 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" event={"ID":"6999c194-78a5-48db-9e56-59f65d9e11c1","Type":"ContainerStarted","Data":"914092b8039aebb36271ec5b616b30ca99909a9457ec0b85b4d641a2d9d8c6ff"} Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.029646 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" podUID="6999c194-78a5-48db-9e56-59f65d9e11c1" Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.029769 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp" event={"ID":"59be11f8-72a2-45cc-b690-951bda0d87be","Type":"ContainerStarted","Data":"d722d5ec7fd508fdbd64aa1d973cd0cbe15f96799719af6bf07e99b8124f376e"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.047238 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d" event={"ID":"f67211f5-3446-471e-8124-52d1d18dadbe","Type":"ContainerStarted","Data":"6e69ebf210b38810b5dab08613e2b3a02b241ba0dcf11d937866759f1da604a8"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.055755 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" event={"ID":"528fb24c-b835-446a-84c8-fce6b4e4815c","Type":"ContainerStarted","Data":"fe3339b0db0c319b4b8bf3d1359211f79468abe11e0bb021a25675295b561dd9"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.057665 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn"] Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.058050 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" event={"ID":"1e069b5f-1572-4c42-b34e-74c49b4b6940","Type":"ContainerStarted","Data":"9820a8507289f119ee4f8057d299b732cf14bff3a43675a98cd3c89e03331770"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.061534 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" event={"ID":"888543d1-0ade-4e16-8965-a0ecb6fd65a7","Type":"ContainerStarted","Data":"2a71c2aad3b565ad158164fe5a94bb14c5708e6b0a79577490ba0a1f4080c0f6"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.062583 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz" event={"ID":"b548d942-a26b-47aa-b352-23afd3148288","Type":"ContainerStarted","Data":"ca84e98c05ac8133623e56baa1fab635426cc6f17c144ecde6b4c56331f763fe"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.063592 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" event={"ID":"fbf40e80-a5ef-41d4-ad63-b060d52be33f","Type":"ContainerStarted","Data":"a12aa25ea998c980959ef75e615658bbcd19a34cc4ba3f96d91575b70dc9b97b"} Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.065773 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" event={"ID":"e266debc-b844-4b60-bbbf-c038b61a7ab8","Type":"ContainerStarted","Data":"93295a6ee92e222afbd50dd93598a324125b5fb6bb6a0ce16aeb0bc5ee1c02e9"} Dec 03 07:05:31 crc kubenswrapper[4947]: W1203 07:05:31.087281 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb537173f_b7c8_426d_bf40_8bb6ece17177.slice/crio-b4fe9c40fc9bdd8b71866faf1e3347e141ab6968406341115b796cdd055cb317 WatchSource:0}: Error finding container b4fe9c40fc9bdd8b71866faf1e3347e141ab6968406341115b796cdd055cb317: Status 404 returned error can't find the container with id b4fe9c40fc9bdd8b71866faf1e3347e141ab6968406341115b796cdd055cb317 Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.088095 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s45qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-72sgg_openstack-operators(e8e59da3-af36-42c7-9c78-98608089eaea): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.091085 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s45qf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-72sgg_openstack-operators(e8e59da3-af36-42c7-9c78-98608089eaea): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.091189 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w2kl6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-hflxr_openstack-operators(b537173f-b7c8-426d-bf40-8bb6ece17177): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.093029 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg" podUID="e8e59da3-af36-42c7-9c78-98608089eaea" Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.094772 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx"] Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.095781 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w2kl6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-hflxr_openstack-operators(b537173f-b7c8-426d-bf40-8bb6ece17177): ErrImagePull: 
pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.096959 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" podUID="b537173f-b7c8-426d-bf40-8bb6ece17177" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.097056 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2tbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-gl8qx_openstack-operators(13d55077-1827-43a0-a985-85db61855cb3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.097116 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lnw2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-llv5g_openstack-operators(0703a4f3-6732-44a1-b690-fcea6eb2228d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.100051 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr"] Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.100392 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lnw2b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-llv5g_openstack-operators(0703a4f3-6732-44a1-b690-fcea6eb2228d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.100423 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2tbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-gl8qx_openstack-operators(13d55077-1827-43a0-a985-85db61855cb3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.101585 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" podUID="13d55077-1827-43a0-a985-85db61855cb3" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.101589 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b4w9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-vksgw_openstack-operators(bab921ef-a66f-48fe-87fc-fa040bc09b2e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.101635 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g" podUID="0703a4f3-6732-44a1-b690-fcea6eb2228d" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.103889 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b4w9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-vksgw_openstack-operators(bab921ef-a66f-48fe-87fc-fa040bc09b2e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.105931 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" podUID="bab921ef-a66f-48fe-87fc-fa040bc09b2e" Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.108426 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg"] Dec 03 07:05:31 crc kubenswrapper[4947]: W1203 07:05:31.110655 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee4fc346_902a_4be8_9bf4_081b58b2c547.slice/crio-ef5e2365461db0faaed13c49b88633aa54667730dc7ef818ad699068407e9a91 WatchSource:0}: Error finding container ef5e2365461db0faaed13c49b88633aa54667730dc7ef818ad699068407e9a91: Status 404 returned error can't find the container with id ef5e2365461db0faaed13c49b88633aa54667730dc7ef818ad699068407e9a91 Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.112265 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qmsm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-sk2lv_openstack-operators(ee4fc346-902a-4be8-9bf4-081b58b2c547): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.114203 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv" podUID="ee4fc346-902a-4be8-9bf4-081b58b2c547" Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.116741 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-llv5g"] Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.122652 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw"] Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.129597 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv"] Dec 03 
07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.173158 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:31 crc kubenswrapper[4947]: I1203 07:05:31.173409 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.174019 4947 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.174028 4947 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.174079 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:33.174063951 +0000 UTC m=+994.435018377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "metrics-server-cert" not found Dec 03 07:05:31 crc kubenswrapper[4947]: E1203 07:05:31.174097 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:33.174089472 +0000 UTC m=+994.435043888 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "webhook-server-cert" not found Dec 03 07:05:32 crc kubenswrapper[4947]: I1203 07:05:32.108836 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" event={"ID":"bab921ef-a66f-48fe-87fc-fa040bc09b2e","Type":"ContainerStarted","Data":"118655ee4b299835025af791d692eb9e764f45a9c92c74b0d985a101bc69bb5f"} Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.114709 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" podUID="bab921ef-a66f-48fe-87fc-fa040bc09b2e" Dec 03 07:05:32 crc kubenswrapper[4947]: I1203 
07:05:32.115956 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" event={"ID":"b537173f-b7c8-426d-bf40-8bb6ece17177","Type":"ContainerStarted","Data":"b4fe9c40fc9bdd8b71866faf1e3347e141ab6968406341115b796cdd055cb317"} Dec 03 07:05:32 crc kubenswrapper[4947]: I1203 07:05:32.121467 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv" event={"ID":"ee4fc346-902a-4be8-9bf4-081b58b2c547","Type":"ContainerStarted","Data":"ef5e2365461db0faaed13c49b88633aa54667730dc7ef818ad699068407e9a91"} Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.121884 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" podUID="b537173f-b7c8-426d-bf40-8bb6ece17177" Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.123011 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv" podUID="ee4fc346-902a-4be8-9bf4-081b58b2c547" Dec 03 07:05:32 crc kubenswrapper[4947]: I1203 07:05:32.184175 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn" 
event={"ID":"0681e281-d288-430f-b175-1d5c36593c9a","Type":"ContainerStarted","Data":"a41d8688bfe947ec50431c7a0de241f349386c96a7d52f4dc9f88b6013396ed7"} Dec 03 07:05:32 crc kubenswrapper[4947]: I1203 07:05:32.186617 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" event={"ID":"13d55077-1827-43a0-a985-85db61855cb3","Type":"ContainerStarted","Data":"bc574663893a48838bf644e638a3f38f8cf5d7627849a77521b3126fc2a2665f"} Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.191732 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" podUID="13d55077-1827-43a0-a985-85db61855cb3" Dec 03 07:05:32 crc kubenswrapper[4947]: I1203 07:05:32.195204 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g" event={"ID":"0703a4f3-6732-44a1-b690-fcea6eb2228d","Type":"ContainerStarted","Data":"3988d66a7b5db5e61d7ef29fc0b770483cc50a386fd72e2a4f08c3e0684c81db"} Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.197993 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g" podUID="0703a4f3-6732-44a1-b690-fcea6eb2228d" Dec 03 07:05:32 crc kubenswrapper[4947]: I1203 07:05:32.198403 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg" event={"ID":"e8e59da3-af36-42c7-9c78-98608089eaea","Type":"ContainerStarted","Data":"8a630a2683fb580b87a759ce2d25532c7cd4d26fee2dd5eb4f2463dc33f71ae9"} Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.201192 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:7d66757c0af67104f0389e851a7cc0daa44443ad202d157417bd86bbb57cc385\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" podUID="6999c194-78a5-48db-9e56-59f65d9e11c1" Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.201981 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg" podUID="e8e59da3-af36-42c7-9c78-98608089eaea" Dec 03 07:05:32 crc kubenswrapper[4947]: I1203 07:05:32.508041 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.508275 4947 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.508330 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert podName:b30f0b12-f386-4968-98fd-e2272aa1b2f9 nodeName:}" failed. No retries permitted until 2025-12-03 07:05:36.508311212 +0000 UTC m=+997.769265638 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert") pod "infra-operator-controller-manager-57548d458d-htpcw" (UID: "b30f0b12-f386-4968-98fd-e2272aa1b2f9") : secret "infra-operator-webhook-server-cert" not found Dec 03 07:05:32 crc kubenswrapper[4947]: I1203 07:05:32.922587 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.922797 4947 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:32 crc kubenswrapper[4947]: E1203 07:05:32.922865 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert 
podName:de69b4ac-b81b-4074-8e69-2ec717ecd70b nodeName:}" failed. No retries permitted until 2025-12-03 07:05:36.922848375 +0000 UTC m=+998.183802801 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" (UID: "de69b4ac-b81b-4074-8e69-2ec717ecd70b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.212255 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg" podUID="e8e59da3-af36-42c7-9c78-98608089eaea" Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.212399 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g" podUID="0703a4f3-6732-44a1-b690-fcea6eb2228d" Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.212472 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" podUID="13d55077-1827-43a0-a985-85db61855cb3" Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.212686 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv" podUID="ee4fc346-902a-4be8-9bf4-081b58b2c547" Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.215780 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" podUID="bab921ef-a66f-48fe-87fc-fa040bc09b2e" Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.218151 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" podUID="b537173f-b7c8-426d-bf40-8bb6ece17177" Dec 03 07:05:33 crc kubenswrapper[4947]: I1203 07:05:33.227532 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:33 crc kubenswrapper[4947]: I1203 07:05:33.227611 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.233523 4947 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.233609 4947 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.233648 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:37.233627691 +0000 UTC m=+998.494582117 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "webhook-server-cert" not found Dec 03 07:05:33 crc kubenswrapper[4947]: E1203 07:05:33.233717 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:37.233691573 +0000 UTC m=+998.494645999 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "metrics-server-cert" not found Dec 03 07:05:36 crc kubenswrapper[4947]: I1203 07:05:36.585966 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:36 crc kubenswrapper[4947]: E1203 07:05:36.586150 4947 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 07:05:36 crc kubenswrapper[4947]: E1203 07:05:36.586471 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert podName:b30f0b12-f386-4968-98fd-e2272aa1b2f9 nodeName:}" failed. No retries permitted until 2025-12-03 07:05:44.586440627 +0000 UTC m=+1005.847395123 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert") pod "infra-operator-controller-manager-57548d458d-htpcw" (UID: "b30f0b12-f386-4968-98fd-e2272aa1b2f9") : secret "infra-operator-webhook-server-cert" not found Dec 03 07:05:36 crc kubenswrapper[4947]: I1203 07:05:36.995461 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:05:36 crc kubenswrapper[4947]: E1203 07:05:36.995769 4947 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:36 crc kubenswrapper[4947]: E1203 07:05:36.995849 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert podName:de69b4ac-b81b-4074-8e69-2ec717ecd70b nodeName:}" failed. No retries permitted until 2025-12-03 07:05:44.99582098 +0000 UTC m=+1006.256775406 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" (UID: "de69b4ac-b81b-4074-8e69-2ec717ecd70b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:37 crc kubenswrapper[4947]: I1203 07:05:37.304971 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:37 crc kubenswrapper[4947]: I1203 07:05:37.305057 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:37 crc kubenswrapper[4947]: E1203 07:05:37.305418 4947 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 07:05:37 crc kubenswrapper[4947]: E1203 07:05:37.305433 4947 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 07:05:37 crc kubenswrapper[4947]: E1203 07:05:37.305481 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:45.305463317 +0000 UTC m=+1006.566417763 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "webhook-server-cert" not found Dec 03 07:05:37 crc kubenswrapper[4947]: E1203 07:05:37.305548 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:05:45.305521148 +0000 UTC m=+1006.566475634 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "metrics-server-cert" not found Dec 03 07:05:44 crc kubenswrapper[4947]: E1203 07:05:44.041582 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 03 07:05:44 crc kubenswrapper[4947]: E1203 07:05:44.042373 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t8dh7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-mh7hb_openstack-operators(0f8b1d0b-522a-413b-a689-39044ec47286): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:05:44 crc kubenswrapper[4947]: I1203 07:05:44.600650 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:05:44 crc kubenswrapper[4947]: E1203 07:05:44.600795 4947 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 07:05:44 crc kubenswrapper[4947]: E1203 07:05:44.600873 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert podName:b30f0b12-f386-4968-98fd-e2272aa1b2f9 nodeName:}" failed. 
No retries permitted until 2025-12-03 07:06:00.600859313 +0000 UTC m=+1021.861813739 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert") pod "infra-operator-controller-manager-57548d458d-htpcw" (UID: "b30f0b12-f386-4968-98fd-e2272aa1b2f9") : secret "infra-operator-webhook-server-cert" not found Dec 03 07:05:45 crc kubenswrapper[4947]: I1203 07:05:45.004275 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:05:45 crc kubenswrapper[4947]: E1203 07:05:45.004579 4947 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:45 crc kubenswrapper[4947]: E1203 07:05:45.004683 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert podName:de69b4ac-b81b-4074-8e69-2ec717ecd70b nodeName:}" failed. No retries permitted until 2025-12-03 07:06:01.004655445 +0000 UTC m=+1022.265609881 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert") pod "openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" (UID: "de69b4ac-b81b-4074-8e69-2ec717ecd70b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 07:05:45 crc kubenswrapper[4947]: I1203 07:05:45.308833 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:45 crc kubenswrapper[4947]: I1203 07:05:45.308893 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:45 crc kubenswrapper[4947]: E1203 07:05:45.309029 4947 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 07:05:45 crc kubenswrapper[4947]: E1203 07:05:45.309079 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs podName:549a3cff-42c6-45ea-8e4c-36c4aa29457c nodeName:}" failed. No retries permitted until 2025-12-03 07:06:01.30906406 +0000 UTC m=+1022.570018486 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs") pod "openstack-operator-controller-manager-9f56fc979-wpbkc" (UID: "549a3cff-42c6-45ea-8e4c-36c4aa29457c") : secret "webhook-server-cert" not found Dec 03 07:05:45 crc kubenswrapper[4947]: I1203 07:05:45.319613 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-metrics-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:05:45 crc kubenswrapper[4947]: E1203 07:05:45.755898 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 03 07:05:45 crc kubenswrapper[4947]: E1203 07:05:45.756469 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2zssv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-qzhrh_openstack-operators(fd59ba56-51b7-4260-9b1f-e3bee0916e06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:05:46 crc kubenswrapper[4947]: E1203 07:05:46.208688 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ 
--logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jdwm7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-bdgzf_openstack-operators(fbf40e80-a5ef-41d4-ad63-b060d52be33f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:46 crc kubenswrapper[4947]: E1203 07:05:46.210011 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" podUID="fbf40e80-a5ef-41d4-ad63-b060d52be33f" Dec 03 07:05:46 crc kubenswrapper[4947]: E1203 07:05:46.211074 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sgb7n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-hfz6c_openstack-operators(1e069b5f-1572-4c42-b34e-74c49b4b6940): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 07:05:46 crc kubenswrapper[4947]: E1203 07:05:46.213332 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" podUID="1e069b5f-1572-4c42-b34e-74c49b4b6940" Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.303061 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" event={"ID":"83997d19-6166-445a-a2bd-15acf15fa18d","Type":"ContainerStarted","Data":"957848b69b8db51bf93690421f3b2661c7d9aaeade39385c755ffe063938ad59"} Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.313751 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" event={"ID":"1e069b5f-1572-4c42-b34e-74c49b4b6940","Type":"ContainerStarted","Data":"6e0117b691364c0c8ada1b50120e6fdc127e1dc41bdce251da8727f0d79a096d"} Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.313907 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" Dec 03 07:05:46 crc kubenswrapper[4947]: E1203 07:05:46.317196 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" podUID="1e069b5f-1572-4c42-b34e-74c49b4b6940" Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.324711 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" event={"ID":"888543d1-0ade-4e16-8965-a0ecb6fd65a7","Type":"ContainerStarted","Data":"60cbdd0bdcdf3cddc019366e5e16a9b700ccee08625a666ad9994f3bd2141ca6"} Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.340806 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz" event={"ID":"b548d942-a26b-47aa-b352-23afd3148288","Type":"ContainerStarted","Data":"a25f191eba91b9128c87c9da17f1369ec247cc86d706ac3d7a161a9789233118"} Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 
07:05:46.345311 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" event={"ID":"fbf40e80-a5ef-41d4-ad63-b060d52be33f","Type":"ContainerStarted","Data":"45b5f11f1b14e924bd23dab8f50085cd80811e069bee23c707d6b0c95f25dfd4"} Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.345855 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" Dec 03 07:05:46 crc kubenswrapper[4947]: E1203 07:05:46.347318 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" podUID="fbf40e80-a5ef-41d4-ad63-b060d52be33f" Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.353063 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr" event={"ID":"76116ad0-e325-41c1-a25e-9089331c52ba","Type":"ContainerStarted","Data":"d0b163bc27e487a296a0dcf1d93be7fe0b6ed00bc2919fc4f1fd6c367f92a4f0"} Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.365417 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d" event={"ID":"f67211f5-3446-471e-8124-52d1d18dadbe","Type":"ContainerStarted","Data":"c02030f85ba37dc51320a85f3081c133ea1726caa2f4d7bc74ed21568add8811"} Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.367610 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" event={"ID":"528fb24c-b835-446a-84c8-fce6b4e4815c","Type":"ContainerStarted","Data":"49336c0503840e7d007d164551a7d9625073ce525a9c4fe258889a1b2e3be02f"} Dec 03 07:05:46 crc 
kubenswrapper[4947]: I1203 07:05:46.378774 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" event={"ID":"e266debc-b844-4b60-bbbf-c038b61a7ab8","Type":"ContainerStarted","Data":"403814104acc91b27ce57306161068d787e3634b86096edecabeccc6f91a3939"} Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.396850 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp" event={"ID":"59be11f8-72a2-45cc-b690-951bda0d87be","Type":"ContainerStarted","Data":"7a81ebdf5f6b14b59997d12d9d3a17b2fec339d812a0ddfdee3556481282a43b"} Dec 03 07:05:46 crc kubenswrapper[4947]: I1203 07:05:46.442729 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn" event={"ID":"0681e281-d288-430f-b175-1d5c36593c9a","Type":"ContainerStarted","Data":"af541a6c3486d87c85fcbdd29d3296df2b1779ebb37b548b55ea95c57feaef8f"} Dec 03 07:05:47 crc kubenswrapper[4947]: E1203 07:05:47.463756 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" podUID="fbf40e80-a5ef-41d4-ad63-b060d52be33f" Dec 03 07:05:47 crc kubenswrapper[4947]: E1203 07:05:47.463760 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" podUID="1e069b5f-1572-4c42-b34e-74c49b4b6940" Dec 03 07:05:59 crc kubenswrapper[4947]: I1203 07:05:59.014407 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" Dec 03 07:05:59 crc kubenswrapper[4947]: I1203 07:05:59.178775 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" Dec 03 07:05:59 crc kubenswrapper[4947]: E1203 07:05:59.595480 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 03 07:05:59 crc kubenswrapper[4947]: E1203 07:05:59.595692 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2tbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-gl8qx_openstack-operators(13d55077-1827-43a0-a985-85db61855cb3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:06:00 crc kubenswrapper[4947]: E1203 07:06:00.156249 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621" Dec 03 07:06:00 crc kubenswrapper[4947]: E1203 07:06:00.156456 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b4w9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-vksgw_openstack-operators(bab921ef-a66f-48fe-87fc-fa040bc09b2e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:06:00 crc kubenswrapper[4947]: I1203 07:06:00.652483 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:06:00 crc kubenswrapper[4947]: I1203 07:06:00.658552 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b30f0b12-f386-4968-98fd-e2272aa1b2f9-cert\") pod \"infra-operator-controller-manager-57548d458d-htpcw\" (UID: \"b30f0b12-f386-4968-98fd-e2272aa1b2f9\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:06:00 crc kubenswrapper[4947]: E1203 07:06:00.708992 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 07:06:00 crc kubenswrapper[4947]: E1203 07:06:00.709387 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r7fqt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-nvsv7_openstack-operators(528fb24c-b835-446a-84c8-fce6b4e4815c): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 07:06:00 crc kubenswrapper[4947]: E1203 07:06:00.710589 4947 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" podUID="528fb24c-b835-446a-84c8-fce6b4e4815c" Dec 03 07:06:00 crc kubenswrapper[4947]: E1203 07:06:00.714788 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 03 07:06:00 crc kubenswrapper[4947]: E1203 07:06:00.714947 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w2kl6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-hflxr_openstack-operators(b537173f-b7c8-426d-bf40-8bb6ece17177): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:06:00 crc kubenswrapper[4947]: I1203 07:06:00.909256 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-z5b89" Dec 03 07:06:00 crc kubenswrapper[4947]: I1203 07:06:00.917972 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.059398 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.065197 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de69b4ac-b81b-4074-8e69-2ec717ecd70b-cert\") pod \"openstack-baremetal-operator-controller-manager-55d86b6686l6m2r\" (UID: \"de69b4ac-b81b-4074-8e69-2ec717ecd70b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.184230 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-p7vnh" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.192716 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.363086 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.370228 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/549a3cff-42c6-45ea-8e4c-36c4aa29457c-webhook-certs\") pod \"openstack-operator-controller-manager-9f56fc979-wpbkc\" (UID: \"549a3cff-42c6-45ea-8e4c-36c4aa29457c\") " pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.553608 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.572990 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.611762 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-sppd8" Dec 03 07:06:01 crc kubenswrapper[4947]: I1203 07:06:01.620312 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.391367 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.391529 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5lw6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
nova-operator-controller-manager-697bc559fc-lwmcn_openstack-operators(0681e281-d288-430f-b175-1d5c36593c9a): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.392702 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn" podUID="0681e281-d288-430f-b175-1d5c36593c9a" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.395118 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.395245 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2wnp2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-mg8v4_openstack-operators(e266debc-b844-4b60-bbbf-c038b61a7ab8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.396469 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" podUID="e266debc-b844-4b60-bbbf-c038b61a7ab8" Dec 03 07:06:02 crc kubenswrapper[4947]: I1203 07:06:02.558283 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" Dec 03 07:06:02 crc kubenswrapper[4947]: I1203 07:06:02.558355 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn" Dec 03 07:06:02 crc kubenswrapper[4947]: I1203 07:06:02.561283 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" Dec 03 07:06:02 crc kubenswrapper[4947]: I1203 07:06:02.561330 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.892773 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.895944 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7pd7p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-n9p7d_openstack-operators(f67211f5-3446-471e-8124-52d1d18dadbe): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.898922 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d" podUID="f67211f5-3446-471e-8124-52d1d18dadbe" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.993933 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.994482 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qk2vt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-9bpvr_openstack-operators(76116ad0-e325-41c1-a25e-9089331c52ba): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:06:02 crc kubenswrapper[4947]: E1203 07:06:02.996106 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr" 
podUID="76116ad0-e325-41c1-a25e-9089331c52ba" Dec 03 07:06:03 crc kubenswrapper[4947]: E1203 07:06:03.028298 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 07:06:03 crc kubenswrapper[4947]: E1203 07:06:03.028485 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-htzs9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-g9h4f_openstack-operators(83997d19-6166-445a-a2bd-15acf15fa18d): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:06:03 crc kubenswrapper[4947]: E1203 07:06:03.030147 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" podUID="83997d19-6166-445a-a2bd-15acf15fa18d" Dec 03 07:06:03 crc kubenswrapper[4947]: E1203 07:06:03.045407 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 07:06:03 crc kubenswrapper[4947]: E1203 07:06:03.045567 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2zssv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-qzhrh_openstack-operators(fd59ba56-51b7-4260-9b1f-e3bee0916e06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:06:03 crc kubenswrapper[4947]: E1203 07:06:03.046771 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh" podUID="fd59ba56-51b7-4260-9b1f-e3bee0916e06" Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.592008 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r"] Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.592450 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" event={"ID":"6999c194-78a5-48db-9e56-59f65d9e11c1","Type":"ContainerStarted","Data":"e21bd7e5cd57de71f35015999747d256853707771315011edc16b0b04cb69e03"} Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.609740 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp" event={"ID":"59be11f8-72a2-45cc-b690-951bda0d87be","Type":"ContainerStarted","Data":"5e79dead61ecfbaa8ca83729b1b43aee6fd5d94da3c0a10169019f73624354ab"} Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.616572 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg" event={"ID":"e8e59da3-af36-42c7-9c78-98608089eaea","Type":"ContainerStarted","Data":"842bd9b2ab36bcf612301a565165b09938692285f64e97c3a90b1a68a0cee182"} Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.621276 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g" event={"ID":"0703a4f3-6732-44a1-b690-fcea6eb2228d","Type":"ContainerStarted","Data":"050ab20fbb0a540564e9dd4fcfd750692e926a78c5cfdc558d982782462f04b9"} Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.621468 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr" Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.621636 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.622178 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d" Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.627804 4947 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr" Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.628160 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d" Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.628329 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.650560 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-htpcw"] Dec 03 07:06:03 crc kubenswrapper[4947]: E1203 07:06:03.754998 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" podUID="b537173f-b7c8-426d-bf40-8bb6ece17177" Dec 03 07:06:03 crc kubenswrapper[4947]: I1203 07:06:03.756454 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc"] Dec 03 07:06:03 crc kubenswrapper[4947]: E1203 07:06:03.885841 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" podUID="13d55077-1827-43a0-a985-85db61855cb3" Dec 03 07:06:03 crc kubenswrapper[4947]: E1203 07:06:03.980772 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb" podUID="0f8b1d0b-522a-413b-a689-39044ec47286" Dec 03 07:06:04 crc kubenswrapper[4947]: E1203 07:06:04.263556 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" podUID="bab921ef-a66f-48fe-87fc-fa040bc09b2e" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.632731 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" event={"ID":"13d55077-1827-43a0-a985-85db61855cb3","Type":"ContainerStarted","Data":"8a22474c4ac7691700bef8c308064dc5772a3d9aaf326549596964ef893bacde"} Dec 03 07:06:04 crc kubenswrapper[4947]: E1203 07:06:04.637798 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" podUID="13d55077-1827-43a0-a985-85db61855cb3" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.645673 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz" event={"ID":"b548d942-a26b-47aa-b352-23afd3148288","Type":"ContainerStarted","Data":"963861d9fde840184164611290ff13d977b2b16eba99c3992b1ee7d7980b6dcf"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.646527 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.655843 4947 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.666887 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv" event={"ID":"ee4fc346-902a-4be8-9bf4-081b58b2c547","Type":"ContainerStarted","Data":"016e76a29b884f9218e18c830e8c3bab087cbb4c74772c0567898da22f76f9c4"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.699857 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" event={"ID":"888543d1-0ade-4e16-8965-a0ecb6fd65a7","Type":"ContainerStarted","Data":"a42169e39c9eb3e92c5e64503e35762407cb004bc51656c3d8b4409c35443aef"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.700646 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.712756 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.725247 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-b5bpz" podStartSLOduration=4.433014472 podStartE2EDuration="36.725225407s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.921910225 +0000 UTC m=+992.182864641" lastFinishedPulling="2025-12-03 07:06:03.21412115 +0000 UTC m=+1024.475075576" observedRunningTime="2025-12-03 07:06:04.693753569 +0000 UTC m=+1025.954707995" watchObservedRunningTime="2025-12-03 07:06:04.725225407 +0000 UTC m=+1025.986179833" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.726554 4947 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" event={"ID":"e266debc-b844-4b60-bbbf-c038b61a7ab8","Type":"ContainerStarted","Data":"de625943db888accb72381b4f194c0ba70b61bb3b329ce91332c790d23396336"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.729089 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-sk2lv" podStartSLOduration=3.656313911 podStartE2EDuration="35.729074501s" podCreationTimestamp="2025-12-03 07:05:29 +0000 UTC" firstStartedPulling="2025-12-03 07:05:31.112135342 +0000 UTC m=+992.373089768" lastFinishedPulling="2025-12-03 07:06:03.184895922 +0000 UTC m=+1024.445850358" observedRunningTime="2025-12-03 07:06:04.712299119 +0000 UTC m=+1025.973253545" watchObservedRunningTime="2025-12-03 07:06:04.729074501 +0000 UTC m=+1025.990028927" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.750616 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-26vkh" podStartSLOduration=4.078011455 podStartE2EDuration="36.750599361s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.516273133 +0000 UTC m=+991.777227559" lastFinishedPulling="2025-12-03 07:06:03.188861019 +0000 UTC m=+1024.449815465" observedRunningTime="2025-12-03 07:06:04.733149801 +0000 UTC m=+1025.994104217" watchObservedRunningTime="2025-12-03 07:06:04.750599361 +0000 UTC m=+1026.011553787" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.778803 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-mg8v4" podStartSLOduration=21.927185237 podStartE2EDuration="36.77878457s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.919658645 +0000 UTC m=+992.180613071" 
lastFinishedPulling="2025-12-03 07:05:45.771257978 +0000 UTC m=+1007.032212404" observedRunningTime="2025-12-03 07:06:04.751931137 +0000 UTC m=+1026.012885563" watchObservedRunningTime="2025-12-03 07:06:04.77878457 +0000 UTC m=+1026.039738996" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.779749 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" event={"ID":"fbf40e80-a5ef-41d4-ad63-b060d52be33f","Type":"ContainerStarted","Data":"8c28fabd6ac28111fcd2fa639dcce48ea644d783468d4085a12bbb4b374e4806"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.786886 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg" event={"ID":"e8e59da3-af36-42c7-9c78-98608089eaea","Type":"ContainerStarted","Data":"53346eb919ecc68ac201d1f053b45bbd2af4d5aac3b4749061f306a45eb2665c"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.787518 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.826970 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" event={"ID":"bab921ef-a66f-48fe-87fc-fa040bc09b2e","Type":"ContainerStarted","Data":"472689a8bbeea018466d7475100f7dd269d57d24552edd531458b62fdf498d3a"} Dec 03 07:06:04 crc kubenswrapper[4947]: E1203 07:06:04.828377 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" podUID="bab921ef-a66f-48fe-87fc-fa040bc09b2e" Dec 03 07:06:04 crc 
kubenswrapper[4947]: I1203 07:06:04.830793 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-bdgzf" podStartSLOduration=4.40658275 podStartE2EDuration="36.830772991s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.91427278 +0000 UTC m=+992.175227206" lastFinishedPulling="2025-12-03 07:06:03.338463021 +0000 UTC m=+1024.599417447" observedRunningTime="2025-12-03 07:06:04.820956887 +0000 UTC m=+1026.081911313" watchObservedRunningTime="2025-12-03 07:06:04.830772991 +0000 UTC m=+1026.091727417" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.834896 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" event={"ID":"de69b4ac-b81b-4074-8e69-2ec717ecd70b","Type":"ContainerStarted","Data":"c8c059f4c722f0a18d67119195d4fc36b1a7b171761e0e2f1615d6c527582efd"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.860419 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg" podStartSLOduration=4.825841302 podStartE2EDuration="36.860406061s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:31.087989072 +0000 UTC m=+992.348943498" lastFinishedPulling="2025-12-03 07:06:03.122553831 +0000 UTC m=+1024.383508257" observedRunningTime="2025-12-03 07:06:04.857590975 +0000 UTC m=+1026.118545411" watchObservedRunningTime="2025-12-03 07:06:04.860406061 +0000 UTC m=+1026.121360487" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.871175 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" 
event={"ID":"1e069b5f-1572-4c42-b34e-74c49b4b6940","Type":"ContainerStarted","Data":"e452ae130ac9cb0af7a444ae70c9d8f3659919408223aee573481dbafae52f95"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.907825 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn" event={"ID":"0681e281-d288-430f-b175-1d5c36593c9a","Type":"ContainerStarted","Data":"0f28b3195a97a1af0e039a70ce5cf190fee19af1b6b461dd0ac72b05141093b7"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.943681 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" event={"ID":"b537173f-b7c8-426d-bf40-8bb6ece17177","Type":"ContainerStarted","Data":"6e89db32d59d2ebb0be1a1e4f2b140dd48fcf02e01d94fb7ce931f6d3af53cb6"} Dec 03 07:06:04 crc kubenswrapper[4947]: E1203 07:06:04.949661 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" podUID="b537173f-b7c8-426d-bf40-8bb6ece17177" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.956668 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-lwmcn" podStartSLOduration=22.245114486 podStartE2EDuration="36.956654994s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:31.079121413 +0000 UTC m=+992.340075839" lastFinishedPulling="2025-12-03 07:05:45.790661921 +0000 UTC m=+1007.051616347" observedRunningTime="2025-12-03 07:06:04.950135819 +0000 UTC m=+1026.211090245" watchObservedRunningTime="2025-12-03 07:06:04.956654994 +0000 UTC m=+1026.217609420" Dec 03 07:06:04 crc 
kubenswrapper[4947]: I1203 07:06:04.962936 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-hfz6c" podStartSLOduration=4.072290931 podStartE2EDuration="36.962924524s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.502931773 +0000 UTC m=+991.763886199" lastFinishedPulling="2025-12-03 07:06:03.393565366 +0000 UTC m=+1024.654519792" observedRunningTime="2025-12-03 07:06:04.923652325 +0000 UTC m=+1026.184606751" watchObservedRunningTime="2025-12-03 07:06:04.962924524 +0000 UTC m=+1026.223878950" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.963813 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb" event={"ID":"0f8b1d0b-522a-413b-a689-39044ec47286","Type":"ContainerStarted","Data":"8a88ceac2f37691cf1fa05e0ece87c490ff2fe8adab1dcb2488fdd048a9352d9"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.970900 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" event={"ID":"528fb24c-b835-446a-84c8-fce6b4e4815c","Type":"ContainerStarted","Data":"58e47bac4cebf74853b85773bfb7290156555a796709460215db5d6a910f8920"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.982036 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" event={"ID":"549a3cff-42c6-45ea-8e4c-36c4aa29457c","Type":"ContainerStarted","Data":"9019a6402afe79db2bc07576bb2e9f5abcd971e0ab018fdf20becccf36e0ff1f"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.982597 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.999273 4947 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" event={"ID":"b30f0b12-f386-4968-98fd-e2272aa1b2f9","Type":"ContainerStarted","Data":"48afe6d63785d629f5dd6ffeebb72928ba280afd807fcc8bcdf55bedf23349a5"} Dec 03 07:06:04 crc kubenswrapper[4947]: I1203 07:06:04.999315 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp" Dec 03 07:06:05 crc kubenswrapper[4947]: I1203 07:06:05.003668 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp" Dec 03 07:06:05 crc kubenswrapper[4947]: I1203 07:06:05.046751 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-nvsv7" podStartSLOduration=21.743692222 podStartE2EDuration="37.046731623s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.466591103 +0000 UTC m=+991.727545529" lastFinishedPulling="2025-12-03 07:05:45.769630494 +0000 UTC m=+1007.030584930" observedRunningTime="2025-12-03 07:06:05.031139023 +0000 UTC m=+1026.292093449" watchObservedRunningTime="2025-12-03 07:06:05.046731623 +0000 UTC m=+1026.307686049" Dec 03 07:06:05 crc kubenswrapper[4947]: I1203 07:06:05.069131 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" podStartSLOduration=36.069113636 podStartE2EDuration="36.069113636s" podCreationTimestamp="2025-12-03 07:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:06:05.067397779 +0000 UTC m=+1026.328352205" watchObservedRunningTime="2025-12-03 07:06:05.069113636 +0000 UTC m=+1026.330068062" Dec 03 07:06:05 crc kubenswrapper[4947]: I1203 07:06:05.097996 
4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-lt6cp" podStartSLOduration=4.796222541 podStartE2EDuration="37.097981253s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.917920387 +0000 UTC m=+992.178874813" lastFinishedPulling="2025-12-03 07:06:03.219679099 +0000 UTC m=+1024.480633525" observedRunningTime="2025-12-03 07:06:05.090838921 +0000 UTC m=+1026.351793347" watchObservedRunningTime="2025-12-03 07:06:05.097981253 +0000 UTC m=+1026.358935679" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.014538 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" event={"ID":"83997d19-6166-445a-a2bd-15acf15fa18d","Type":"ContainerStarted","Data":"d2c47673eed02cd67b891d993886af2db178462d72748e9ad76fc14a6b00455d"} Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.018009 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh" event={"ID":"fd59ba56-51b7-4260-9b1f-e3bee0916e06","Type":"ContainerStarted","Data":"c73ecb5f34b743020c78dfb7dde5151dc447547fb2bc78e85132e432a2592fda"} Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.018050 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh" event={"ID":"fd59ba56-51b7-4260-9b1f-e3bee0916e06","Type":"ContainerStarted","Data":"6a2601a1e9a3291acd0f3672a01092efa614ce97dfa4233e7eb268f8ca7f1edf"} Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.018363 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.020592 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g" event={"ID":"0703a4f3-6732-44a1-b690-fcea6eb2228d","Type":"ContainerStarted","Data":"d614a38244152e1c2db9b2bc9ba4753f995e48b7dbf3d666b138b6821688e135"} Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.020725 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.022346 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb" event={"ID":"0f8b1d0b-522a-413b-a689-39044ec47286","Type":"ContainerStarted","Data":"7f05e477237ea04b9b7a9f29ab027df1ff42e1e93dde95e23a292a7e7d5617ea"} Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.022506 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.024796 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" event={"ID":"6999c194-78a5-48db-9e56-59f65d9e11c1","Type":"ContainerStarted","Data":"2e5b42578e4295b06f29be86f74b356fc38e49294ec5b934493688aff9f386db"} Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.024893 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.027592 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" event={"ID":"549a3cff-42c6-45ea-8e4c-36c4aa29457c","Type":"ContainerStarted","Data":"ae659ae54b93f86683bdddb8d242507ef7e69b527bdbc7b46ef9790b9cd83627"} Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.032016 4947 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-g9h4f" podStartSLOduration=22.749334187 podStartE2EDuration="38.032001308s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.487811695 +0000 UTC m=+991.748766131" lastFinishedPulling="2025-12-03 07:05:45.770478826 +0000 UTC m=+1007.031433252" observedRunningTime="2025-12-03 07:06:06.028681528 +0000 UTC m=+1027.289635944" watchObservedRunningTime="2025-12-03 07:06:06.032001308 +0000 UTC m=+1027.292955734" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.043792 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr" event={"ID":"76116ad0-e325-41c1-a25e-9089331c52ba","Type":"ContainerStarted","Data":"f2ca7c3e62431b022fc75e87abef123a00891cd927d229e854e7a765e5aa2037"} Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.051156 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d" event={"ID":"f67211f5-3446-471e-8124-52d1d18dadbe","Type":"ContainerStarted","Data":"c0d6618b5aa231812895223b41947d4d13325728f57f3486a9f4c3677d2bd1fb"} Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.062423 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb" podStartSLOduration=3.434954693 podStartE2EDuration="38.062398027s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.926875099 +0000 UTC m=+992.187829525" lastFinishedPulling="2025-12-03 07:06:05.554318433 +0000 UTC m=+1026.815272859" observedRunningTime="2025-12-03 07:06:06.061226135 +0000 UTC m=+1027.322180571" watchObservedRunningTime="2025-12-03 07:06:06.062398027 +0000 UTC m=+1027.323352453" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 
07:06:06.088308 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g" podStartSLOduration=6.294452933 podStartE2EDuration="38.088287525s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:31.097042245 +0000 UTC m=+992.357996661" lastFinishedPulling="2025-12-03 07:06:02.890876827 +0000 UTC m=+1024.151831253" observedRunningTime="2025-12-03 07:06:06.084470352 +0000 UTC m=+1027.345424778" watchObservedRunningTime="2025-12-03 07:06:06.088287525 +0000 UTC m=+1027.349241951" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.107597 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" podStartSLOduration=7.20519287 podStartE2EDuration="38.107576265s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.972442007 +0000 UTC m=+992.233396433" lastFinishedPulling="2025-12-03 07:06:01.874825402 +0000 UTC m=+1023.135779828" observedRunningTime="2025-12-03 07:06:06.105971631 +0000 UTC m=+1027.366926077" watchObservedRunningTime="2025-12-03 07:06:06.107576265 +0000 UTC m=+1027.368530711" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.123865 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh" podStartSLOduration=4.954276862 podStartE2EDuration="38.123849173s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.930503117 +0000 UTC m=+992.191457543" lastFinishedPulling="2025-12-03 07:06:04.100075428 +0000 UTC m=+1025.361029854" observedRunningTime="2025-12-03 07:06:06.121406747 +0000 UTC m=+1027.382361173" watchObservedRunningTime="2025-12-03 07:06:06.123849173 +0000 UTC m=+1027.384803599" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.144042 
4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-n9p7d" podStartSLOduration=23.3420031 podStartE2EDuration="38.144020427s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.935525302 +0000 UTC m=+992.196479728" lastFinishedPulling="2025-12-03 07:05:45.737542589 +0000 UTC m=+1006.998497055" observedRunningTime="2025-12-03 07:06:06.13669857 +0000 UTC m=+1027.397652996" watchObservedRunningTime="2025-12-03 07:06:06.144020427 +0000 UTC m=+1027.404974853" Dec 03 07:06:06 crc kubenswrapper[4947]: I1203 07:06:06.165322 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-9bpvr" podStartSLOduration=22.35826965 podStartE2EDuration="37.165306191s" podCreationTimestamp="2025-12-03 07:05:29 +0000 UTC" firstStartedPulling="2025-12-03 07:05:30.965077139 +0000 UTC m=+992.226031565" lastFinishedPulling="2025-12-03 07:05:45.77211367 +0000 UTC m=+1007.033068106" observedRunningTime="2025-12-03 07:06:06.16195684 +0000 UTC m=+1027.422911266" watchObservedRunningTime="2025-12-03 07:06:06.165306191 +0000 UTC m=+1027.426260617" Dec 03 07:06:08 crc kubenswrapper[4947]: I1203 07:06:08.066429 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" event={"ID":"de69b4ac-b81b-4074-8e69-2ec717ecd70b","Type":"ContainerStarted","Data":"00c3ace4f8881747e1e7e191fc60075bc25fbe310ddaf88c4c30aa9e5fe73c66"} Dec 03 07:06:08 crc kubenswrapper[4947]: I1203 07:06:08.068556 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:06:08 crc kubenswrapper[4947]: I1203 07:06:08.068593 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" event={"ID":"de69b4ac-b81b-4074-8e69-2ec717ecd70b","Type":"ContainerStarted","Data":"c01b138f72450c8c2cfbcf4ce0b173140af7cdd78edd9e1ff66970a0a0403d3a"} Dec 03 07:06:08 crc kubenswrapper[4947]: I1203 07:06:08.070780 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" event={"ID":"b30f0b12-f386-4968-98fd-e2272aa1b2f9","Type":"ContainerStarted","Data":"a3784ace34aa712ef2cb66656c4bb368e97a8871467384dab8b30a0ec693181b"} Dec 03 07:06:08 crc kubenswrapper[4947]: I1203 07:06:08.070828 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" event={"ID":"b30f0b12-f386-4968-98fd-e2272aa1b2f9","Type":"ContainerStarted","Data":"80f9014e5946822a882411aeedfae02bf1a8a2b63ef560a8a0e3fe445c198bfa"} Dec 03 07:06:08 crc kubenswrapper[4947]: I1203 07:06:08.071188 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:06:08 crc kubenswrapper[4947]: I1203 07:06:08.113478 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" podStartSLOduration=36.112844492 podStartE2EDuration="40.113453278s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:06:03.666925393 +0000 UTC m=+1024.927879819" lastFinishedPulling="2025-12-03 07:06:07.667534179 +0000 UTC m=+1028.928488605" observedRunningTime="2025-12-03 07:06:08.106953962 +0000 UTC m=+1029.367908388" watchObservedRunningTime="2025-12-03 07:06:08.113453278 +0000 UTC m=+1029.374407704" Dec 03 07:06:08 crc kubenswrapper[4947]: I1203 07:06:08.126271 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" podStartSLOduration=36.297376186 podStartE2EDuration="40.126256283s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:06:03.818435377 +0000 UTC m=+1025.079389803" lastFinishedPulling="2025-12-03 07:06:07.647315464 +0000 UTC m=+1028.908269900" observedRunningTime="2025-12-03 07:06:08.125079351 +0000 UTC m=+1029.386033777" watchObservedRunningTime="2025-12-03 07:06:08.126256283 +0000 UTC m=+1029.387210709" Dec 03 07:06:09 crc kubenswrapper[4947]: I1203 07:06:09.240164 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-qzhrh" Dec 03 07:06:09 crc kubenswrapper[4947]: I1203 07:06:09.702248 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-72sgg" Dec 03 07:06:09 crc kubenswrapper[4947]: I1203 07:06:09.747284 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-llv5g" Dec 03 07:06:09 crc kubenswrapper[4947]: I1203 07:06:09.961438 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-gdq5s" Dec 03 07:06:11 crc kubenswrapper[4947]: I1203 07:06:11.627627 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-9f56fc979-wpbkc" Dec 03 07:06:17 crc kubenswrapper[4947]: E1203 07:06:17.085539 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" podUID="bab921ef-a66f-48fe-87fc-fa040bc09b2e" Dec 03 07:06:18 crc kubenswrapper[4947]: E1203 07:06:18.088432 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" podUID="13d55077-1827-43a0-a985-85db61855cb3" Dec 03 07:06:19 crc kubenswrapper[4947]: I1203 07:06:19.673977 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-mh7hb" Dec 03 07:06:20 crc kubenswrapper[4947]: E1203 07:06:20.084312 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" podUID="b537173f-b7c8-426d-bf40-8bb6ece17177" Dec 03 07:06:20 crc kubenswrapper[4947]: I1203 07:06:20.926412 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-htpcw" Dec 03 07:06:21 crc kubenswrapper[4947]: I1203 07:06:21.198921 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55d86b6686l6m2r" Dec 03 07:06:28 crc kubenswrapper[4947]: I1203 07:06:28.084553 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:06:29 crc kubenswrapper[4947]: I1203 07:06:29.218755 4947 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" event={"ID":"bab921ef-a66f-48fe-87fc-fa040bc09b2e","Type":"ContainerStarted","Data":"9052e42209a841ced8d4075ec2f662558a1672e8e3ebb7ab91d9c780d89229d3"} Dec 03 07:06:29 crc kubenswrapper[4947]: I1203 07:06:29.219321 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" Dec 03 07:06:29 crc kubenswrapper[4947]: I1203 07:06:29.247161 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" podStartSLOduration=2.776774085 podStartE2EDuration="1m0.247138316s" podCreationTimestamp="2025-12-03 07:05:29 +0000 UTC" firstStartedPulling="2025-12-03 07:05:31.101508946 +0000 UTC m=+992.362463372" lastFinishedPulling="2025-12-03 07:06:28.571873167 +0000 UTC m=+1049.832827603" observedRunningTime="2025-12-03 07:06:29.236623142 +0000 UTC m=+1050.497577598" watchObservedRunningTime="2025-12-03 07:06:29.247138316 +0000 UTC m=+1050.508092752" Dec 03 07:06:30 crc kubenswrapper[4947]: I1203 07:06:30.086314 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:06:30 crc kubenswrapper[4947]: I1203 07:06:30.086399 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:06:36 crc kubenswrapper[4947]: I1203 07:06:36.306313 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" event={"ID":"13d55077-1827-43a0-a985-85db61855cb3","Type":"ContainerStarted","Data":"ff98de27568b615ac92755fa95c3f8348789a92ac3ed8127290a2dbc3cd779c8"} Dec 03 07:06:36 crc kubenswrapper[4947]: I1203 07:06:36.306919 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" Dec 03 07:06:36 crc kubenswrapper[4947]: I1203 07:06:36.308513 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" event={"ID":"b537173f-b7c8-426d-bf40-8bb6ece17177","Type":"ContainerStarted","Data":"2c4fc55b24add8a8b9a051494bb163496b3c597acefbad5c1252905e82f53ba4"} Dec 03 07:06:36 crc kubenswrapper[4947]: I1203 07:06:36.308633 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" Dec 03 07:06:36 crc kubenswrapper[4947]: I1203 07:06:36.329105 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" podStartSLOduration=4.173732144 podStartE2EDuration="1m8.3290916s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" firstStartedPulling="2025-12-03 07:05:31.096959853 +0000 UTC m=+992.357914269" lastFinishedPulling="2025-12-03 07:06:35.252319299 +0000 UTC m=+1056.513273725" observedRunningTime="2025-12-03 07:06:36.323689895 +0000 UTC m=+1057.584644321" watchObservedRunningTime="2025-12-03 07:06:36.3290916 +0000 UTC m=+1057.590046026" Dec 03 07:06:36 crc kubenswrapper[4947]: I1203 07:06:36.356575 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" podStartSLOduration=4.314654762 podStartE2EDuration="1m8.35655212s" podCreationTimestamp="2025-12-03 07:05:28 +0000 UTC" 
firstStartedPulling="2025-12-03 07:05:31.091127096 +0000 UTC m=+992.352081522" lastFinishedPulling="2025-12-03 07:06:35.133024424 +0000 UTC m=+1056.393978880" observedRunningTime="2025-12-03 07:06:36.353842668 +0000 UTC m=+1057.614797094" watchObservedRunningTime="2025-12-03 07:06:36.35655212 +0000 UTC m=+1057.617506546" Dec 03 07:06:40 crc kubenswrapper[4947]: I1203 07:06:40.095149 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-vksgw" Dec 03 07:06:49 crc kubenswrapper[4947]: I1203 07:06:49.148737 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-gl8qx" Dec 03 07:06:49 crc kubenswrapper[4947]: I1203 07:06:49.287607 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-hflxr" Dec 03 07:07:00 crc kubenswrapper[4947]: I1203 07:07:00.086407 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:07:00 crc kubenswrapper[4947]: I1203 07:07:00.086929 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.028070 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-hkrwn"] Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.030192 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.035194 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.035291 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.035319 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.035442 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-qzmnm" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.054112 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5wg\" (UniqueName: \"kubernetes.io/projected/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-kube-api-access-pk5wg\") pod \"dnsmasq-dns-5cd484bb89-hkrwn\" (UID: \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\") " pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.054184 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-config\") pod \"dnsmasq-dns-5cd484bb89-hkrwn\" (UID: \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\") " pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.057568 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-hkrwn"] Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.085717 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567c455747-g4bds"] Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.087169 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.089473 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.092839 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-g4bds"] Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.155048 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5wg\" (UniqueName: \"kubernetes.io/projected/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-kube-api-access-pk5wg\") pod \"dnsmasq-dns-5cd484bb89-hkrwn\" (UID: \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\") " pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.155469 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-config\") pod \"dnsmasq-dns-5cd484bb89-hkrwn\" (UID: \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\") " pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.155526 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckwnk\" (UniqueName: \"kubernetes.io/projected/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-kube-api-access-ckwnk\") pod \"dnsmasq-dns-567c455747-g4bds\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.155565 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-dns-svc\") pod \"dnsmasq-dns-567c455747-g4bds\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 
crc kubenswrapper[4947]: I1203 07:07:06.155617 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-config\") pod \"dnsmasq-dns-567c455747-g4bds\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.156944 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-config\") pod \"dnsmasq-dns-5cd484bb89-hkrwn\" (UID: \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\") " pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.175332 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk5wg\" (UniqueName: \"kubernetes.io/projected/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-kube-api-access-pk5wg\") pod \"dnsmasq-dns-5cd484bb89-hkrwn\" (UID: \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\") " pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.256314 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-dns-svc\") pod \"dnsmasq-dns-567c455747-g4bds\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.256388 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-config\") pod \"dnsmasq-dns-567c455747-g4bds\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.256469 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ckwnk\" (UniqueName: \"kubernetes.io/projected/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-kube-api-access-ckwnk\") pod \"dnsmasq-dns-567c455747-g4bds\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.258230 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-dns-svc\") pod \"dnsmasq-dns-567c455747-g4bds\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.259295 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-config\") pod \"dnsmasq-dns-567c455747-g4bds\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.274576 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckwnk\" (UniqueName: \"kubernetes.io/projected/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-kube-api-access-ckwnk\") pod \"dnsmasq-dns-567c455747-g4bds\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.353331 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.408379 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.780720 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-hkrwn"] Dec 03 07:07:06 crc kubenswrapper[4947]: I1203 07:07:06.883363 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567c455747-g4bds"] Dec 03 07:07:07 crc kubenswrapper[4947]: I1203 07:07:07.570594 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-g4bds" event={"ID":"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce","Type":"ContainerStarted","Data":"fe03120c1fbcbdbba698c720a903e48f463b2c02dd24ee49e6173129e3278dd2"} Dec 03 07:07:07 crc kubenswrapper[4947]: I1203 07:07:07.572533 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" event={"ID":"4ce4dd53-a622-469e-b1b8-f042fb62cdb8","Type":"ContainerStarted","Data":"734be8a14e6c85c25357a240a29d5d499803062e26fd2c1430a5fb4c60db2459"} Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.569389 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-hkrwn"] Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.618253 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whnjz"] Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.622163 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.631292 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whnjz"] Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.723433 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-whnjz\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.723484 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqqqn\" (UniqueName: \"kubernetes.io/projected/c1b40298-733b-476a-aeca-4c0025a86a98-kube-api-access-tqqqn\") pod \"dnsmasq-dns-bc4b48fc9-whnjz\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.723520 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-config\") pod \"dnsmasq-dns-bc4b48fc9-whnjz\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.826432 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-whnjz\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.826511 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqqqn\" (UniqueName: 
\"kubernetes.io/projected/c1b40298-733b-476a-aeca-4c0025a86a98-kube-api-access-tqqqn\") pod \"dnsmasq-dns-bc4b48fc9-whnjz\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.826548 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-config\") pod \"dnsmasq-dns-bc4b48fc9-whnjz\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.827534 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-config\") pod \"dnsmasq-dns-bc4b48fc9-whnjz\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.828149 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-dns-svc\") pod \"dnsmasq-dns-bc4b48fc9-whnjz\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.844391 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-g4bds"] Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.855316 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqqqn\" (UniqueName: \"kubernetes.io/projected/c1b40298-733b-476a-aeca-4c0025a86a98-kube-api-access-tqqqn\") pod \"dnsmasq-dns-bc4b48fc9-whnjz\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.877730 4947 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-cb666b895-n9nkr"] Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.879526 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.885640 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-n9nkr"] Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.927248 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-dns-svc\") pod \"dnsmasq-dns-cb666b895-n9nkr\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.927289 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl247\" (UniqueName: \"kubernetes.io/projected/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-kube-api-access-gl247\") pod \"dnsmasq-dns-cb666b895-n9nkr\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.927372 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-config\") pod \"dnsmasq-dns-cb666b895-n9nkr\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:08 crc kubenswrapper[4947]: I1203 07:07:08.942107 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.029814 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-config\") pod \"dnsmasq-dns-cb666b895-n9nkr\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.029887 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-dns-svc\") pod \"dnsmasq-dns-cb666b895-n9nkr\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.030083 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl247\" (UniqueName: \"kubernetes.io/projected/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-kube-api-access-gl247\") pod \"dnsmasq-dns-cb666b895-n9nkr\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.030774 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-dns-svc\") pod \"dnsmasq-dns-cb666b895-n9nkr\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.032010 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-config\") pod \"dnsmasq-dns-cb666b895-n9nkr\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 
07:07:09.049461 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl247\" (UniqueName: \"kubernetes.io/projected/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-kube-api-access-gl247\") pod \"dnsmasq-dns-cb666b895-n9nkr\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.210466 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.456476 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whnjz"] Dec 03 07:07:09 crc kubenswrapper[4947]: W1203 07:07:09.472441 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1b40298_733b_476a_aeca_4c0025a86a98.slice/crio-3687303ebec00058c85b17ff0fa7423648f4975db261aca78fcc262376571678 WatchSource:0}: Error finding container 3687303ebec00058c85b17ff0fa7423648f4975db261aca78fcc262376571678: Status 404 returned error can't find the container with id 3687303ebec00058c85b17ff0fa7423648f4975db261aca78fcc262376571678 Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.610662 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" event={"ID":"c1b40298-733b-476a-aeca-4c0025a86a98","Type":"ContainerStarted","Data":"3687303ebec00058c85b17ff0fa7423648f4975db261aca78fcc262376571678"} Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.676065 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-n9nkr"] Dec 03 07:07:09 crc kubenswrapper[4947]: W1203 07:07:09.686241 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8ddeca6_ac51_4af7_8b3c_6c851b1c415a.slice/crio-c42132d3c6359466d825ae44da3d0618ebe465c3b1cd3821d0def3662877400a WatchSource:0}: Error finding container c42132d3c6359466d825ae44da3d0618ebe465c3b1cd3821d0def3662877400a: Status 404 returned error can't find the container with id c42132d3c6359466d825ae44da3d0618ebe465c3b1cd3821d0def3662877400a Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.763337 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.764820 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.765979 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.768249 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.768433 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.768643 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.768752 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-kbjhm" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.768850 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.768962 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.769057 4947 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.957888 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.958192 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.958223 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.958257 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.958314 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpc2q\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-kube-api-access-lpc2q\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " 
pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.958335 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-pod-info\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.958357 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.958399 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.958415 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.958470 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-server-conf\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 
07:07:09.958709 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.997570 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:07:09 crc kubenswrapper[4947]: I1203 07:07:09.999364 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.005049 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.005071 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rn5xb" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.005385 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.005692 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.006599 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.006806 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.009062 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.037966 4947 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.063758 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc2q\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-kube-api-access-lpc2q\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.063791 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-pod-info\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.064426 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066115 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066244 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066345 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066429 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-server-conf\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066531 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066694 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066773 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066886 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-plugins-conf\") pod 
\"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066991 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.067692 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-server-conf\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.067974 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.068116 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.066694 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.069002 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.079320 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-pod-info\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.080189 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.081736 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.079996 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.084531 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpc2q\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-kube-api-access-lpc2q\") pod 
\"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.111140 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168138 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168186 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168223 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5367165f-75ec-4633-8042-edfe91e3be60-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168289 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 
07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168339 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dss2r\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-kube-api-access-dss2r\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168375 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168397 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168426 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168519 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5367165f-75ec-4633-8042-edfe91e3be60-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168610 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.168635 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278110 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278168 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278238 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: 
I1203 07:07:10.278261 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278298 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5367165f-75ec-4633-8042-edfe91e3be60-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278325 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278358 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dss2r\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-kube-api-access-dss2r\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278393 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278415 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278470 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.278646 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5367165f-75ec-4633-8042-edfe91e3be60-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.279353 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.282762 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5367165f-75ec-4633-8042-edfe91e3be60-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.283815 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.285213 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.285314 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.286015 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.286220 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.286952 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5367165f-75ec-4633-8042-edfe91e3be60-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.317556 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.318000 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dss2r\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-kube-api-access-dss2r\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.318946 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.322780 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.337415 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.387280 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.640595 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" event={"ID":"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a","Type":"ContainerStarted","Data":"c42132d3c6359466d825ae44da3d0618ebe465c3b1cd3821d0def3662877400a"} Dec 03 07:07:10 crc kubenswrapper[4947]: I1203 07:07:10.912373 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.005408 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.311629 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.312805 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.316243 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-2prbt" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.316414 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.316441 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.317795 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.322242 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.336710 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-galera-0"] Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.406828 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.406867 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs69t\" (UniqueName: \"kubernetes.io/projected/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kube-api-access-rs69t\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.406891 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.406932 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.406954 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " 
pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.406974 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.406991 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.407018 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.508380 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.508426 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs69t\" (UniqueName: \"kubernetes.io/projected/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kube-api-access-rs69t\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: 
I1203 07:07:11.508451 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.508514 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.508543 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.508564 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.508584 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.508619 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-operator-scripts\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.512638 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kolla-config\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.512654 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-generated\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.512809 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.513419 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-default\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.514827 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-operator-scripts\") pod \"openstack-galera-0\" (UID: 
\"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.518553 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.530201 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.532260 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.543330 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs69t\" (UniqueName: \"kubernetes.io/projected/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kube-api-access-rs69t\") pod \"openstack-galera-0\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " pod="openstack/openstack-galera-0" Dec 03 07:07:11 crc kubenswrapper[4947]: I1203 07:07:11.670094 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.878043 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.879391 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.881959 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.882621 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.882755 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.882777 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nf5v4" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.889804 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.943364 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.943458 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.943565 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.943601 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvkdq\" (UniqueName: \"kubernetes.io/projected/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kube-api-access-zvkdq\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.943627 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.943676 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.943706 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:12 crc kubenswrapper[4947]: I1203 07:07:12.943729 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.045474 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.045603 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.045644 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.045697 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvkdq\" (UniqueName: \"kubernetes.io/projected/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kube-api-access-zvkdq\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.045770 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.045837 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.045873 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.045921 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.046212 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.046594 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.047559 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.047701 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.047750 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.052428 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.063743 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " 
pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.073092 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvkdq\" (UniqueName: \"kubernetes.io/projected/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kube-api-access-zvkdq\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.076725 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.196126 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.197096 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.198877 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.199038 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-28kn7" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.200183 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.208195 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.217512 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.264056 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-memcached-tls-certs\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.264755 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-config-data\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.264812 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-combined-ca-bundle\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.264958 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-kolla-config\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.265114 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwk2t\" (UniqueName: 
\"kubernetes.io/projected/add50932-e8ea-4e7a-ab75-6fb1e0463499-kube-api-access-nwk2t\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.366408 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwk2t\" (UniqueName: \"kubernetes.io/projected/add50932-e8ea-4e7a-ab75-6fb1e0463499-kube-api-access-nwk2t\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.366538 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-memcached-tls-certs\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.366562 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-config-data\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.366584 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-combined-ca-bundle\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.366609 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-kolla-config\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc 
kubenswrapper[4947]: I1203 07:07:13.367633 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-kolla-config\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.368183 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-config-data\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.387441 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-memcached-tls-certs\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.389977 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwk2t\" (UniqueName: \"kubernetes.io/projected/add50932-e8ea-4e7a-ab75-6fb1e0463499-kube-api-access-nwk2t\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.428923 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-combined-ca-bundle\") pod \"memcached-0\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " pod="openstack/memcached-0" Dec 03 07:07:13 crc kubenswrapper[4947]: I1203 07:07:13.605445 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 07:07:14 crc kubenswrapper[4947]: I1203 07:07:14.824406 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:07:14 crc kubenswrapper[4947]: I1203 07:07:14.828339 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:07:14 crc kubenswrapper[4947]: I1203 07:07:14.834732 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-jt4g2" Dec 03 07:07:14 crc kubenswrapper[4947]: I1203 07:07:14.835844 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:07:14 crc kubenswrapper[4947]: I1203 07:07:14.901010 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4znd\" (UniqueName: \"kubernetes.io/projected/76881366-670a-494f-ba95-7c5187ba80e8-kube-api-access-d4znd\") pod \"kube-state-metrics-0\" (UID: \"76881366-670a-494f-ba95-7c5187ba80e8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:07:15 crc kubenswrapper[4947]: I1203 07:07:15.002153 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4znd\" (UniqueName: \"kubernetes.io/projected/76881366-670a-494f-ba95-7c5187ba80e8-kube-api-access-d4znd\") pod \"kube-state-metrics-0\" (UID: \"76881366-670a-494f-ba95-7c5187ba80e8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:07:15 crc kubenswrapper[4947]: I1203 07:07:15.021605 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4znd\" (UniqueName: \"kubernetes.io/projected/76881366-670a-494f-ba95-7c5187ba80e8-kube-api-access-d4znd\") pod \"kube-state-metrics-0\" (UID: \"76881366-670a-494f-ba95-7c5187ba80e8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:07:15 crc kubenswrapper[4947]: I1203 07:07:15.148324 4947 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:07:16 crc kubenswrapper[4947]: I1203 07:07:16.750273 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5367165f-75ec-4633-8042-edfe91e3be60","Type":"ContainerStarted","Data":"b18b2126e63d321b34ad6f06ae5e5d1eba12bd9f355cc32ce87928b9eee7b969"} Dec 03 07:07:16 crc kubenswrapper[4947]: I1203 07:07:16.753674 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51d7ef1d-a0bf-465f-baad-1bc3a71618ff","Type":"ContainerStarted","Data":"f7b505747bd91cb699b7598e9ae32eebcc8cebfc9aebbb06ea83a68fdb33110c"} Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:18.981612 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-84gbj"] Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:18.986286 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.023230 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.023311 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-bmx8l" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.028557 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.049567 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84gbj"] Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.062710 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-log-ovn\") pod \"ovn-controller-84gbj\" (UID: 
\"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.062765 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.062807 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run-ovn\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.062849 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98211c56-fd23-46c2-9710-31fc562e2182-scripts\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.062872 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-combined-ca-bundle\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.062918 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-ovn-controller-tls-certs\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " 
pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.062942 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7hjn\" (UniqueName: \"kubernetes.io/projected/98211c56-fd23-46c2-9710-31fc562e2182-kube-api-access-w7hjn\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.065173 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-lb94d"] Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.066868 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.081617 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lb94d"] Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163402 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163463 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-log\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163501 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run-ovn\") pod \"ovn-controller-84gbj\" (UID: 
\"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163540 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98211c56-fd23-46c2-9710-31fc562e2182-scripts\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163591 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-lib\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163607 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-run\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163624 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-combined-ca-bundle\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163640 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-etc-ovs\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc 
kubenswrapper[4947]: I1203 07:07:19.163675 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc7vg\" (UniqueName: \"kubernetes.io/projected/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-kube-api-access-pc7vg\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163698 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-scripts\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163748 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-ovn-controller-tls-certs\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163769 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7hjn\" (UniqueName: \"kubernetes.io/projected/98211c56-fd23-46c2-9710-31fc562e2182-kube-api-access-w7hjn\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.163786 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-log-ovn\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.164263 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-log-ovn\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.166815 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.166908 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run-ovn\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.265257 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-log\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.265364 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-run\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.265389 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-lib\") pod 
\"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.265422 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-etc-ovs\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.265451 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc7vg\" (UniqueName: \"kubernetes.io/projected/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-kube-api-access-pc7vg\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.265485 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-scripts\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.265659 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-run\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.265674 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-lib\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc 
kubenswrapper[4947]: I1203 07:07:19.265770 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-etc-ovs\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.265905 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-log\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.267845 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-scripts\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.293418 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc7vg\" (UniqueName: \"kubernetes.io/projected/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-kube-api-access-pc7vg\") pod \"ovn-controller-ovs-lb94d\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.326668 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-ovn-controller-tls-certs\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.326842 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-combined-ca-bundle\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.328537 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98211c56-fd23-46c2-9710-31fc562e2182-scripts\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.329154 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7hjn\" (UniqueName: \"kubernetes.io/projected/98211c56-fd23-46c2-9710-31fc562e2182-kube-api-access-w7hjn\") pod \"ovn-controller-84gbj\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.346639 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84gbj" Dec 03 07:07:19 crc kubenswrapper[4947]: I1203 07:07:19.396820 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.331689 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.334668 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.340071 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.341444 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.341534 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-ks6xz" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.341447 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.341735 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.341851 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.536328 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.536377 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.536448 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.536521 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c475475-916c-4267-8064-f932c04d0df2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.536552 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-config\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.536583 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.536604 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ww8p\" (UniqueName: \"kubernetes.io/projected/6c475475-916c-4267-8064-f932c04d0df2-kube-api-access-6ww8p\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.536648 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.638375 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.638422 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ww8p\" (UniqueName: \"kubernetes.io/projected/6c475475-916c-4267-8064-f932c04d0df2-kube-api-access-6ww8p\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.638528 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.638642 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.638672 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " 
pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.638846 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.639059 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c475475-916c-4267-8064-f932c04d0df2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.639122 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-config\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.639529 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.640886 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c475475-916c-4267-8064-f932c04d0df2-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.641421 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-config\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.641582 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.655591 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.655621 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.658346 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.659590 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ww8p\" (UniqueName: \"kubernetes.io/projected/6c475475-916c-4267-8064-f932c04d0df2-kube-api-access-6ww8p\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " 
pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.667067 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.668308 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.676569 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.922215 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.928225 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.931778 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-fmfqj" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.932083 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.932245 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.932561 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 07:07:21 crc kubenswrapper[4947]: I1203 07:07:21.942213 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.050048 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rrj\" (UniqueName: \"kubernetes.io/projected/90b682d1-68e6-49a4-83a6-51b1b40b7e99-kube-api-access-54rrj\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.050134 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.050177 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.050259 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.050291 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-config\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.050322 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.050367 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.050393 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.151526 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.151771 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.151826 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.151846 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-config\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.151864 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.151894 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.151908 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.151957 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54rrj\" (UniqueName: \"kubernetes.io/projected/90b682d1-68e6-49a4-83a6-51b1b40b7e99-kube-api-access-54rrj\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.152418 4947 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.153695 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.154009 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-config\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.154332 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.164140 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.164142 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.164281 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.167132 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54rrj\" (UniqueName: \"kubernetes.io/projected/90b682d1-68e6-49a4-83a6-51b1b40b7e99-kube-api-access-54rrj\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.169846 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:22 crc kubenswrapper[4947]: I1203 07:07:22.250616 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:25 crc kubenswrapper[4947]: W1203 07:07:25.253337 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadd50932_e8ea_4e7a_ab75_6fb1e0463499.slice/crio-74fb78f5022052166a30c09f0fbdf7c96543565acb2a6e491193815ec2a0dd60 WatchSource:0}: Error finding container 74fb78f5022052166a30c09f0fbdf7c96543565acb2a6e491193815ec2a0dd60: Status 404 returned error can't find the container with id 74fb78f5022052166a30c09f0fbdf7c96543565acb2a6e491193815ec2a0dd60 Dec 03 07:07:25 crc kubenswrapper[4947]: I1203 07:07:25.820010 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"add50932-e8ea-4e7a-ab75-6fb1e0463499","Type":"ContainerStarted","Data":"74fb78f5022052166a30c09f0fbdf7c96543565acb2a6e491193815ec2a0dd60"} Dec 03 07:07:26 crc kubenswrapper[4947]: I1203 07:07:26.463920 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 07:07:26 crc kubenswrapper[4947]: I1203 07:07:26.469711 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:07:26 crc kubenswrapper[4947]: I1203 07:07:26.569867 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84gbj"] Dec 03 07:07:26 crc kubenswrapper[4947]: I1203 07:07:26.579390 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 07:07:26 crc kubenswrapper[4947]: I1203 07:07:26.719124 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-lb94d"] Dec 03 07:07:26 crc kubenswrapper[4947]: I1203 07:07:26.806801 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 07:07:27 crc kubenswrapper[4947]: W1203 07:07:27.216764 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b86a5d2_1933_4a2f_97de_f3b49985fbf8.slice/crio-0c1cc19621bbdad40595aed4161c6d5e21175857117992d4456c69b529bf91c9 WatchSource:0}: Error finding container 0c1cc19621bbdad40595aed4161c6d5e21175857117992d4456c69b529bf91c9: Status 404 returned error can't find the container with id 0c1cc19621bbdad40595aed4161c6d5e21175857117992d4456c69b529bf91c9 Dec 03 07:07:27 crc kubenswrapper[4947]: W1203 07:07:27.227522 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c475475_916c_4267_8064_f932c04d0df2.slice/crio-1a2c780252bf9565c240dbb58e23c3b76dd581500932bae059cebded2f499d39 WatchSource:0}: Error finding container 1a2c780252bf9565c240dbb58e23c3b76dd581500932bae059cebded2f499d39: Status 404 returned error can't find the container with id 1a2c780252bf9565c240dbb58e23c3b76dd581500932bae059cebded2f499d39 Dec 03 07:07:27 crc kubenswrapper[4947]: W1203 07:07:27.229350 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76881366_670a_494f_ba95_7c5187ba80e8.slice/crio-19319df9642e415de8d25c9c6f6fb4e4f404292a76344615b35465139baacb3a WatchSource:0}: Error finding container 19319df9642e415de8d25c9c6f6fb4e4f404292a76344615b35465139baacb3a: Status 404 returned error can't find the container with id 19319df9642e415de8d25c9c6f6fb4e4f404292a76344615b35465139baacb3a Dec 03 07:07:27 crc kubenswrapper[4947]: W1203 07:07:27.231744 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98211c56_fd23_46c2_9710_31fc562e2182.slice/crio-5ae027dcf3be49781729b775d310cf2e66b9997bd637936180522f9cc453e25a WatchSource:0}: Error finding container 5ae027dcf3be49781729b775d310cf2e66b9997bd637936180522f9cc453e25a: Status 404 returned error can't find the container with id 
5ae027dcf3be49781729b775d310cf2e66b9997bd637936180522f9cc453e25a Dec 03 07:07:27 crc kubenswrapper[4947]: I1203 07:07:27.701874 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 07:07:27 crc kubenswrapper[4947]: I1203 07:07:27.839476 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6c475475-916c-4267-8064-f932c04d0df2","Type":"ContainerStarted","Data":"1a2c780252bf9565c240dbb58e23c3b76dd581500932bae059cebded2f499d39"} Dec 03 07:07:27 crc kubenswrapper[4947]: I1203 07:07:27.840752 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8","Type":"ContainerStarted","Data":"830c08616b73bd8b530b0cc2fc5c5768679ded583b14b43ef2e68f803802520a"} Dec 03 07:07:27 crc kubenswrapper[4947]: I1203 07:07:27.842000 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76881366-670a-494f-ba95-7c5187ba80e8","Type":"ContainerStarted","Data":"19319df9642e415de8d25c9c6f6fb4e4f404292a76344615b35465139baacb3a"} Dec 03 07:07:27 crc kubenswrapper[4947]: I1203 07:07:27.844020 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lb94d" event={"ID":"7b86a5d2-1933-4a2f-97de-f3b49985fbf8","Type":"ContainerStarted","Data":"0c1cc19621bbdad40595aed4161c6d5e21175857117992d4456c69b529bf91c9"} Dec 03 07:07:27 crc kubenswrapper[4947]: I1203 07:07:27.845052 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj" event={"ID":"98211c56-fd23-46c2-9710-31fc562e2182","Type":"ContainerStarted","Data":"5ae027dcf3be49781729b775d310cf2e66b9997bd637936180522f9cc453e25a"} Dec 03 07:07:27 crc kubenswrapper[4947]: W1203 07:07:27.877481 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbc40b18_511b_4bd7_bb2c_3dc868c6dcec.slice/crio-3e74b3ace2547285a3aaed7a716aed4a64a2cb2d00748ce51351b96e4c728b84 WatchSource:0}: Error finding container 3e74b3ace2547285a3aaed7a716aed4a64a2cb2d00748ce51351b96e4c728b84: Status 404 returned error can't find the container with id 3e74b3ace2547285a3aaed7a716aed4a64a2cb2d00748ce51351b96e4c728b84 Dec 03 07:07:27 crc kubenswrapper[4947]: W1203 07:07:27.899271 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90b682d1_68e6_49a4_83a6_51b1b40b7e99.slice/crio-579cbb8fdc4c6b9e04a1e0d47193bfce34fc2131c0319a5c4dd0313964672ec8 WatchSource:0}: Error finding container 579cbb8fdc4c6b9e04a1e0d47193bfce34fc2131c0319a5c4dd0313964672ec8: Status 404 returned error can't find the container with id 579cbb8fdc4c6b9e04a1e0d47193bfce34fc2131c0319a5c4dd0313964672ec8 Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.857250 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"90b682d1-68e6-49a4-83a6-51b1b40b7e99","Type":"ContainerStarted","Data":"579cbb8fdc4c6b9e04a1e0d47193bfce34fc2131c0319a5c4dd0313964672ec8"} Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.859083 4947 generic.go:334] "Generic (PLEG): container finished" podID="a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" containerID="9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7" exitCode=0 Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.859138 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" event={"ID":"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a","Type":"ContainerDied","Data":"9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7"} Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.861695 4947 generic.go:334] "Generic (PLEG): container finished" podID="c1b40298-733b-476a-aeca-4c0025a86a98" 
containerID="39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c" exitCode=0 Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.861783 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" event={"ID":"c1b40298-733b-476a-aeca-4c0025a86a98","Type":"ContainerDied","Data":"39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c"} Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.864125 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"add50932-e8ea-4e7a-ab75-6fb1e0463499","Type":"ContainerStarted","Data":"263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70"} Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.864251 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.865260 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec","Type":"ContainerStarted","Data":"3e74b3ace2547285a3aaed7a716aed4a64a2cb2d00748ce51351b96e4c728b84"} Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.866978 4947 generic.go:334] "Generic (PLEG): container finished" podID="3b5ca077-d5b1-4d8f-ad52-1d9058a764ce" containerID="87526f34acea0e2f1eb1a7ffe8ce55922b984b35d78248e3ac5885e564f36592" exitCode=0 Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.867016 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-g4bds" event={"ID":"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce","Type":"ContainerDied","Data":"87526f34acea0e2f1eb1a7ffe8ce55922b984b35d78248e3ac5885e564f36592"} Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.868577 4947 generic.go:334] "Generic (PLEG): container finished" podID="4ce4dd53-a622-469e-b1b8-f042fb62cdb8" containerID="60bda54479d41c97db192f899070213824cff9631674beed5629d3e36ad2f365" exitCode=0 Dec 03 
07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.868599 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" event={"ID":"4ce4dd53-a622-469e-b1b8-f042fb62cdb8","Type":"ContainerDied","Data":"60bda54479d41c97db192f899070213824cff9631674beed5629d3e36ad2f365"} Dec 03 07:07:28 crc kubenswrapper[4947]: I1203 07:07:28.930858 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.197821553 podStartE2EDuration="15.930841987s" podCreationTimestamp="2025-12-03 07:07:13 +0000 UTC" firstStartedPulling="2025-12-03 07:07:25.255579899 +0000 UTC m=+1106.516534315" lastFinishedPulling="2025-12-03 07:07:27.988600333 +0000 UTC m=+1109.249554749" observedRunningTime="2025-12-03 07:07:28.924118215 +0000 UTC m=+1110.185072651" watchObservedRunningTime="2025-12-03 07:07:28.930841987 +0000 UTC m=+1110.191796413" Dec 03 07:07:29 crc kubenswrapper[4947]: I1203 07:07:29.877044 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5367165f-75ec-4633-8042-edfe91e3be60","Type":"ContainerStarted","Data":"3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0"} Dec 03 07:07:29 crc kubenswrapper[4947]: I1203 07:07:29.878695 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51d7ef1d-a0bf-465f-baad-1bc3a71618ff","Type":"ContainerStarted","Data":"f8e964bb562c5817c758d9180e2467c30d1c58be4f6c8c81c568b4e200107ce9"} Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.086551 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.087100 4947 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.087142 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.087629 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"766a670181f7035ab6014723203ba9e8cefa098d94e396af08b21929329a9713"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.087697 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://766a670181f7035ab6014723203ba9e8cefa098d94e396af08b21929329a9713" gracePeriod=600 Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.590324 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.685813 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.706449 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk5wg\" (UniqueName: \"kubernetes.io/projected/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-kube-api-access-pk5wg\") pod \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\" (UID: \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\") " Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.706579 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-config\") pod \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\" (UID: \"4ce4dd53-a622-469e-b1b8-f042fb62cdb8\") " Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.713796 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-kube-api-access-pk5wg" (OuterVolumeSpecName: "kube-api-access-pk5wg") pod "4ce4dd53-a622-469e-b1b8-f042fb62cdb8" (UID: "4ce4dd53-a622-469e-b1b8-f042fb62cdb8"). InnerVolumeSpecName "kube-api-access-pk5wg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.727442 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-config" (OuterVolumeSpecName: "config") pod "4ce4dd53-a622-469e-b1b8-f042fb62cdb8" (UID: "4ce4dd53-a622-469e-b1b8-f042fb62cdb8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.777658 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6cg4w"] Dec 03 07:07:30 crc kubenswrapper[4947]: E1203 07:07:30.778351 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5ca077-d5b1-4d8f-ad52-1d9058a764ce" containerName="init" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.778372 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5ca077-d5b1-4d8f-ad52-1d9058a764ce" containerName="init" Dec 03 07:07:30 crc kubenswrapper[4947]: E1203 07:07:30.778403 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ce4dd53-a622-469e-b1b8-f042fb62cdb8" containerName="init" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.778412 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce4dd53-a622-469e-b1b8-f042fb62cdb8" containerName="init" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.778601 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce4dd53-a622-469e-b1b8-f042fb62cdb8" containerName="init" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.778622 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5ca077-d5b1-4d8f-ad52-1d9058a764ce" containerName="init" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.779268 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.782106 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.789304 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6cg4w"] Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.807406 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-config\") pod \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.807583 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-dns-svc\") pod \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.807706 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckwnk\" (UniqueName: \"kubernetes.io/projected/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-kube-api-access-ckwnk\") pod \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\" (UID: \"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce\") " Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.808073 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk5wg\" (UniqueName: \"kubernetes.io/projected/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-kube-api-access-pk5wg\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.808097 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ce4dd53-a622-469e-b1b8-f042fb62cdb8-config\") on node \"crc\" DevicePath \"\"" Dec 03 
07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.815781 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-kube-api-access-ckwnk" (OuterVolumeSpecName: "kube-api-access-ckwnk") pod "3b5ca077-d5b1-4d8f-ad52-1d9058a764ce" (UID: "3b5ca077-d5b1-4d8f-ad52-1d9058a764ce"). InnerVolumeSpecName "kube-api-access-ckwnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.834007 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b5ca077-d5b1-4d8f-ad52-1d9058a764ce" (UID: "3b5ca077-d5b1-4d8f-ad52-1d9058a764ce"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.837069 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-config" (OuterVolumeSpecName: "config") pod "3b5ca077-d5b1-4d8f-ad52-1d9058a764ce" (UID: "3b5ca077-d5b1-4d8f-ad52-1d9058a764ce"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.878813 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whnjz"] Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.901721 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="766a670181f7035ab6014723203ba9e8cefa098d94e396af08b21929329a9713" exitCode=0 Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.901811 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"766a670181f7035ab6014723203ba9e8cefa098d94e396af08b21929329a9713"} Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.901849 4947 scope.go:117] "RemoveContainer" containerID="b942c1f8be6b00a1ec761600a3a1ec1dc91971dc8d9f088d0a1649ce3f26d690" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.909379 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nts44\" (UniqueName: \"kubernetes.io/projected/3496bac7-6b31-4ba8-a490-14bff1522b8c-kube-api-access-nts44\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.909423 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovn-rundir\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.909479 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3496bac7-6b31-4ba8-a490-14bff1522b8c-config\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.909671 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovs-rundir\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.909795 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.909832 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-combined-ca-bundle\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.909950 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.909966 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 
07:07:30.909977 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckwnk\" (UniqueName: \"kubernetes.io/projected/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce-kube-api-access-ckwnk\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.918054 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567c455747-g4bds" event={"ID":"3b5ca077-d5b1-4d8f-ad52-1d9058a764ce","Type":"ContainerDied","Data":"fe03120c1fbcbdbba698c720a903e48f463b2c02dd24ee49e6173129e3278dd2"} Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.918151 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567c455747-g4bds" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.925795 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-xcf5x"] Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.937661 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" event={"ID":"4ce4dd53-a622-469e-b1b8-f042fb62cdb8","Type":"ContainerDied","Data":"734be8a14e6c85c25357a240a29d5d499803062e26fd2c1430a5fb4c60db2459"} Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.937736 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cd484bb89-hkrwn" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.937862 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.942093 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 07:07:30 crc kubenswrapper[4947]: I1203 07:07:30.957107 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-xcf5x"] Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.011457 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-config\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.011615 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.011651 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-combined-ca-bundle\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.011679 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-ovsdbserver-nb\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 
03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.011705 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nts44\" (UniqueName: \"kubernetes.io/projected/3496bac7-6b31-4ba8-a490-14bff1522b8c-kube-api-access-nts44\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.011739 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-dns-svc\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.011757 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovn-rundir\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.011810 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3496bac7-6b31-4ba8-a490-14bff1522b8c-config\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.012014 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovs-rundir\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.012095 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7jkz\" (UniqueName: \"kubernetes.io/projected/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-kube-api-access-w7jkz\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.012413 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3496bac7-6b31-4ba8-a490-14bff1522b8c-config\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.012865 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovs-rundir\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.012872 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovn-rundir\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.021186 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-combined-ca-bundle\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.021888 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.058769 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nts44\" (UniqueName: \"kubernetes.io/projected/3496bac7-6b31-4ba8-a490-14bff1522b8c-kube-api-access-nts44\") pod \"ovn-controller-metrics-6cg4w\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.112852 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7jkz\" (UniqueName: \"kubernetes.io/projected/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-kube-api-access-w7jkz\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.112902 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-config\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.112936 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-ovsdbserver-nb\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.112982 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-dns-svc\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.113815 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-dns-svc\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.113920 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.114746 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-config\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.120229 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-ovsdbserver-nb\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.141365 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7jkz\" (UniqueName: \"kubernetes.io/projected/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-kube-api-access-w7jkz\") pod \"dnsmasq-dns-57db9b5bc9-xcf5x\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.153659 4947 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-n9nkr"] Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.186207 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567c455747-g4bds"] Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.216387 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.220011 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567c455747-g4bds"] Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.239024 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-8rn2t"] Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.240379 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.263082 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.265555 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-8rn2t"] Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.304959 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-hkrwn"] Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.317311 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cd484bb89-hkrwn"] Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.317998 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-dns-svc\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc 
kubenswrapper[4947]: I1203 07:07:31.318070 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-config\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.318104 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-sb\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.318181 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-nb\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.318210 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xgmq\" (UniqueName: \"kubernetes.io/projected/cf491c7f-b173-445e-b5d5-e4fbd838ab64-kube-api-access-6xgmq\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.419296 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-nb\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 
07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.419715 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xgmq\" (UniqueName: \"kubernetes.io/projected/cf491c7f-b173-445e-b5d5-e4fbd838ab64-kube-api-access-6xgmq\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.420616 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-dns-svc\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.420668 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-config\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.420711 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-sb\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.421247 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-sb\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.420536 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-nb\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.421795 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-dns-svc\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.422271 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-config\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.435241 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xgmq\" (UniqueName: \"kubernetes.io/projected/cf491c7f-b173-445e-b5d5-e4fbd838ab64-kube-api-access-6xgmq\") pod \"dnsmasq-dns-db7757ddc-8rn2t\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:31 crc kubenswrapper[4947]: I1203 07:07:31.581220 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:33 crc kubenswrapper[4947]: I1203 07:07:33.092561 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5ca077-d5b1-4d8f-ad52-1d9058a764ce" path="/var/lib/kubelet/pods/3b5ca077-d5b1-4d8f-ad52-1d9058a764ce/volumes" Dec 03 07:07:33 crc kubenswrapper[4947]: I1203 07:07:33.093359 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce4dd53-a622-469e-b1b8-f042fb62cdb8" path="/var/lib/kubelet/pods/4ce4dd53-a622-469e-b1b8-f042fb62cdb8/volumes" Dec 03 07:07:33 crc kubenswrapper[4947]: I1203 07:07:33.607527 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.223634 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-xcf5x"] Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.230554 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-74bxs"] Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.232118 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.244558 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-74bxs"] Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.282818 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2k9v\" (UniqueName: \"kubernetes.io/projected/2d17d252-d33f-4846-a443-2d99e5d3464c-kube-api-access-h2k9v\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.282864 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.282901 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-config\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.283129 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.283213 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.384777 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2k9v\" (UniqueName: \"kubernetes.io/projected/2d17d252-d33f-4846-a443-2d99e5d3464c-kube-api-access-h2k9v\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.384870 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.384937 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-config\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.384963 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.385092 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.386368 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.386549 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.387052 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-config\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.387536 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-dns-svc\") pod \"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.404813 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2k9v\" (UniqueName: \"kubernetes.io/projected/2d17d252-d33f-4846-a443-2d99e5d3464c-kube-api-access-h2k9v\") pod 
\"dnsmasq-dns-59d5fbdd8c-74bxs\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:35 crc kubenswrapper[4947]: I1203 07:07:35.558978 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.161955 4947 scope.go:117] "RemoveContainer" containerID="87526f34acea0e2f1eb1a7ffe8ce55922b984b35d78248e3ac5885e564f36592" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.287980 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.309854 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.312699 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.312903 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.313048 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.314094 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-sbzpd" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.323445 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.400746 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 
crc kubenswrapper[4947]: I1203 07:07:36.400860 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.400911 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-cache\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.400982 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-lock\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.401005 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5kk8\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-kube-api-access-b5kk8\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.430285 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-xcf5x"] Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.502450 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-lock\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 
07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.502845 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5kk8\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-kube-api-access-b5kk8\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.502875 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.502938 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.502973 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-cache\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.503050 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-lock\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: E1203 07:07:36.503121 4947 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 07:07:36 crc kubenswrapper[4947]: E1203 07:07:36.503141 4947 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 07:07:36 crc kubenswrapper[4947]: E1203 07:07:36.503198 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift podName:5dc8a280-5a18-41fd-8e61-f51afa973d20 nodeName:}" failed. No retries permitted until 2025-12-03 07:07:37.003176952 +0000 UTC m=+1118.264131448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift") pod "swift-storage-0" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20") : configmap "swift-ring-files" not found Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.503331 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.503543 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-cache\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.524362 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5kk8\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-kube-api-access-b5kk8\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.530769 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.694436 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-8rn2t"] Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.744621 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6cg4w"] Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.781104 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jp2pf"] Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.782076 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.784717 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.785267 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.785313 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.793372 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jp2pf"] Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.806276 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-swiftconf\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.806333 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-ring-data-devices\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.806481 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-dispersionconf\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.806570 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-scripts\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.806808 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58224663-92bc-4143-ad66-3ce51e606d86-etc-swift\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.806845 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l8xh\" (UniqueName: \"kubernetes.io/projected/58224663-92bc-4143-ad66-3ce51e606d86-kube-api-access-5l8xh\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 
07:07:36.806948 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-combined-ca-bundle\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.829942 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-74bxs"] Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.908731 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58224663-92bc-4143-ad66-3ce51e606d86-etc-swift\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.908770 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l8xh\" (UniqueName: \"kubernetes.io/projected/58224663-92bc-4143-ad66-3ce51e606d86-kube-api-access-5l8xh\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.908805 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-combined-ca-bundle\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.908838 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-swiftconf\") pod \"swift-ring-rebalance-jp2pf\" (UID: 
\"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.908883 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-ring-data-devices\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.908923 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-dispersionconf\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.908940 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-scripts\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.909684 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-scripts\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.909915 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58224663-92bc-4143-ad66-3ce51e606d86-etc-swift\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc 
kubenswrapper[4947]: I1203 07:07:36.911229 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-ring-data-devices\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.914539 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-swiftconf\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.921751 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-dispersionconf\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.921826 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-combined-ca-bundle\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:36 crc kubenswrapper[4947]: I1203 07:07:36.930085 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l8xh\" (UniqueName: \"kubernetes.io/projected/58224663-92bc-4143-ad66-3ce51e606d86-kube-api-access-5l8xh\") pod \"swift-ring-rebalance-jp2pf\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:37 crc kubenswrapper[4947]: I1203 07:07:37.000091 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"c728d116f3c53bcc6037be4215b8c5da0c570fa9f0fddd2b5bb621a8286fe726"} Dec 03 07:07:37 crc kubenswrapper[4947]: I1203 07:07:37.010476 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:37 crc kubenswrapper[4947]: E1203 07:07:37.010627 4947 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 07:07:37 crc kubenswrapper[4947]: E1203 07:07:37.010644 4947 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 07:07:37 crc kubenswrapper[4947]: E1203 07:07:37.010680 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift podName:5dc8a280-5a18-41fd-8e61-f51afa973d20 nodeName:}" failed. No retries permitted until 2025-12-03 07:07:38.010669013 +0000 UTC m=+1119.271623439 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift") pod "swift-storage-0" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20") : configmap "swift-ring-files" not found Dec 03 07:07:37 crc kubenswrapper[4947]: I1203 07:07:37.053194 4947 scope.go:117] "RemoveContainer" containerID="60bda54479d41c97db192f899070213824cff9631674beed5629d3e36ad2f365" Dec 03 07:07:37 crc kubenswrapper[4947]: I1203 07:07:37.099146 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:37 crc kubenswrapper[4947]: I1203 07:07:37.615642 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jp2pf"] Dec 03 07:07:37 crc kubenswrapper[4947]: W1203 07:07:37.629015 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58224663_92bc_4143_ad66_3ce51e606d86.slice/crio-903c8b4cdda0e73f00503bff8ad3b6fbd9447faf2317bc01c93206f4e36af074 WatchSource:0}: Error finding container 903c8b4cdda0e73f00503bff8ad3b6fbd9447faf2317bc01c93206f4e36af074: Status 404 returned error can't find the container with id 903c8b4cdda0e73f00503bff8ad3b6fbd9447faf2317bc01c93206f4e36af074 Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.014400 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec","Type":"ContainerStarted","Data":"4a757f605dd0032d26474e5a8df58f33e3b89eab2725fe96b4bea59fa90b65d0"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.018085 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6cg4w" event={"ID":"3496bac7-6b31-4ba8-a490-14bff1522b8c","Type":"ContainerStarted","Data":"315d6d274559f1e5b1ce6fbe348451f3e8c32240b91c6ac76e8b6663da1e470b"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.020276 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj" event={"ID":"98211c56-fd23-46c2-9710-31fc562e2182","Type":"ContainerStarted","Data":"8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.020791 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-84gbj" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.023212 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8","Type":"ContainerStarted","Data":"73f4afbf254709699e2004489bb7cfd91169baaf41f65098687b729bf3f60e60"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.025807 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" event={"ID":"c1b40298-733b-476a-aeca-4c0025a86a98","Type":"ContainerStarted","Data":"b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.025915 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" podUID="c1b40298-733b-476a-aeca-4c0025a86a98" containerName="dnsmasq-dns" containerID="cri-o://b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698" gracePeriod=10 Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.025946 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.032165 4947 generic.go:334] "Generic (PLEG): container finished" podID="62dc1af0-3dad-4341-b5ff-b5530fdaa78c" containerID="17b78379a993ac9eec0b7a1b04f5494cc08096178ffb496ccaeb686220daace2" exitCode=0 Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.032222 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" event={"ID":"62dc1af0-3dad-4341-b5ff-b5530fdaa78c","Type":"ContainerDied","Data":"17b78379a993ac9eec0b7a1b04f5494cc08096178ffb496ccaeb686220daace2"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.032247 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" event={"ID":"62dc1af0-3dad-4341-b5ff-b5530fdaa78c","Type":"ContainerStarted","Data":"2bba5406d3c53cf6fa4c2c20dd6af8c032f2e590b8ac3646aa3e39b379c98c6b"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.044081 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:38 crc kubenswrapper[4947]: E1203 07:07:38.045875 4947 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 07:07:38 crc kubenswrapper[4947]: E1203 07:07:38.045893 4947 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 07:07:38 crc kubenswrapper[4947]: E1203 07:07:38.045939 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift podName:5dc8a280-5a18-41fd-8e61-f51afa973d20 nodeName:}" failed. No retries permitted until 2025-12-03 07:07:40.045920815 +0000 UTC m=+1121.306875241 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift") pod "swift-storage-0" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20") : configmap "swift-ring-files" not found Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.052130 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lb94d" event={"ID":"7b86a5d2-1933-4a2f-97de-f3b49985fbf8","Type":"ContainerStarted","Data":"62aa129fc147cc1180b141d3b348f304094abc5ad47b8dddaf94bcc554ab8860"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.060389 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6c475475-916c-4267-8064-f932c04d0df2","Type":"ContainerStarted","Data":"759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.065986 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" podStartSLOduration=11.620938485 podStartE2EDuration="30.065967525s" podCreationTimestamp="2025-12-03 07:07:08 +0000 UTC" firstStartedPulling="2025-12-03 07:07:09.477158522 +0000 UTC m=+1090.738112948" lastFinishedPulling="2025-12-03 07:07:27.922187562 +0000 UTC m=+1109.183141988" observedRunningTime="2025-12-03 07:07:38.065877733 +0000 UTC m=+1119.326832289" watchObservedRunningTime="2025-12-03 07:07:38.065967525 +0000 UTC m=+1119.326921951" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.102526 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" event={"ID":"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a","Type":"ContainerStarted","Data":"708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.102642 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:38 
crc kubenswrapper[4947]: I1203 07:07:38.102639 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" podUID="a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" containerName="dnsmasq-dns" containerID="cri-o://708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d" gracePeriod=10 Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.109796 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-84gbj" podStartSLOduration=11.340219894 podStartE2EDuration="20.109778996s" podCreationTimestamp="2025-12-03 07:07:18 +0000 UTC" firstStartedPulling="2025-12-03 07:07:27.235296264 +0000 UTC m=+1108.496250700" lastFinishedPulling="2025-12-03 07:07:36.004855376 +0000 UTC m=+1117.265809802" observedRunningTime="2025-12-03 07:07:38.10876157 +0000 UTC m=+1119.369715996" watchObservedRunningTime="2025-12-03 07:07:38.109778996 +0000 UTC m=+1119.370733422" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.133486 4947 generic.go:334] "Generic (PLEG): container finished" podID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" containerID="6e6fcff7155477943a004007fa2e33bc97a6045325d08896ed2ee5494ee97d1d" exitCode=0 Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.133589 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" event={"ID":"cf491c7f-b173-445e-b5d5-e4fbd838ab64","Type":"ContainerDied","Data":"6e6fcff7155477943a004007fa2e33bc97a6045325d08896ed2ee5494ee97d1d"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.133621 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" event={"ID":"cf491c7f-b173-445e-b5d5-e4fbd838ab64","Type":"ContainerStarted","Data":"15d6a1d13f2534efedca36fb7339c3d5142b7d7a892353f9008c4ac09314cea6"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.170687 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-ring-rebalance-jp2pf" event={"ID":"58224663-92bc-4143-ad66-3ce51e606d86","Type":"ContainerStarted","Data":"903c8b4cdda0e73f00503bff8ad3b6fbd9447faf2317bc01c93206f4e36af074"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.262979 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"90b682d1-68e6-49a4-83a6-51b1b40b7e99","Type":"ContainerStarted","Data":"a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.287072 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" podStartSLOduration=12.053487046 podStartE2EDuration="30.287051646s" podCreationTimestamp="2025-12-03 07:07:08 +0000 UTC" firstStartedPulling="2025-12-03 07:07:09.688591321 +0000 UTC m=+1090.949545737" lastFinishedPulling="2025-12-03 07:07:27.922155911 +0000 UTC m=+1109.183110337" observedRunningTime="2025-12-03 07:07:38.272079312 +0000 UTC m=+1119.533033738" watchObservedRunningTime="2025-12-03 07:07:38.287051646 +0000 UTC m=+1119.548006072" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.291239 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76881366-670a-494f-ba95-7c5187ba80e8","Type":"ContainerStarted","Data":"93a8e7d2461c418769befcef894e43ae7999b371ae989342fcebf8d6081bc7bb"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.291548 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.301570 4947 generic.go:334] "Generic (PLEG): container finished" podID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerID="4cfb0364aa14bf4135568be9bf4895f749e4233b4b26e4b93e7d9deb40ddf25f" exitCode=0 Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.301666 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" event={"ID":"2d17d252-d33f-4846-a443-2d99e5d3464c","Type":"ContainerDied","Data":"4cfb0364aa14bf4135568be9bf4895f749e4233b4b26e4b93e7d9deb40ddf25f"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.301720 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" event={"ID":"2d17d252-d33f-4846-a443-2d99e5d3464c","Type":"ContainerStarted","Data":"a2350a3008d4e7f690a842d4e8da78d89f9124b7656b7b3f3bd5358bbe989323"} Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.337901 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.408949825 podStartE2EDuration="24.337878376s" podCreationTimestamp="2025-12-03 07:07:14 +0000 UTC" firstStartedPulling="2025-12-03 07:07:27.233805783 +0000 UTC m=+1108.494760209" lastFinishedPulling="2025-12-03 07:07:37.162734334 +0000 UTC m=+1118.423688760" observedRunningTime="2025-12-03 07:07:38.324400333 +0000 UTC m=+1119.585354769" watchObservedRunningTime="2025-12-03 07:07:38.337878376 +0000 UTC m=+1119.598832802" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.758775 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.787149 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7jkz\" (UniqueName: \"kubernetes.io/projected/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-kube-api-access-w7jkz\") pod \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.787256 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-dns-svc\") pod \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.787372 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-ovsdbserver-nb\") pod \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.787472 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-config\") pod \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\" (UID: \"62dc1af0-3dad-4341-b5ff-b5530fdaa78c\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.804527 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.812469 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-kube-api-access-w7jkz" (OuterVolumeSpecName: "kube-api-access-w7jkz") pod "62dc1af0-3dad-4341-b5ff-b5530fdaa78c" (UID: "62dc1af0-3dad-4341-b5ff-b5530fdaa78c"). InnerVolumeSpecName "kube-api-access-w7jkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.843793 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62dc1af0-3dad-4341-b5ff-b5530fdaa78c" (UID: "62dc1af0-3dad-4341-b5ff-b5530fdaa78c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.846134 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62dc1af0-3dad-4341-b5ff-b5530fdaa78c" (UID: "62dc1af0-3dad-4341-b5ff-b5530fdaa78c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.850583 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.851107 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-config" (OuterVolumeSpecName: "config") pod "62dc1af0-3dad-4341-b5ff-b5530fdaa78c" (UID: "62dc1af0-3dad-4341-b5ff-b5530fdaa78c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.889482 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-dns-svc\") pod \"c1b40298-733b-476a-aeca-4c0025a86a98\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.889552 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-config\") pod \"c1b40298-733b-476a-aeca-4c0025a86a98\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.889599 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl247\" (UniqueName: \"kubernetes.io/projected/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-kube-api-access-gl247\") pod \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.889631 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-config\") pod \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.889654 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-dns-svc\") pod \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\" (UID: \"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.889751 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqqqn\" (UniqueName: 
\"kubernetes.io/projected/c1b40298-733b-476a-aeca-4c0025a86a98-kube-api-access-tqqqn\") pod \"c1b40298-733b-476a-aeca-4c0025a86a98\" (UID: \"c1b40298-733b-476a-aeca-4c0025a86a98\") " Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.890204 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.890225 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7jkz\" (UniqueName: \"kubernetes.io/projected/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-kube-api-access-w7jkz\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.890239 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.890251 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62dc1af0-3dad-4341-b5ff-b5530fdaa78c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.895093 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1b40298-733b-476a-aeca-4c0025a86a98-kube-api-access-tqqqn" (OuterVolumeSpecName: "kube-api-access-tqqqn") pod "c1b40298-733b-476a-aeca-4c0025a86a98" (UID: "c1b40298-733b-476a-aeca-4c0025a86a98"). InnerVolumeSpecName "kube-api-access-tqqqn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.899919 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-kube-api-access-gl247" (OuterVolumeSpecName: "kube-api-access-gl247") pod "a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" (UID: "a8ddeca6-ac51-4af7-8b3c-6c851b1c415a"). InnerVolumeSpecName "kube-api-access-gl247". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.931134 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-config" (OuterVolumeSpecName: "config") pod "a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" (UID: "a8ddeca6-ac51-4af7-8b3c-6c851b1c415a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.938835 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c1b40298-733b-476a-aeca-4c0025a86a98" (UID: "c1b40298-733b-476a-aeca-4c0025a86a98"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.963360 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" (UID: "a8ddeca6-ac51-4af7-8b3c-6c851b1c415a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.967696 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-config" (OuterVolumeSpecName: "config") pod "c1b40298-733b-476a-aeca-4c0025a86a98" (UID: "c1b40298-733b-476a-aeca-4c0025a86a98"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.992092 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.992133 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl247\" (UniqueName: \"kubernetes.io/projected/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-kube-api-access-gl247\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.992144 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.992155 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.992164 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqqqn\" (UniqueName: \"kubernetes.io/projected/c1b40298-733b-476a-aeca-4c0025a86a98-kube-api-access-tqqqn\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:38 crc kubenswrapper[4947]: I1203 07:07:38.992173 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c1b40298-733b-476a-aeca-4c0025a86a98-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.312656 4947 generic.go:334] "Generic (PLEG): container finished" podID="a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" containerID="708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d" exitCode=0 Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.312709 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" event={"ID":"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a","Type":"ContainerDied","Data":"708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d"} Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.312761 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" event={"ID":"a8ddeca6-ac51-4af7-8b3c-6c851b1c415a","Type":"ContainerDied","Data":"c42132d3c6359466d825ae44da3d0618ebe465c3b1cd3821d0def3662877400a"} Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.312766 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb666b895-n9nkr" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.312779 4947 scope.go:117] "RemoveContainer" containerID="708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.316336 4947 generic.go:334] "Generic (PLEG): container finished" podID="c1b40298-733b-476a-aeca-4c0025a86a98" containerID="b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698" exitCode=0 Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.316436 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" event={"ID":"c1b40298-733b-476a-aeca-4c0025a86a98","Type":"ContainerDied","Data":"b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698"} Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.316792 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" event={"ID":"c1b40298-733b-476a-aeca-4c0025a86a98","Type":"ContainerDied","Data":"3687303ebec00058c85b17ff0fa7423648f4975db261aca78fcc262376571678"} Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.316655 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bc4b48fc9-whnjz" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.318399 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" event={"ID":"62dc1af0-3dad-4341-b5ff-b5530fdaa78c","Type":"ContainerDied","Data":"2bba5406d3c53cf6fa4c2c20dd6af8c032f2e590b8ac3646aa3e39b379c98c6b"} Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.318418 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57db9b5bc9-xcf5x" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.321845 4947 generic.go:334] "Generic (PLEG): container finished" podID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerID="62aa129fc147cc1180b141d3b348f304094abc5ad47b8dddaf94bcc554ab8860" exitCode=0 Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.321938 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lb94d" event={"ID":"7b86a5d2-1933-4a2f-97de-f3b49985fbf8","Type":"ContainerDied","Data":"62aa129fc147cc1180b141d3b348f304094abc5ad47b8dddaf94bcc554ab8860"} Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.336731 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-n9nkr"] Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.338139 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" event={"ID":"2d17d252-d33f-4846-a443-2d99e5d3464c","Type":"ContainerStarted","Data":"b3c68005731767ce6aebd52c56c18a6a1b8835a9b47f24276345bcc23bbcc5be"} Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.351563 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb666b895-n9nkr"] Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.354853 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" event={"ID":"cf491c7f-b173-445e-b5d5-e4fbd838ab64","Type":"ContainerStarted","Data":"f9c134c77017015ef8853a81ee9f495b3e2373e8224b36135f9326f3cb5df6a6"} Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.386147 4947 scope.go:117] "RemoveContainer" containerID="9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.390504 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-xcf5x"] Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.398806 4947 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57db9b5bc9-xcf5x"] Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.405521 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whnjz"] Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.418255 4947 scope.go:117] "RemoveContainer" containerID="708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d" Dec 03 07:07:39 crc kubenswrapper[4947]: E1203 07:07:39.418600 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d\": container with ID starting with 708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d not found: ID does not exist" containerID="708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.418628 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d"} err="failed to get container status \"708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d\": rpc error: code = NotFound desc = could not find container \"708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d\": container with ID starting with 708ea3f1d1f27acb11d4399b9375afce3d5834fb4f327235a60a9d483f16059d not found: ID does not exist" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.418648 4947 scope.go:117] "RemoveContainer" containerID="9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7" Dec 03 07:07:39 crc kubenswrapper[4947]: E1203 07:07:39.418831 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7\": container with ID starting with 
9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7 not found: ID does not exist" containerID="9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.418852 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7"} err="failed to get container status \"9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7\": rpc error: code = NotFound desc = could not find container \"9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7\": container with ID starting with 9825011b0c5d954e49ebaa8a361b886e2057fc03510d0f7691a5884d5f35ffb7 not found: ID does not exist" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.418866 4947 scope.go:117] "RemoveContainer" containerID="b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.421891 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bc4b48fc9-whnjz"] Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.496015 4947 scope.go:117] "RemoveContainer" containerID="39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.559705 4947 scope.go:117] "RemoveContainer" containerID="b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698" Dec 03 07:07:39 crc kubenswrapper[4947]: E1203 07:07:39.560228 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698\": container with ID starting with b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698 not found: ID does not exist" containerID="b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.560274 4947 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698"} err="failed to get container status \"b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698\": rpc error: code = NotFound desc = could not find container \"b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698\": container with ID starting with b3186f465fe5c5549d5888d12a4905e4502c3677c5337ad8fd71abb40cedd698 not found: ID does not exist" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.560303 4947 scope.go:117] "RemoveContainer" containerID="39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c" Dec 03 07:07:39 crc kubenswrapper[4947]: E1203 07:07:39.564232 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c\": container with ID starting with 39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c not found: ID does not exist" containerID="39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.564267 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c"} err="failed to get container status \"39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c\": rpc error: code = NotFound desc = could not find container \"39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c\": container with ID starting with 39035dd80a514fe785a182e5a96c87adfd007590c131150de146cd8daa9aa27c not found: ID does not exist" Dec 03 07:07:39 crc kubenswrapper[4947]: I1203 07:07:39.564287 4947 scope.go:117] "RemoveContainer" containerID="17b78379a993ac9eec0b7a1b04f5494cc08096178ffb496ccaeb686220daace2" Dec 03 07:07:40 crc kubenswrapper[4947]: I1203 
07:07:40.110868 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:40 crc kubenswrapper[4947]: E1203 07:07:40.111022 4947 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 07:07:40 crc kubenswrapper[4947]: E1203 07:07:40.111054 4947 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 07:07:40 crc kubenswrapper[4947]: E1203 07:07:40.111126 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift podName:5dc8a280-5a18-41fd-8e61-f51afa973d20 nodeName:}" failed. No retries permitted until 2025-12-03 07:07:44.111101984 +0000 UTC m=+1125.372056410 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift") pod "swift-storage-0" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20") : configmap "swift-ring-files" not found Dec 03 07:07:40 crc kubenswrapper[4947]: I1203 07:07:40.367484 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lb94d" event={"ID":"7b86a5d2-1933-4a2f-97de-f3b49985fbf8","Type":"ContainerStarted","Data":"f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39"} Dec 03 07:07:40 crc kubenswrapper[4947]: I1203 07:07:40.367664 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:40 crc kubenswrapper[4947]: I1203 07:07:40.367757 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:40 crc kubenswrapper[4947]: I1203 07:07:40.394673 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" podStartSLOduration=5.394639028 podStartE2EDuration="5.394639028s" podCreationTimestamp="2025-12-03 07:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:07:40.383473687 +0000 UTC m=+1121.644428123" watchObservedRunningTime="2025-12-03 07:07:40.394639028 +0000 UTC m=+1121.655593534" Dec 03 07:07:40 crc kubenswrapper[4947]: I1203 07:07:40.407460 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" podStartSLOduration=9.407435543 podStartE2EDuration="9.407435543s" podCreationTimestamp="2025-12-03 07:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:07:40.398663377 +0000 UTC m=+1121.659617833" 
watchObservedRunningTime="2025-12-03 07:07:40.407435543 +0000 UTC m=+1121.668390009" Dec 03 07:07:41 crc kubenswrapper[4947]: I1203 07:07:41.092722 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62dc1af0-3dad-4341-b5ff-b5530fdaa78c" path="/var/lib/kubelet/pods/62dc1af0-3dad-4341-b5ff-b5530fdaa78c/volumes" Dec 03 07:07:41 crc kubenswrapper[4947]: I1203 07:07:41.093755 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" path="/var/lib/kubelet/pods/a8ddeca6-ac51-4af7-8b3c-6c851b1c415a/volumes" Dec 03 07:07:41 crc kubenswrapper[4947]: I1203 07:07:41.094484 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b40298-733b-476a-aeca-4c0025a86a98" path="/var/lib/kubelet/pods/c1b40298-733b-476a-aeca-4c0025a86a98/volumes" Dec 03 07:07:44 crc kubenswrapper[4947]: I1203 07:07:44.190592 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:44 crc kubenswrapper[4947]: E1203 07:07:44.190856 4947 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 07:07:44 crc kubenswrapper[4947]: E1203 07:07:44.191259 4947 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 07:07:44 crc kubenswrapper[4947]: E1203 07:07:44.191342 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift podName:5dc8a280-5a18-41fd-8e61-f51afa973d20 nodeName:}" failed. No retries permitted until 2025-12-03 07:07:52.191294529 +0000 UTC m=+1133.452248965 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift") pod "swift-storage-0" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20") : configmap "swift-ring-files" not found Dec 03 07:07:44 crc kubenswrapper[4947]: I1203 07:07:44.400707 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lb94d" event={"ID":"7b86a5d2-1933-4a2f-97de-f3b49985fbf8","Type":"ContainerStarted","Data":"3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f"} Dec 03 07:07:44 crc kubenswrapper[4947]: I1203 07:07:44.401060 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:44 crc kubenswrapper[4947]: I1203 07:07:44.431140 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-lb94d" podStartSLOduration=17.782068981 podStartE2EDuration="26.431112395s" podCreationTimestamp="2025-12-03 07:07:18 +0000 UTC" firstStartedPulling="2025-12-03 07:07:27.219411575 +0000 UTC m=+1108.480366001" lastFinishedPulling="2025-12-03 07:07:35.868454989 +0000 UTC m=+1117.129409415" observedRunningTime="2025-12-03 07:07:44.426033038 +0000 UTC m=+1125.686987494" watchObservedRunningTime="2025-12-03 07:07:44.431112395 +0000 UTC m=+1125.692066861" Dec 03 07:07:45 crc kubenswrapper[4947]: I1203 07:07:45.153382 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 07:07:45 crc kubenswrapper[4947]: I1203 07:07:45.410299 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:07:45 crc kubenswrapper[4947]: I1203 07:07:45.560581 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:07:45 crc kubenswrapper[4947]: I1203 07:07:45.628045 4947 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-db7757ddc-8rn2t"] Dec 03 07:07:45 crc kubenswrapper[4947]: I1203 07:07:45.633508 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" podUID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" containerName="dnsmasq-dns" containerID="cri-o://f9c134c77017015ef8853a81ee9f495b3e2373e8224b36135f9326f3cb5df6a6" gracePeriod=10 Dec 03 07:07:45 crc kubenswrapper[4947]: I1203 07:07:45.638815 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:46 crc kubenswrapper[4947]: I1203 07:07:46.420311 4947 generic.go:334] "Generic (PLEG): container finished" podID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" containerID="f9c134c77017015ef8853a81ee9f495b3e2373e8224b36135f9326f3cb5df6a6" exitCode=0 Dec 03 07:07:46 crc kubenswrapper[4947]: I1203 07:07:46.420448 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" event={"ID":"cf491c7f-b173-445e-b5d5-e4fbd838ab64","Type":"ContainerDied","Data":"f9c134c77017015ef8853a81ee9f495b3e2373e8224b36135f9326f3cb5df6a6"} Dec 03 07:07:46 crc kubenswrapper[4947]: I1203 07:07:46.586497 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" podUID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.113:5353: connect: connection refused" Dec 03 07:07:47 crc kubenswrapper[4947]: I1203 07:07:47.429920 4947 generic.go:334] "Generic (PLEG): container finished" podID="fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" containerID="73f4afbf254709699e2004489bb7cfd91169baaf41f65098687b729bf3f60e60" exitCode=0 Dec 03 07:07:47 crc kubenswrapper[4947]: I1203 07:07:47.429961 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8","Type":"ContainerDied","Data":"73f4afbf254709699e2004489bb7cfd91169baaf41f65098687b729bf3f60e60"} Dec 03 07:07:48 crc kubenswrapper[4947]: I1203 07:07:48.865557 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:48 crc kubenswrapper[4947]: I1203 07:07:48.974939 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-dns-svc\") pod \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " Dec 03 07:07:48 crc kubenswrapper[4947]: I1203 07:07:48.975354 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-nb\") pod \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " Dec 03 07:07:48 crc kubenswrapper[4947]: I1203 07:07:48.975422 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-sb\") pod \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " Dec 03 07:07:48 crc kubenswrapper[4947]: I1203 07:07:48.975564 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-config\") pod \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " Dec 03 07:07:48 crc kubenswrapper[4947]: I1203 07:07:48.975650 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xgmq\" (UniqueName: \"kubernetes.io/projected/cf491c7f-b173-445e-b5d5-e4fbd838ab64-kube-api-access-6xgmq\") 
pod \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " Dec 03 07:07:48 crc kubenswrapper[4947]: I1203 07:07:48.982061 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf491c7f-b173-445e-b5d5-e4fbd838ab64-kube-api-access-6xgmq" (OuterVolumeSpecName: "kube-api-access-6xgmq") pod "cf491c7f-b173-445e-b5d5-e4fbd838ab64" (UID: "cf491c7f-b173-445e-b5d5-e4fbd838ab64"). InnerVolumeSpecName "kube-api-access-6xgmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.077187 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xgmq\" (UniqueName: \"kubernetes.io/projected/cf491c7f-b173-445e-b5d5-e4fbd838ab64-kube-api-access-6xgmq\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.095955 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-config" (OuterVolumeSpecName: "config") pod "cf491c7f-b173-445e-b5d5-e4fbd838ab64" (UID: "cf491c7f-b173-445e-b5d5-e4fbd838ab64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.101567 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cf491c7f-b173-445e-b5d5-e4fbd838ab64" (UID: "cf491c7f-b173-445e-b5d5-e4fbd838ab64"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:49 crc kubenswrapper[4947]: E1203 07:07:49.102508 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-dns-svc podName:cf491c7f-b173-445e-b5d5-e4fbd838ab64 nodeName:}" failed. 
No retries permitted until 2025-12-03 07:07:49.602466258 +0000 UTC m=+1130.863420704 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-dns-svc") pod "cf491c7f-b173-445e-b5d5-e4fbd838ab64" (UID: "cf491c7f-b173-445e-b5d5-e4fbd838ab64") : error deleting /var/lib/kubelet/pods/cf491c7f-b173-445e-b5d5-e4fbd838ab64/volume-subpaths: remove /var/lib/kubelet/pods/cf491c7f-b173-445e-b5d5-e4fbd838ab64/volume-subpaths: no such file or directory Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.102815 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cf491c7f-b173-445e-b5d5-e4fbd838ab64" (UID: "cf491c7f-b173-445e-b5d5-e4fbd838ab64"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.179078 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.179108 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.179116 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.464139 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jp2pf" 
event={"ID":"58224663-92bc-4143-ad66-3ce51e606d86","Type":"ContainerStarted","Data":"c04ea9b7a8fa051bc8f8f6e63fce2b71c7ecf7839a65f3ca53d45b4449c04110"} Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.467799 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6cg4w" event={"ID":"3496bac7-6b31-4ba8-a490-14bff1522b8c","Type":"ContainerStarted","Data":"4768422c21501889e7e1951ef8537cbf19f01b02b829b137e7fb9d8dc5766658"} Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.474019 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" event={"ID":"cf491c7f-b173-445e-b5d5-e4fbd838ab64","Type":"ContainerDied","Data":"15d6a1d13f2534efedca36fb7339c3d5142b7d7a892353f9008c4ac09314cea6"} Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.474092 4947 scope.go:117] "RemoveContainer" containerID="f9c134c77017015ef8853a81ee9f495b3e2373e8224b36135f9326f3cb5df6a6" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.474260 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db7757ddc-8rn2t" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.478267 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6c475475-916c-4267-8064-f932c04d0df2","Type":"ContainerStarted","Data":"909ef5b644a4ef4623502b6cd9ac440ea6dcc840b5372f80ecaa44d612f7650b"} Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.481739 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"90b682d1-68e6-49a4-83a6-51b1b40b7e99","Type":"ContainerStarted","Data":"a00403c9e5c9c0356f8cda0bb2f5ea33101133e03ecec9269cd5a9a058bc1298"} Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.502744 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8","Type":"ContainerStarted","Data":"a97cfb598af771fadb5726f8c4ca3b03aba98f58c7aa78e79de0f3b339e5fad7"} Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.508879 4947 generic.go:334] "Generic (PLEG): container finished" podID="dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" containerID="4a757f605dd0032d26474e5a8df58f33e3b89eab2725fe96b4bea59fa90b65d0" exitCode=0 Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.508976 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec","Type":"ContainerDied","Data":"4a757f605dd0032d26474e5a8df58f33e3b89eab2725fe96b4bea59fa90b65d0"} Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.510332 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jp2pf" podStartSLOduration=2.391891924 podStartE2EDuration="13.510316914s" podCreationTimestamp="2025-12-03 07:07:36 +0000 UTC" firstStartedPulling="2025-12-03 07:07:37.631551524 +0000 UTC m=+1118.892505950" lastFinishedPulling="2025-12-03 07:07:48.749976514 +0000 UTC 
m=+1130.010930940" observedRunningTime="2025-12-03 07:07:49.491950959 +0000 UTC m=+1130.752905425" watchObservedRunningTime="2025-12-03 07:07:49.510316914 +0000 UTC m=+1130.771271350" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.549453 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6cg4w" podStartSLOduration=7.810501008 podStartE2EDuration="19.549397637s" podCreationTimestamp="2025-12-03 07:07:30 +0000 UTC" firstStartedPulling="2025-12-03 07:07:37.10768536 +0000 UTC m=+1118.368639786" lastFinishedPulling="2025-12-03 07:07:48.846581979 +0000 UTC m=+1130.107536415" observedRunningTime="2025-12-03 07:07:49.52056135 +0000 UTC m=+1130.781515806" watchObservedRunningTime="2025-12-03 07:07:49.549397637 +0000 UTC m=+1130.810352083" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.572446 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.938812381 podStartE2EDuration="29.572416098s" podCreationTimestamp="2025-12-03 07:07:20 +0000 UTC" firstStartedPulling="2025-12-03 07:07:27.231907652 +0000 UTC m=+1108.492862088" lastFinishedPulling="2025-12-03 07:07:48.865511379 +0000 UTC m=+1130.126465805" observedRunningTime="2025-12-03 07:07:49.564388121 +0000 UTC m=+1130.825342587" watchObservedRunningTime="2025-12-03 07:07:49.572416098 +0000 UTC m=+1130.833370534" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.597062 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.665566866 podStartE2EDuration="29.597033162s" podCreationTimestamp="2025-12-03 07:07:20 +0000 UTC" firstStartedPulling="2025-12-03 07:07:27.935250055 +0000 UTC m=+1109.196204481" lastFinishedPulling="2025-12-03 07:07:48.866716351 +0000 UTC m=+1130.127670777" observedRunningTime="2025-12-03 07:07:49.593822754 +0000 UTC m=+1130.854777200" watchObservedRunningTime="2025-12-03 
07:07:49.597033162 +0000 UTC m=+1130.857987648" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.659864 4947 scope.go:117] "RemoveContainer" containerID="6e6fcff7155477943a004007fa2e33bc97a6045325d08896ed2ee5494ee97d1d" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.663110 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=29.89730127 podStartE2EDuration="38.663093942s" podCreationTimestamp="2025-12-03 07:07:11 +0000 UTC" firstStartedPulling="2025-12-03 07:07:27.225098808 +0000 UTC m=+1108.486053234" lastFinishedPulling="2025-12-03 07:07:35.99089148 +0000 UTC m=+1117.251845906" observedRunningTime="2025-12-03 07:07:49.655334103 +0000 UTC m=+1130.916288569" watchObservedRunningTime="2025-12-03 07:07:49.663093942 +0000 UTC m=+1130.924048368" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.688292 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-dns-svc\") pod \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\" (UID: \"cf491c7f-b173-445e-b5d5-e4fbd838ab64\") " Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.688914 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf491c7f-b173-445e-b5d5-e4fbd838ab64" (UID: "cf491c7f-b173-445e-b5d5-e4fbd838ab64"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.689113 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf491c7f-b173-445e-b5d5-e4fbd838ab64-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.806250 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-8rn2t"] Dec 03 07:07:49 crc kubenswrapper[4947]: I1203 07:07:49.811866 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db7757ddc-8rn2t"] Dec 03 07:07:50 crc kubenswrapper[4947]: I1203 07:07:50.527744 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec","Type":"ContainerStarted","Data":"58133114584f0b390c627b1710fda7f019565ce9036c5e41a7ab89f063b5168d"} Dec 03 07:07:50 crc kubenswrapper[4947]: I1203 07:07:50.555735 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=31.415017008 podStartE2EDuration="40.555716168s" podCreationTimestamp="2025-12-03 07:07:10 +0000 UTC" firstStartedPulling="2025-12-03 07:07:27.89726019 +0000 UTC m=+1109.158214616" lastFinishedPulling="2025-12-03 07:07:37.03795935 +0000 UTC m=+1118.298913776" observedRunningTime="2025-12-03 07:07:50.554402472 +0000 UTC m=+1131.815356918" watchObservedRunningTime="2025-12-03 07:07:50.555716168 +0000 UTC m=+1131.816670594" Dec 03 07:07:51 crc kubenswrapper[4947]: I1203 07:07:51.095380 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" path="/var/lib/kubelet/pods/cf491c7f-b173-445e-b5d5-e4fbd838ab64/volumes" Dec 03 07:07:51 crc kubenswrapper[4947]: I1203 07:07:51.670797 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 07:07:51 crc 
kubenswrapper[4947]: I1203 07:07:51.670870 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 07:07:51 crc kubenswrapper[4947]: I1203 07:07:51.676764 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:51 crc kubenswrapper[4947]: I1203 07:07:51.676799 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:51 crc kubenswrapper[4947]: I1203 07:07:51.737874 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.231424 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.231799 4947 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.231854 4947 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.231949 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift podName:5dc8a280-5a18-41fd-8e61-f51afa973d20 nodeName:}" failed. No retries permitted until 2025-12-03 07:08:08.23192089 +0000 UTC m=+1149.492875356 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift") pod "swift-storage-0" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20") : configmap "swift-ring-files" not found Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.250731 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.250773 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.299221 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.592594 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.604918 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.890422 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.890821 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" containerName="dnsmasq-dns" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.890835 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" containerName="dnsmasq-dns" Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.890852 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62dc1af0-3dad-4341-b5ff-b5530fdaa78c" containerName="init" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.890859 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="62dc1af0-3dad-4341-b5ff-b5530fdaa78c" 
containerName="init" Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.890875 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" containerName="dnsmasq-dns" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.890885 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" containerName="dnsmasq-dns" Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.890910 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" containerName="init" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.890917 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" containerName="init" Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.890926 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b40298-733b-476a-aeca-4c0025a86a98" containerName="dnsmasq-dns" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.890932 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b40298-733b-476a-aeca-4c0025a86a98" containerName="dnsmasq-dns" Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.890941 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" containerName="init" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.890948 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" containerName="init" Dec 03 07:07:52 crc kubenswrapper[4947]: E1203 07:07:52.890962 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1b40298-733b-476a-aeca-4c0025a86a98" containerName="init" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.890968 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b40298-733b-476a-aeca-4c0025a86a98" containerName="init" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.891193 4947 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cf491c7f-b173-445e-b5d5-e4fbd838ab64" containerName="dnsmasq-dns" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.891214 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1b40298-733b-476a-aeca-4c0025a86a98" containerName="dnsmasq-dns" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.891228 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8ddeca6-ac51-4af7-8b3c-6c851b1c415a" containerName="dnsmasq-dns" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.891240 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="62dc1af0-3dad-4341-b5ff-b5530fdaa78c" containerName="init" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.893255 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.897291 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.897677 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-mnpw7" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.897980 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.900694 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 07:07:52 crc kubenswrapper[4947]: I1203 07:07:52.922425 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.054986 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzbmz\" (UniqueName: 
\"kubernetes.io/projected/855f6436-68f0-42d8-a12a-bf25632440c1-kube-api-access-zzbmz\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.055102 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.055139 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.055308 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-scripts\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.055547 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.055590 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-combined-ca-bundle\") pod \"ovn-northd-0\" 
(UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.055896 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-config\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.157594 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzbmz\" (UniqueName: \"kubernetes.io/projected/855f6436-68f0-42d8-a12a-bf25632440c1-kube-api-access-zzbmz\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.157652 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.157675 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.157724 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-scripts\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.157826 4947 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.157853 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.157932 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-config\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.158197 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.158769 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-config\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.158942 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-scripts\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 
07:07:53.171997 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.172614 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.172714 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.177244 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzbmz\" (UniqueName: \"kubernetes.io/projected/855f6436-68f0-42d8-a12a-bf25632440c1-kube-api-access-zzbmz\") pod \"ovn-northd-0\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.211627 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.220251 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.220297 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:53 crc kubenswrapper[4947]: I1203 07:07:53.700312 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 07:07:54 crc kubenswrapper[4947]: I1203 07:07:54.478575 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:54 crc kubenswrapper[4947]: I1203 07:07:54.573742 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"855f6436-68f0-42d8-a12a-bf25632440c1","Type":"ContainerStarted","Data":"b8f6ad072d8a430d3f397408cbb15abd6578dd51c4ed60e99bd7561ae1e6ae4d"} Dec 03 07:07:54 crc kubenswrapper[4947]: I1203 07:07:54.599839 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 07:07:55 crc kubenswrapper[4947]: I1203 07:07:55.588091 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"855f6436-68f0-42d8-a12a-bf25632440c1","Type":"ContainerStarted","Data":"9d943d955b4387689a8d712809d10af11da1db79e92a195ba57d31cce0773125"} Dec 03 07:07:55 crc kubenswrapper[4947]: I1203 07:07:55.588708 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"855f6436-68f0-42d8-a12a-bf25632440c1","Type":"ContainerStarted","Data":"9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e"} Dec 03 07:07:55 crc kubenswrapper[4947]: I1203 07:07:55.588731 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 07:07:56 crc 
kubenswrapper[4947]: I1203 07:07:56.594786 4947 generic.go:334] "Generic (PLEG): container finished" podID="58224663-92bc-4143-ad66-3ce51e606d86" containerID="c04ea9b7a8fa051bc8f8f6e63fce2b71c7ecf7839a65f3ca53d45b4449c04110" exitCode=0 Dec 03 07:07:56 crc kubenswrapper[4947]: I1203 07:07:56.594883 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jp2pf" event={"ID":"58224663-92bc-4143-ad66-3ce51e606d86","Type":"ContainerDied","Data":"c04ea9b7a8fa051bc8f8f6e63fce2b71c7ecf7839a65f3ca53d45b4449c04110"} Dec 03 07:07:56 crc kubenswrapper[4947]: I1203 07:07:56.609732 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.328349601 podStartE2EDuration="4.609711477s" podCreationTimestamp="2025-12-03 07:07:52 +0000 UTC" firstStartedPulling="2025-12-03 07:07:53.710335609 +0000 UTC m=+1134.971290045" lastFinishedPulling="2025-12-03 07:07:54.991697495 +0000 UTC m=+1136.252651921" observedRunningTime="2025-12-03 07:07:55.605968516 +0000 UTC m=+1136.866922942" watchObservedRunningTime="2025-12-03 07:07:56.609711477 +0000 UTC m=+1137.870665913" Dec 03 07:07:57 crc kubenswrapper[4947]: I1203 07:07:57.776185 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 07:07:57 crc kubenswrapper[4947]: I1203 07:07:57.887930 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.031549 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.142194 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-ring-data-devices\") pod \"58224663-92bc-4143-ad66-3ce51e606d86\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.142788 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-swiftconf\") pod \"58224663-92bc-4143-ad66-3ce51e606d86\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.142738 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "58224663-92bc-4143-ad66-3ce51e606d86" (UID: "58224663-92bc-4143-ad66-3ce51e606d86"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.142871 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-combined-ca-bundle\") pod \"58224663-92bc-4143-ad66-3ce51e606d86\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.143450 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-dispersionconf\") pod \"58224663-92bc-4143-ad66-3ce51e606d86\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.143580 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l8xh\" (UniqueName: \"kubernetes.io/projected/58224663-92bc-4143-ad66-3ce51e606d86-kube-api-access-5l8xh\") pod \"58224663-92bc-4143-ad66-3ce51e606d86\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.143624 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-scripts\") pod \"58224663-92bc-4143-ad66-3ce51e606d86\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.143709 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58224663-92bc-4143-ad66-3ce51e606d86-etc-swift\") pod \"58224663-92bc-4143-ad66-3ce51e606d86\" (UID: \"58224663-92bc-4143-ad66-3ce51e606d86\") " Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.144168 4947 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.144962 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58224663-92bc-4143-ad66-3ce51e606d86-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "58224663-92bc-4143-ad66-3ce51e606d86" (UID: "58224663-92bc-4143-ad66-3ce51e606d86"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.148255 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58224663-92bc-4143-ad66-3ce51e606d86-kube-api-access-5l8xh" (OuterVolumeSpecName: "kube-api-access-5l8xh") pod "58224663-92bc-4143-ad66-3ce51e606d86" (UID: "58224663-92bc-4143-ad66-3ce51e606d86"). InnerVolumeSpecName "kube-api-access-5l8xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.154977 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "58224663-92bc-4143-ad66-3ce51e606d86" (UID: "58224663-92bc-4143-ad66-3ce51e606d86"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.167699 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58224663-92bc-4143-ad66-3ce51e606d86" (UID: "58224663-92bc-4143-ad66-3ce51e606d86"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.170710 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "58224663-92bc-4143-ad66-3ce51e606d86" (UID: "58224663-92bc-4143-ad66-3ce51e606d86"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.175067 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-scripts" (OuterVolumeSpecName: "scripts") pod "58224663-92bc-4143-ad66-3ce51e606d86" (UID: "58224663-92bc-4143-ad66-3ce51e606d86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.245504 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l8xh\" (UniqueName: \"kubernetes.io/projected/58224663-92bc-4143-ad66-3ce51e606d86-kube-api-access-5l8xh\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.245549 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58224663-92bc-4143-ad66-3ce51e606d86-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.245562 4947 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58224663-92bc-4143-ad66-3ce51e606d86-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.245575 4947 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 
07:07:58.245586 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.245597 4947 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58224663-92bc-4143-ad66-3ce51e606d86-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.616326 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jp2pf" Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.616386 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jp2pf" event={"ID":"58224663-92bc-4143-ad66-3ce51e606d86","Type":"ContainerDied","Data":"903c8b4cdda0e73f00503bff8ad3b6fbd9447faf2317bc01c93206f4e36af074"} Dec 03 07:07:58 crc kubenswrapper[4947]: I1203 07:07:58.616438 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="903c8b4cdda0e73f00503bff8ad3b6fbd9447faf2317bc01c93206f4e36af074" Dec 03 07:08:01 crc kubenswrapper[4947]: I1203 07:08:01.646385 4947 generic.go:334] "Generic (PLEG): container finished" podID="5367165f-75ec-4633-8042-edfe91e3be60" containerID="3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0" exitCode=0 Dec 03 07:08:01 crc kubenswrapper[4947]: I1203 07:08:01.646762 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5367165f-75ec-4633-8042-edfe91e3be60","Type":"ContainerDied","Data":"3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0"} Dec 03 07:08:01 crc kubenswrapper[4947]: I1203 07:08:01.652811 4947 generic.go:334] "Generic (PLEG): container finished" podID="51d7ef1d-a0bf-465f-baad-1bc3a71618ff" 
containerID="f8e964bb562c5817c758d9180e2467c30d1c58be4f6c8c81c568b4e200107ce9" exitCode=0 Dec 03 07:08:01 crc kubenswrapper[4947]: I1203 07:08:01.652888 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51d7ef1d-a0bf-465f-baad-1bc3a71618ff","Type":"ContainerDied","Data":"f8e964bb562c5817c758d9180e2467c30d1c58be4f6c8c81c568b4e200107ce9"} Dec 03 07:08:02 crc kubenswrapper[4947]: I1203 07:08:02.666485 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5367165f-75ec-4633-8042-edfe91e3be60","Type":"ContainerStarted","Data":"de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26"} Dec 03 07:08:02 crc kubenswrapper[4947]: I1203 07:08:02.666989 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:08:02 crc kubenswrapper[4947]: I1203 07:08:02.670863 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51d7ef1d-a0bf-465f-baad-1bc3a71618ff","Type":"ContainerStarted","Data":"23a38b8a1f4cbbe4e0977fa37f88e153dc49ca4d190b8ca3be90add4e540bbe0"} Dec 03 07:08:02 crc kubenswrapper[4947]: I1203 07:08:02.671242 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 07:08:02 crc kubenswrapper[4947]: I1203 07:08:02.697402 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=43.322842941 podStartE2EDuration="54.697388887s" podCreationTimestamp="2025-12-03 07:07:08 +0000 UTC" firstStartedPulling="2025-12-03 07:07:16.59599876 +0000 UTC m=+1097.856953186" lastFinishedPulling="2025-12-03 07:07:27.970544706 +0000 UTC m=+1109.231499132" observedRunningTime="2025-12-03 07:08:02.691362885 +0000 UTC m=+1143.952317311" watchObservedRunningTime="2025-12-03 07:08:02.697388887 +0000 UTC m=+1143.958343313" Dec 03 07:08:03 crc 
kubenswrapper[4947]: I1203 07:08:03.404196 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=44.053850038 podStartE2EDuration="55.404176142s" podCreationTimestamp="2025-12-03 07:07:08 +0000 UTC" firstStartedPulling="2025-12-03 07:07:16.596539924 +0000 UTC m=+1097.857494360" lastFinishedPulling="2025-12-03 07:07:27.946866038 +0000 UTC m=+1109.207820464" observedRunningTime="2025-12-03 07:08:02.729128053 +0000 UTC m=+1143.990082509" watchObservedRunningTime="2025-12-03 07:08:03.404176142 +0000 UTC m=+1144.665130558" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.409932 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7pbrr"] Dec 03 07:08:03 crc kubenswrapper[4947]: E1203 07:08:03.410220 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58224663-92bc-4143-ad66-3ce51e606d86" containerName="swift-ring-rebalance" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.410235 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="58224663-92bc-4143-ad66-3ce51e606d86" containerName="swift-ring-rebalance" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.410389 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="58224663-92bc-4143-ad66-3ce51e606d86" containerName="swift-ring-rebalance" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.410904 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.424550 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7pbrr"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.445424 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnzps\" (UniqueName: \"kubernetes.io/projected/b720d360-8afc-419d-b5e2-49259161a9ea-kube-api-access-hnzps\") pod \"keystone-db-create-7pbrr\" (UID: \"b720d360-8afc-419d-b5e2-49259161a9ea\") " pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.445617 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720d360-8afc-419d-b5e2-49259161a9ea-operator-scripts\") pod \"keystone-db-create-7pbrr\" (UID: \"b720d360-8afc-419d-b5e2-49259161a9ea\") " pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.539166 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-cqwr9"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.540596 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.547360 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnzps\" (UniqueName: \"kubernetes.io/projected/b720d360-8afc-419d-b5e2-49259161a9ea-kube-api-access-hnzps\") pod \"keystone-db-create-7pbrr\" (UID: \"b720d360-8afc-419d-b5e2-49259161a9ea\") " pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.547430 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b4e915-36d1-4586-96af-0abb8c1b9246-operator-scripts\") pod \"placement-db-create-cqwr9\" (UID: \"b6b4e915-36d1-4586-96af-0abb8c1b9246\") " pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.547457 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8npq\" (UniqueName: \"kubernetes.io/projected/b6b4e915-36d1-4586-96af-0abb8c1b9246-kube-api-access-r8npq\") pod \"placement-db-create-cqwr9\" (UID: \"b6b4e915-36d1-4586-96af-0abb8c1b9246\") " pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.547530 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720d360-8afc-419d-b5e2-49259161a9ea-operator-scripts\") pod \"keystone-db-create-7pbrr\" (UID: \"b720d360-8afc-419d-b5e2-49259161a9ea\") " pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.548461 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720d360-8afc-419d-b5e2-49259161a9ea-operator-scripts\") pod \"keystone-db-create-7pbrr\" (UID: 
\"b720d360-8afc-419d-b5e2-49259161a9ea\") " pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.595291 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f419-account-create-update-9zkp8"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.596353 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnzps\" (UniqueName: \"kubernetes.io/projected/b720d360-8afc-419d-b5e2-49259161a9ea-kube-api-access-hnzps\") pod \"keystone-db-create-7pbrr\" (UID: \"b720d360-8afc-419d-b5e2-49259161a9ea\") " pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.596623 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.601610 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.625715 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cqwr9"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.630829 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f419-account-create-update-9zkp8"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.649132 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8n8h\" (UniqueName: \"kubernetes.io/projected/a19d5562-e925-4567-94d0-001807fda043-kube-api-access-z8n8h\") pod \"keystone-f419-account-create-update-9zkp8\" (UID: \"a19d5562-e925-4567-94d0-001807fda043\") " pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.649177 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b6b4e915-36d1-4586-96af-0abb8c1b9246-operator-scripts\") pod \"placement-db-create-cqwr9\" (UID: \"b6b4e915-36d1-4586-96af-0abb8c1b9246\") " pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.649202 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8npq\" (UniqueName: \"kubernetes.io/projected/b6b4e915-36d1-4586-96af-0abb8c1b9246-kube-api-access-r8npq\") pod \"placement-db-create-cqwr9\" (UID: \"b6b4e915-36d1-4586-96af-0abb8c1b9246\") " pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.649520 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a19d5562-e925-4567-94d0-001807fda043-operator-scripts\") pod \"keystone-f419-account-create-update-9zkp8\" (UID: \"a19d5562-e925-4567-94d0-001807fda043\") " pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.650026 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b4e915-36d1-4586-96af-0abb8c1b9246-operator-scripts\") pod \"placement-db-create-cqwr9\" (UID: \"b6b4e915-36d1-4586-96af-0abb8c1b9246\") " pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.656667 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-bdf2-account-create-update-ckdwp"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.658613 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.661706 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.667783 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8npq\" (UniqueName: \"kubernetes.io/projected/b6b4e915-36d1-4586-96af-0abb8c1b9246-kube-api-access-r8npq\") pod \"placement-db-create-cqwr9\" (UID: \"b6b4e915-36d1-4586-96af-0abb8c1b9246\") " pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.677503 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bdf2-account-create-update-ckdwp"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.726242 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.751197 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8n8h\" (UniqueName: \"kubernetes.io/projected/a19d5562-e925-4567-94d0-001807fda043-kube-api-access-z8n8h\") pod \"keystone-f419-account-create-update-9zkp8\" (UID: \"a19d5562-e925-4567-94d0-001807fda043\") " pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.751325 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a19d5562-e925-4567-94d0-001807fda043-operator-scripts\") pod \"keystone-f419-account-create-update-9zkp8\" (UID: \"a19d5562-e925-4567-94d0-001807fda043\") " pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.752198 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/a19d5562-e925-4567-94d0-001807fda043-operator-scripts\") pod \"keystone-f419-account-create-update-9zkp8\" (UID: \"a19d5562-e925-4567-94d0-001807fda043\") " pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.772077 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8n8h\" (UniqueName: \"kubernetes.io/projected/a19d5562-e925-4567-94d0-001807fda043-kube-api-access-z8n8h\") pod \"keystone-f419-account-create-update-9zkp8\" (UID: \"a19d5562-e925-4567-94d0-001807fda043\") " pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.843088 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-m8d6g"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.844659 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.852812 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411f9305-440e-4f16-9f83-004561707000-operator-scripts\") pod \"placement-bdf2-account-create-update-ckdwp\" (UID: \"411f9305-440e-4f16-9f83-004561707000\") " pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.853088 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnc7t\" (UniqueName: \"kubernetes.io/projected/411f9305-440e-4f16-9f83-004561707000-kube-api-access-mnc7t\") pod \"placement-bdf2-account-create-update-ckdwp\" (UID: \"411f9305-440e-4f16-9f83-004561707000\") " pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.857634 4947 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-db-create-m8d6g"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.941986 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-93c8-account-create-update-7nq7f"] Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.943934 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.952050 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.961167 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnc7t\" (UniqueName: \"kubernetes.io/projected/411f9305-440e-4f16-9f83-004561707000-kube-api-access-mnc7t\") pod \"placement-bdf2-account-create-update-ckdwp\" (UID: \"411f9305-440e-4f16-9f83-004561707000\") " pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.961221 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-operator-scripts\") pod \"glance-db-create-m8d6g\" (UID: \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\") " pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.961305 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411f9305-440e-4f16-9f83-004561707000-operator-scripts\") pod \"placement-bdf2-account-create-update-ckdwp\" (UID: \"411f9305-440e-4f16-9f83-004561707000\") " pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.961399 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-2dbws\" (UniqueName: \"kubernetes.io/projected/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-kube-api-access-2dbws\") pod \"glance-db-create-m8d6g\" (UID: \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\") " pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.963793 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411f9305-440e-4f16-9f83-004561707000-operator-scripts\") pod \"placement-bdf2-account-create-update-ckdwp\" (UID: \"411f9305-440e-4f16-9f83-004561707000\") " pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.965653 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:03 crc kubenswrapper[4947]: I1203 07:08:03.969909 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-93c8-account-create-update-7nq7f"] Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.018719 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnc7t\" (UniqueName: \"kubernetes.io/projected/411f9305-440e-4f16-9f83-004561707000-kube-api-access-mnc7t\") pod \"placement-bdf2-account-create-update-ckdwp\" (UID: \"411f9305-440e-4f16-9f83-004561707000\") " pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.022428 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.028549 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7pbrr"] Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.028635 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.063681 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dbws\" (UniqueName: \"kubernetes.io/projected/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-kube-api-access-2dbws\") pod \"glance-db-create-m8d6g\" (UID: \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\") " pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.063740 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-operator-scripts\") pod \"glance-db-create-m8d6g\" (UID: \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\") " pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.063783 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn585\" (UniqueName: \"kubernetes.io/projected/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-kube-api-access-kn585\") pod \"glance-93c8-account-create-update-7nq7f\" (UID: \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\") " pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.063822 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-operator-scripts\") pod \"glance-93c8-account-create-update-7nq7f\" (UID: \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\") " pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.065825 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-operator-scripts\") pod 
\"glance-db-create-m8d6g\" (UID: \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\") " pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.078843 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dbws\" (UniqueName: \"kubernetes.io/projected/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-kube-api-access-2dbws\") pod \"glance-db-create-m8d6g\" (UID: \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\") " pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.166094 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn585\" (UniqueName: \"kubernetes.io/projected/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-kube-api-access-kn585\") pod \"glance-93c8-account-create-update-7nq7f\" (UID: \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\") " pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.166181 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-operator-scripts\") pod \"glance-93c8-account-create-update-7nq7f\" (UID: \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\") " pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.167638 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-operator-scripts\") pod \"glance-93c8-account-create-update-7nq7f\" (UID: \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\") " pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.177312 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.187743 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn585\" (UniqueName: \"kubernetes.io/projected/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-kube-api-access-kn585\") pod \"glance-93c8-account-create-update-7nq7f\" (UID: \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\") " pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.269330 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.471730 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-cqwr9"] Dec 03 07:08:04 crc kubenswrapper[4947]: W1203 07:08:04.494618 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6b4e915_36d1_4586_96af_0abb8c1b9246.slice/crio-e42211fc185935960df0ea65bfc5351c1836350443b738c69139b0f2e64a37be WatchSource:0}: Error finding container e42211fc185935960df0ea65bfc5351c1836350443b738c69139b0f2e64a37be: Status 404 returned error can't find the container with id e42211fc185935960df0ea65bfc5351c1836350443b738c69139b0f2e64a37be Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.558283 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-bdf2-account-create-update-ckdwp"] Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.617749 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f419-account-create-update-9zkp8"] Dec 03 07:08:04 crc kubenswrapper[4947]: W1203 07:08:04.637613 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda19d5562_e925_4567_94d0_001807fda043.slice/crio-236647105a87b49963eee14d686fe0786bdc0555bb22d9d84e9d3e307894bc1e WatchSource:0}: Error finding container 236647105a87b49963eee14d686fe0786bdc0555bb22d9d84e9d3e307894bc1e: Status 404 returned error can't find the container with id 236647105a87b49963eee14d686fe0786bdc0555bb22d9d84e9d3e307894bc1e Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.689878 4947 generic.go:334] "Generic (PLEG): container finished" podID="b720d360-8afc-419d-b5e2-49259161a9ea" containerID="5ac2f4be7b9ddfe24a426f374335c91d29dea24a151ca75482bb4d5c943f1c1f" exitCode=0 Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.689956 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7pbrr" event={"ID":"b720d360-8afc-419d-b5e2-49259161a9ea","Type":"ContainerDied","Data":"5ac2f4be7b9ddfe24a426f374335c91d29dea24a151ca75482bb4d5c943f1c1f"} Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.689986 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7pbrr" event={"ID":"b720d360-8afc-419d-b5e2-49259161a9ea","Type":"ContainerStarted","Data":"4312b507bf7917ec296e192fefb648ce9938a96520a0f24b15d1e6635f0dcf5f"} Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.696282 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f419-account-create-update-9zkp8" event={"ID":"a19d5562-e925-4567-94d0-001807fda043","Type":"ContainerStarted","Data":"236647105a87b49963eee14d686fe0786bdc0555bb22d9d84e9d3e307894bc1e"} Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.697453 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bdf2-account-create-update-ckdwp" event={"ID":"411f9305-440e-4f16-9f83-004561707000","Type":"ContainerStarted","Data":"075ff666018b77dfa39f79bbdd0f4127053f993cac87a40fd6e3c97f67e9a8d4"} Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 
07:08:04.698540 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cqwr9" event={"ID":"b6b4e915-36d1-4586-96af-0abb8c1b9246","Type":"ContainerStarted","Data":"7149a4572e5e5e3a063052ea9b7c1cd9fe182ceed0f5cd18d89d101e6199b317"} Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.698574 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cqwr9" event={"ID":"b6b4e915-36d1-4586-96af-0abb8c1b9246","Type":"ContainerStarted","Data":"e42211fc185935960df0ea65bfc5351c1836350443b738c69139b0f2e64a37be"} Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.737511 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-m8d6g"] Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.744298 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-cqwr9" podStartSLOduration=1.744280652 podStartE2EDuration="1.744280652s" podCreationTimestamp="2025-12-03 07:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:08:04.732460634 +0000 UTC m=+1145.993415060" watchObservedRunningTime="2025-12-03 07:08:04.744280652 +0000 UTC m=+1146.005235078" Dec 03 07:08:04 crc kubenswrapper[4947]: I1203 07:08:04.760692 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-93c8-account-create-update-7nq7f"] Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.710930 4947 generic.go:334] "Generic (PLEG): container finished" podID="b6b4e915-36d1-4586-96af-0abb8c1b9246" containerID="7149a4572e5e5e3a063052ea9b7c1cd9fe182ceed0f5cd18d89d101e6199b317" exitCode=0 Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.711048 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cqwr9" 
event={"ID":"b6b4e915-36d1-4586-96af-0abb8c1b9246","Type":"ContainerDied","Data":"7149a4572e5e5e3a063052ea9b7c1cd9fe182ceed0f5cd18d89d101e6199b317"} Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.713379 4947 generic.go:334] "Generic (PLEG): container finished" podID="a19d5562-e925-4567-94d0-001807fda043" containerID="cca036791174cd51f9120f826c271de5b4fa4462fa9eca0d1720c564b47c02f0" exitCode=0 Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.713451 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f419-account-create-update-9zkp8" event={"ID":"a19d5562-e925-4567-94d0-001807fda043","Type":"ContainerDied","Data":"cca036791174cd51f9120f826c271de5b4fa4462fa9eca0d1720c564b47c02f0"} Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.716138 4947 generic.go:334] "Generic (PLEG): container finished" podID="0f5b09f9-d943-4f40-ab4a-567f71d6b13f" containerID="fe361200f59f5f57f5be4b1d94e67c3de3026379cc88db2902131071988c9d9f" exitCode=0 Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.716182 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-93c8-account-create-update-7nq7f" event={"ID":"0f5b09f9-d943-4f40-ab4a-567f71d6b13f","Type":"ContainerDied","Data":"fe361200f59f5f57f5be4b1d94e67c3de3026379cc88db2902131071988c9d9f"} Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.716230 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-93c8-account-create-update-7nq7f" event={"ID":"0f5b09f9-d943-4f40-ab4a-567f71d6b13f","Type":"ContainerStarted","Data":"a21093c7854214ce721da0e430ec0273080336607a0d8637cc8b41400893828c"} Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.718738 4947 generic.go:334] "Generic (PLEG): container finished" podID="c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182" containerID="cec820629ebe723b81f827504cd18d3327c219c11a06e73c61a37a257b4fe002" exitCode=0 Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.718784 4947 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/glance-db-create-m8d6g" event={"ID":"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182","Type":"ContainerDied","Data":"cec820629ebe723b81f827504cd18d3327c219c11a06e73c61a37a257b4fe002"} Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.718823 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m8d6g" event={"ID":"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182","Type":"ContainerStarted","Data":"835a3706d4699cc1b5f369d42352b96f87842b4f36a54236d2e6036975269c0e"} Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.720549 4947 generic.go:334] "Generic (PLEG): container finished" podID="411f9305-440e-4f16-9f83-004561707000" containerID="54ffca1d739dbd65d8fc73257d9b164f24984c77619c97c08b04adf4a0aabcd3" exitCode=0 Dec 03 07:08:05 crc kubenswrapper[4947]: I1203 07:08:05.720589 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bdf2-account-create-update-ckdwp" event={"ID":"411f9305-440e-4f16-9f83-004561707000","Type":"ContainerDied","Data":"54ffca1d739dbd65d8fc73257d9b164f24984c77619c97c08b04adf4a0aabcd3"} Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.116809 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.210413 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720d360-8afc-419d-b5e2-49259161a9ea-operator-scripts\") pod \"b720d360-8afc-419d-b5e2-49259161a9ea\" (UID: \"b720d360-8afc-419d-b5e2-49259161a9ea\") " Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.210580 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnzps\" (UniqueName: \"kubernetes.io/projected/b720d360-8afc-419d-b5e2-49259161a9ea-kube-api-access-hnzps\") pod \"b720d360-8afc-419d-b5e2-49259161a9ea\" (UID: \"b720d360-8afc-419d-b5e2-49259161a9ea\") " Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.211295 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b720d360-8afc-419d-b5e2-49259161a9ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b720d360-8afc-419d-b5e2-49259161a9ea" (UID: "b720d360-8afc-419d-b5e2-49259161a9ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.216875 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b720d360-8afc-419d-b5e2-49259161a9ea-kube-api-access-hnzps" (OuterVolumeSpecName: "kube-api-access-hnzps") pod "b720d360-8afc-419d-b5e2-49259161a9ea" (UID: "b720d360-8afc-419d-b5e2-49259161a9ea"). InnerVolumeSpecName "kube-api-access-hnzps". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.313125 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnzps\" (UniqueName: \"kubernetes.io/projected/b720d360-8afc-419d-b5e2-49259161a9ea-kube-api-access-hnzps\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.313379 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b720d360-8afc-419d-b5e2-49259161a9ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.733772 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7pbrr" event={"ID":"b720d360-8afc-419d-b5e2-49259161a9ea","Type":"ContainerDied","Data":"4312b507bf7917ec296e192fefb648ce9938a96520a0f24b15d1e6635f0dcf5f"} Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.733837 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4312b507bf7917ec296e192fefb648ce9938a96520a0f24b15d1e6635f0dcf5f" Dec 03 07:08:06 crc kubenswrapper[4947]: I1203 07:08:06.733901 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7pbrr" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.164821 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.289948 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.295842 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.303043 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.304217 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.329449 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a19d5562-e925-4567-94d0-001807fda043-operator-scripts\") pod \"a19d5562-e925-4567-94d0-001807fda043\" (UID: \"a19d5562-e925-4567-94d0-001807fda043\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.329487 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8n8h\" (UniqueName: \"kubernetes.io/projected/a19d5562-e925-4567-94d0-001807fda043-kube-api-access-z8n8h\") pod \"a19d5562-e925-4567-94d0-001807fda043\" (UID: \"a19d5562-e925-4567-94d0-001807fda043\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.330564 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a19d5562-e925-4567-94d0-001807fda043-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a19d5562-e925-4567-94d0-001807fda043" (UID: "a19d5562-e925-4567-94d0-001807fda043"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.333508 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19d5562-e925-4567-94d0-001807fda043-kube-api-access-z8n8h" (OuterVolumeSpecName: "kube-api-access-z8n8h") pod "a19d5562-e925-4567-94d0-001807fda043" (UID: "a19d5562-e925-4567-94d0-001807fda043"). 
InnerVolumeSpecName "kube-api-access-z8n8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430276 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dbws\" (UniqueName: \"kubernetes.io/projected/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-kube-api-access-2dbws\") pod \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\" (UID: \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430325 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn585\" (UniqueName: \"kubernetes.io/projected/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-kube-api-access-kn585\") pod \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\" (UID: \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430359 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b4e915-36d1-4586-96af-0abb8c1b9246-operator-scripts\") pod \"b6b4e915-36d1-4586-96af-0abb8c1b9246\" (UID: \"b6b4e915-36d1-4586-96af-0abb8c1b9246\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430380 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-operator-scripts\") pod \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\" (UID: \"0f5b09f9-d943-4f40-ab4a-567f71d6b13f\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430413 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnc7t\" (UniqueName: \"kubernetes.io/projected/411f9305-440e-4f16-9f83-004561707000-kube-api-access-mnc7t\") pod \"411f9305-440e-4f16-9f83-004561707000\" (UID: \"411f9305-440e-4f16-9f83-004561707000\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430480 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8npq\" (UniqueName: \"kubernetes.io/projected/b6b4e915-36d1-4586-96af-0abb8c1b9246-kube-api-access-r8npq\") pod \"b6b4e915-36d1-4586-96af-0abb8c1b9246\" (UID: \"b6b4e915-36d1-4586-96af-0abb8c1b9246\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430554 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411f9305-440e-4f16-9f83-004561707000-operator-scripts\") pod \"411f9305-440e-4f16-9f83-004561707000\" (UID: \"411f9305-440e-4f16-9f83-004561707000\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430602 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-operator-scripts\") pod \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\" (UID: \"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182\") " Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430883 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a19d5562-e925-4567-94d0-001807fda043-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.430899 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8n8h\" (UniqueName: \"kubernetes.io/projected/a19d5562-e925-4567-94d0-001807fda043-kube-api-access-z8n8h\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.431145 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f5b09f9-d943-4f40-ab4a-567f71d6b13f" (UID: "0f5b09f9-d943-4f40-ab4a-567f71d6b13f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.431274 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182" (UID: "c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.431313 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/411f9305-440e-4f16-9f83-004561707000-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "411f9305-440e-4f16-9f83-004561707000" (UID: "411f9305-440e-4f16-9f83-004561707000"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.431696 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6b4e915-36d1-4586-96af-0abb8c1b9246-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6b4e915-36d1-4586-96af-0abb8c1b9246" (UID: "b6b4e915-36d1-4586-96af-0abb8c1b9246"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.433527 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-kube-api-access-kn585" (OuterVolumeSpecName: "kube-api-access-kn585") pod "0f5b09f9-d943-4f40-ab4a-567f71d6b13f" (UID: "0f5b09f9-d943-4f40-ab4a-567f71d6b13f"). InnerVolumeSpecName "kube-api-access-kn585". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.433592 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411f9305-440e-4f16-9f83-004561707000-kube-api-access-mnc7t" (OuterVolumeSpecName: "kube-api-access-mnc7t") pod "411f9305-440e-4f16-9f83-004561707000" (UID: "411f9305-440e-4f16-9f83-004561707000"). InnerVolumeSpecName "kube-api-access-mnc7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.434068 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b4e915-36d1-4586-96af-0abb8c1b9246-kube-api-access-r8npq" (OuterVolumeSpecName: "kube-api-access-r8npq") pod "b6b4e915-36d1-4586-96af-0abb8c1b9246" (UID: "b6b4e915-36d1-4586-96af-0abb8c1b9246"). InnerVolumeSpecName "kube-api-access-r8npq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.434471 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-kube-api-access-2dbws" (OuterVolumeSpecName: "kube-api-access-2dbws") pod "c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182" (UID: "c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182"). InnerVolumeSpecName "kube-api-access-2dbws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.532146 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dbws\" (UniqueName: \"kubernetes.io/projected/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-kube-api-access-2dbws\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.532183 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn585\" (UniqueName: \"kubernetes.io/projected/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-kube-api-access-kn585\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.532196 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6b4e915-36d1-4586-96af-0abb8c1b9246-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.532208 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f5b09f9-d943-4f40-ab4a-567f71d6b13f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.532221 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnc7t\" (UniqueName: \"kubernetes.io/projected/411f9305-440e-4f16-9f83-004561707000-kube-api-access-mnc7t\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.532233 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8npq\" (UniqueName: \"kubernetes.io/projected/b6b4e915-36d1-4586-96af-0abb8c1b9246-kube-api-access-r8npq\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.532244 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/411f9305-440e-4f16-9f83-004561707000-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.532255 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.748664 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-bdf2-account-create-update-ckdwp" event={"ID":"411f9305-440e-4f16-9f83-004561707000","Type":"ContainerDied","Data":"075ff666018b77dfa39f79bbdd0f4127053f993cac87a40fd6e3c97f67e9a8d4"} Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.748719 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="075ff666018b77dfa39f79bbdd0f4127053f993cac87a40fd6e3c97f67e9a8d4" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.748694 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-bdf2-account-create-update-ckdwp" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.751823 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-cqwr9" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.751831 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-cqwr9" event={"ID":"b6b4e915-36d1-4586-96af-0abb8c1b9246","Type":"ContainerDied","Data":"e42211fc185935960df0ea65bfc5351c1836350443b738c69139b0f2e64a37be"} Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.751886 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e42211fc185935960df0ea65bfc5351c1836350443b738c69139b0f2e64a37be" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.754572 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f419-account-create-update-9zkp8" event={"ID":"a19d5562-e925-4567-94d0-001807fda043","Type":"ContainerDied","Data":"236647105a87b49963eee14d686fe0786bdc0555bb22d9d84e9d3e307894bc1e"} Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.754620 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f419-account-create-update-9zkp8" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.754623 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="236647105a87b49963eee14d686fe0786bdc0555bb22d9d84e9d3e307894bc1e" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.757108 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-93c8-account-create-update-7nq7f" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.758124 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-93c8-account-create-update-7nq7f" event={"ID":"0f5b09f9-d943-4f40-ab4a-567f71d6b13f","Type":"ContainerDied","Data":"a21093c7854214ce721da0e430ec0273080336607a0d8637cc8b41400893828c"} Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.758172 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21093c7854214ce721da0e430ec0273080336607a0d8637cc8b41400893828c" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.760688 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-m8d6g" event={"ID":"c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182","Type":"ContainerDied","Data":"835a3706d4699cc1b5f369d42352b96f87842b4f36a54236d2e6036975269c0e"} Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.760739 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="835a3706d4699cc1b5f369d42352b96f87842b4f36a54236d2e6036975269c0e" Dec 03 07:08:07 crc kubenswrapper[4947]: I1203 07:08:07.760798 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-m8d6g" Dec 03 07:08:08 crc kubenswrapper[4947]: I1203 07:08:08.244603 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:08:08 crc kubenswrapper[4947]: I1203 07:08:08.249698 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"swift-storage-0\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " pod="openstack/swift-storage-0" Dec 03 07:08:08 crc kubenswrapper[4947]: I1203 07:08:08.267089 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 07:08:08 crc kubenswrapper[4947]: I1203 07:08:08.488021 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.053283 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 07:08:09 crc kubenswrapper[4947]: W1203 07:08:09.055009 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5dc8a280_5a18_41fd_8e61_f51afa973d20.slice/crio-d45e15ccf604f6a3fb83d5c85740fc19f409cd96d5fbd32aa1553f993c35e019 WatchSource:0}: Error finding container d45e15ccf604f6a3fb83d5c85740fc19f409cd96d5fbd32aa1553f993c35e019: Status 404 returned error can't find the container with id d45e15ccf604f6a3fb83d5c85740fc19f409cd96d5fbd32aa1553f993c35e019 Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.258367 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-7tpzk"] Dec 03 07:08:09 crc kubenswrapper[4947]: E1203 07:08:09.258755 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19d5562-e925-4567-94d0-001807fda043" containerName="mariadb-account-create-update" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.258769 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19d5562-e925-4567-94d0-001807fda043" containerName="mariadb-account-create-update" Dec 03 07:08:09 crc kubenswrapper[4947]: E1203 07:08:09.258792 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b720d360-8afc-419d-b5e2-49259161a9ea" containerName="mariadb-database-create" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.258800 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b720d360-8afc-419d-b5e2-49259161a9ea" containerName="mariadb-database-create" Dec 03 07:08:09 crc kubenswrapper[4947]: E1203 07:08:09.258820 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182" containerName="mariadb-database-create" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 
07:08:09.258828 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182" containerName="mariadb-database-create" Dec 03 07:08:09 crc kubenswrapper[4947]: E1203 07:08:09.258846 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f5b09f9-d943-4f40-ab4a-567f71d6b13f" containerName="mariadb-account-create-update" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.258855 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f5b09f9-d943-4f40-ab4a-567f71d6b13f" containerName="mariadb-account-create-update" Dec 03 07:08:09 crc kubenswrapper[4947]: E1203 07:08:09.258872 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b4e915-36d1-4586-96af-0abb8c1b9246" containerName="mariadb-database-create" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.258881 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b4e915-36d1-4586-96af-0abb8c1b9246" containerName="mariadb-database-create" Dec 03 07:08:09 crc kubenswrapper[4947]: E1203 07:08:09.258905 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411f9305-440e-4f16-9f83-004561707000" containerName="mariadb-account-create-update" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.258915 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="411f9305-440e-4f16-9f83-004561707000" containerName="mariadb-account-create-update" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.259103 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19d5562-e925-4567-94d0-001807fda043" containerName="mariadb-account-create-update" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.259122 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b720d360-8afc-419d-b5e2-49259161a9ea" containerName="mariadb-database-create" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.259140 4947 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0f5b09f9-d943-4f40-ab4a-567f71d6b13f" containerName="mariadb-account-create-update" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.259154 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182" containerName="mariadb-database-create" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.259167 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b4e915-36d1-4586-96af-0abb8c1b9246" containerName="mariadb-database-create" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.259183 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="411f9305-440e-4f16-9f83-004561707000" containerName="mariadb-account-create-update" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.259731 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7tpzk"] Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.259809 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.262962 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.265223 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cgkdm" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.365484 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-combined-ca-bundle\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.365557 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj8dp\" (UniqueName: 
\"kubernetes.io/projected/000a9c13-4796-4ef4-ba6e-5e57e567dc57-kube-api-access-vj8dp\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.365619 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-config-data\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.365640 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-db-sync-config-data\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.391108 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84gbj" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" probeResult="failure" output=< Dec 03 07:08:09 crc kubenswrapper[4947]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 07:08:09 crc kubenswrapper[4947]: > Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.467123 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-combined-ca-bundle\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.467206 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj8dp\" (UniqueName: 
\"kubernetes.io/projected/000a9c13-4796-4ef4-ba6e-5e57e567dc57-kube-api-access-vj8dp\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.467290 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-config-data\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.467317 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-db-sync-config-data\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.473657 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-combined-ca-bundle\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.474485 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-db-sync-config-data\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.479328 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-config-data\") pod \"glance-db-sync-7tpzk\" (UID: 
\"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.483384 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj8dp\" (UniqueName: \"kubernetes.io/projected/000a9c13-4796-4ef4-ba6e-5e57e567dc57-kube-api-access-vj8dp\") pod \"glance-db-sync-7tpzk\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.610404 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:09 crc kubenswrapper[4947]: I1203 07:08:09.776578 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"d45e15ccf604f6a3fb83d5c85740fc19f409cd96d5fbd32aa1553f993c35e019"} Dec 03 07:08:10 crc kubenswrapper[4947]: I1203 07:08:10.239064 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-7tpzk"] Dec 03 07:08:10 crc kubenswrapper[4947]: I1203 07:08:10.792853 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"f49bb955ad69cca770cf0f549937dcdd2e2098f40b0afb681932a0b678968068"} Dec 03 07:08:10 crc kubenswrapper[4947]: I1203 07:08:10.794403 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7tpzk" event={"ID":"000a9c13-4796-4ef4-ba6e-5e57e567dc57","Type":"ContainerStarted","Data":"1bbac8da2a59fbe20c18ba7d206ede392755099bf0186213f299d74cba03ef50"} Dec 03 07:08:11 crc kubenswrapper[4947]: I1203 07:08:11.802329 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"d8b3c8f44f2e233d211d7dc57ffed40c7ab6c7b15d021bbb35a7dbf134eff941"} Dec 03 07:08:11 crc kubenswrapper[4947]: I1203 07:08:11.802746 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"d3eb07996d43bedac1f1d1fa489d736d81b2c8f48876740c85c31fdcd4f49d77"} Dec 03 07:08:11 crc kubenswrapper[4947]: I1203 07:08:11.802768 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"9c72d4dda83bcdd5aa991a2138d85cb2117515fc20f7262bda9d4cf7cbb24de8"} Dec 03 07:08:12 crc kubenswrapper[4947]: I1203 07:08:12.812415 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"d09ae4508992ad6594ab0cf49b18c9dde1d758bb9614d6152f7610d5543b8ba4"} Dec 03 07:08:13 crc kubenswrapper[4947]: I1203 07:08:13.826263 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"d934198eccfc8bc4a5f9d891474951c5c3b59ba02e3e53ad44a55f4895165461"} Dec 03 07:08:13 crc kubenswrapper[4947]: I1203 07:08:13.826324 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"9f55040331ca3d502b5c2088f0718ed298df96d80034f2b8e129b91f1d02388f"} Dec 03 07:08:13 crc kubenswrapper[4947]: I1203 07:08:13.826343 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"e8d2991a67aac964ce17db312462ba4a29fb541e49ec2b41398e5be186932d4d"} Dec 03 07:08:14 crc 
kubenswrapper[4947]: I1203 07:08:14.397020 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84gbj" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" probeResult="failure" output=< Dec 03 07:08:14 crc kubenswrapper[4947]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 07:08:14 crc kubenswrapper[4947]: > Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.443884 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.454441 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.699471 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-84gbj-config-qsxnj"] Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.700656 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.703857 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.707242 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84gbj-config-qsxnj"] Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.838364 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"7579cc3029f2cccd754569c3be548ada24f443eaf1602cc6cdbee7d860630040"} Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.838405 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"66517c20f9bd5a79033d770b7fe6acd20f04d2ccb83413adddf9ce2d92f48b06"} Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.867809 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.867894 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-additional-scripts\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.867931 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-scripts\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.867985 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7ch8\" (UniqueName: \"kubernetes.io/projected/af01dff7-404f-4000-a537-74c62cb2d840-kube-api-access-v7ch8\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.868033 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-log-ovn\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.868062 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run-ovn\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.969855 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run-ovn\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.970196 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.970159 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run-ovn\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.970382 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.970671 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-additional-scripts\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.970747 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-scripts\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.971395 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-additional-scripts\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.971437 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7ch8\" (UniqueName: \"kubernetes.io/projected/af01dff7-404f-4000-a537-74c62cb2d840-kube-api-access-v7ch8\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.971478 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-log-ovn\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.971583 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-log-ovn\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.973006 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-scripts\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:14 crc kubenswrapper[4947]: I1203 07:08:14.998475 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7ch8\" 
(UniqueName: \"kubernetes.io/projected/af01dff7-404f-4000-a537-74c62cb2d840-kube-api-access-v7ch8\") pod \"ovn-controller-84gbj-config-qsxnj\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:15 crc kubenswrapper[4947]: I1203 07:08:15.033016 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:19 crc kubenswrapper[4947]: I1203 07:08:19.386551 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84gbj" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" probeResult="failure" output=< Dec 03 07:08:19 crc kubenswrapper[4947]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 07:08:19 crc kubenswrapper[4947]: > Dec 03 07:08:20 crc kubenswrapper[4947]: I1203 07:08:20.341550 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:08:20 crc kubenswrapper[4947]: I1203 07:08:20.397885 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.039247 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-2qbdx"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.040710 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2qbdx" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.072365 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2qbdx"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.154828 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-mxrh4"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.155874 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.164626 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0e6b-account-create-update-bfb9m"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.165766 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.169810 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.183677 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0e6b-account-create-update-bfb9m"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.209089 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-mxrh4"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.240203 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c40412-9287-4f1a-a221-6f3dc1c2f33b-operator-scripts\") pod \"barbican-db-create-mxrh4\" (UID: \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\") " pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.240261 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blpbj\" (UniqueName: \"kubernetes.io/projected/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-kube-api-access-blpbj\") pod \"cinder-db-create-2qbdx\" (UID: \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\") " pod="openstack/cinder-db-create-2qbdx" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.240283 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/92ba5ee1-269e-4971-895f-2393110c2bcd-operator-scripts\") pod \"cinder-0e6b-account-create-update-bfb9m\" (UID: \"92ba5ee1-269e-4971-895f-2393110c2bcd\") " pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.240453 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-operator-scripts\") pod \"cinder-db-create-2qbdx\" (UID: \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\") " pod="openstack/cinder-db-create-2qbdx" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.240515 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r8nb\" (UniqueName: \"kubernetes.io/projected/30c40412-9287-4f1a-a221-6f3dc1c2f33b-kube-api-access-5r8nb\") pod \"barbican-db-create-mxrh4\" (UID: \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\") " pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.240547 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2xwz\" (UniqueName: \"kubernetes.io/projected/92ba5ee1-269e-4971-895f-2393110c2bcd-kube-api-access-z2xwz\") pod \"cinder-0e6b-account-create-update-bfb9m\" (UID: \"92ba5ee1-269e-4971-895f-2393110c2bcd\") " pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.264550 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0363-account-create-update-hgv5w"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.267274 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.270739 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0363-account-create-update-hgv5w"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.271326 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.341550 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c40412-9287-4f1a-a221-6f3dc1c2f33b-operator-scripts\") pod \"barbican-db-create-mxrh4\" (UID: \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\") " pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.341603 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blpbj\" (UniqueName: \"kubernetes.io/projected/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-kube-api-access-blpbj\") pod \"cinder-db-create-2qbdx\" (UID: \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\") " pod="openstack/cinder-db-create-2qbdx" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.341646 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92ba5ee1-269e-4971-895f-2393110c2bcd-operator-scripts\") pod \"cinder-0e6b-account-create-update-bfb9m\" (UID: \"92ba5ee1-269e-4971-895f-2393110c2bcd\") " pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.342775 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-operator-scripts\") pod \"cinder-db-create-2qbdx\" (UID: \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\") " pod="openstack/cinder-db-create-2qbdx" Dec 
03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.342803 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r8nb\" (UniqueName: \"kubernetes.io/projected/30c40412-9287-4f1a-a221-6f3dc1c2f33b-kube-api-access-5r8nb\") pod \"barbican-db-create-mxrh4\" (UID: \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\") " pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.342824 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2xwz\" (UniqueName: \"kubernetes.io/projected/92ba5ee1-269e-4971-895f-2393110c2bcd-kube-api-access-z2xwz\") pod \"cinder-0e6b-account-create-update-bfb9m\" (UID: \"92ba5ee1-269e-4971-895f-2393110c2bcd\") " pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.343526 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c40412-9287-4f1a-a221-6f3dc1c2f33b-operator-scripts\") pod \"barbican-db-create-mxrh4\" (UID: \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\") " pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.343854 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-operator-scripts\") pod \"cinder-db-create-2qbdx\" (UID: \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\") " pod="openstack/cinder-db-create-2qbdx" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.345381 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92ba5ee1-269e-4971-895f-2393110c2bcd-operator-scripts\") pod \"cinder-0e6b-account-create-update-bfb9m\" (UID: \"92ba5ee1-269e-4971-895f-2393110c2bcd\") " pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:22 crc 
kubenswrapper[4947]: I1203 07:08:22.361794 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r8nb\" (UniqueName: \"kubernetes.io/projected/30c40412-9287-4f1a-a221-6f3dc1c2f33b-kube-api-access-5r8nb\") pod \"barbican-db-create-mxrh4\" (UID: \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\") " pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.362359 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blpbj\" (UniqueName: \"kubernetes.io/projected/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-kube-api-access-blpbj\") pod \"cinder-db-create-2qbdx\" (UID: \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\") " pod="openstack/cinder-db-create-2qbdx" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.373074 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2xwz\" (UniqueName: \"kubernetes.io/projected/92ba5ee1-269e-4971-895f-2393110c2bcd-kube-api-access-z2xwz\") pod \"cinder-0e6b-account-create-update-bfb9m\" (UID: \"92ba5ee1-269e-4971-895f-2393110c2bcd\") " pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.412067 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ms4p8"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.413403 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.418363 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.418619 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r75nw" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.418361 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.423717 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.427797 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ms4p8"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.445720 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcsnx\" (UniqueName: \"kubernetes.io/projected/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-kube-api-access-vcsnx\") pod \"barbican-0363-account-create-update-hgv5w\" (UID: \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\") " pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.445771 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-operator-scripts\") pod \"barbican-0363-account-create-update-hgv5w\" (UID: \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\") " pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.445870 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvc79\" (UniqueName: 
\"kubernetes.io/projected/14e18665-c444-4a96-962a-cecea35695b1-kube-api-access-lvc79\") pod \"keystone-db-sync-ms4p8\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.445899 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-combined-ca-bundle\") pod \"keystone-db-sync-ms4p8\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.445959 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-config-data\") pod \"keystone-db-sync-ms4p8\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.478934 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.510583 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.548091 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-operator-scripts\") pod \"barbican-0363-account-create-update-hgv5w\" (UID: \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\") " pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.548139 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcsnx\" (UniqueName: \"kubernetes.io/projected/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-kube-api-access-vcsnx\") pod \"barbican-0363-account-create-update-hgv5w\" (UID: \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\") " pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.548246 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvc79\" (UniqueName: \"kubernetes.io/projected/14e18665-c444-4a96-962a-cecea35695b1-kube-api-access-lvc79\") pod \"keystone-db-sync-ms4p8\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.548279 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-combined-ca-bundle\") pod \"keystone-db-sync-ms4p8\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.548350 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-config-data\") pod \"keystone-db-sync-ms4p8\" (UID: 
\"14e18665-c444-4a96-962a-cecea35695b1\") " pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.549140 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-operator-scripts\") pod \"barbican-0363-account-create-update-hgv5w\" (UID: \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\") " pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.552232 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-combined-ca-bundle\") pod \"keystone-db-sync-ms4p8\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.565916 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qj625"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.567063 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qj625" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.572459 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvc79\" (UniqueName: \"kubernetes.io/projected/14e18665-c444-4a96-962a-cecea35695b1-kube-api-access-lvc79\") pod \"keystone-db-sync-ms4p8\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.573153 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-config-data\") pod \"keystone-db-sync-ms4p8\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.573212 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-896b-account-create-update-vt5xk"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.574113 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcsnx\" (UniqueName: \"kubernetes.io/projected/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-kube-api-access-vcsnx\") pod \"barbican-0363-account-create-update-hgv5w\" (UID: \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\") " pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.575176 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.579943 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.581505 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qj625"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.591014 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-896b-account-create-update-vt5xk"] Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.621902 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.649747 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-operator-scripts\") pod \"neutron-896b-account-create-update-vt5xk\" (UID: \"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\") " pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.649985 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cm92\" (UniqueName: \"kubernetes.io/projected/0aca2603-7c75-448c-b019-c9403906ac3b-kube-api-access-9cm92\") pod \"neutron-db-create-qj625\" (UID: \"0aca2603-7c75-448c-b019-c9403906ac3b\") " pod="openstack/neutron-db-create-qj625" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.650043 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvw6x\" (UniqueName: \"kubernetes.io/projected/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-kube-api-access-xvw6x\") pod \"neutron-896b-account-create-update-vt5xk\" (UID: 
\"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\") " pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.650071 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aca2603-7c75-448c-b019-c9403906ac3b-operator-scripts\") pod \"neutron-db-create-qj625\" (UID: \"0aca2603-7c75-448c-b019-c9403906ac3b\") " pod="openstack/neutron-db-create-qj625" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.659066 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2qbdx" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.749048 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.751365 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-operator-scripts\") pod \"neutron-896b-account-create-update-vt5xk\" (UID: \"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\") " pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.751417 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cm92\" (UniqueName: \"kubernetes.io/projected/0aca2603-7c75-448c-b019-c9403906ac3b-kube-api-access-9cm92\") pod \"neutron-db-create-qj625\" (UID: \"0aca2603-7c75-448c-b019-c9403906ac3b\") " pod="openstack/neutron-db-create-qj625" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.751464 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvw6x\" (UniqueName: \"kubernetes.io/projected/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-kube-api-access-xvw6x\") pod \"neutron-896b-account-create-update-vt5xk\" (UID: 
\"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\") " pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.751522 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aca2603-7c75-448c-b019-c9403906ac3b-operator-scripts\") pod \"neutron-db-create-qj625\" (UID: \"0aca2603-7c75-448c-b019-c9403906ac3b\") " pod="openstack/neutron-db-create-qj625" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.752205 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aca2603-7c75-448c-b019-c9403906ac3b-operator-scripts\") pod \"neutron-db-create-qj625\" (UID: \"0aca2603-7c75-448c-b019-c9403906ac3b\") " pod="openstack/neutron-db-create-qj625" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.752421 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-operator-scripts\") pod \"neutron-896b-account-create-update-vt5xk\" (UID: \"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\") " pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.770403 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvw6x\" (UniqueName: \"kubernetes.io/projected/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-kube-api-access-xvw6x\") pod \"neutron-896b-account-create-update-vt5xk\" (UID: \"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\") " pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.773927 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cm92\" (UniqueName: \"kubernetes.io/projected/0aca2603-7c75-448c-b019-c9403906ac3b-kube-api-access-9cm92\") pod \"neutron-db-create-qj625\" (UID: 
\"0aca2603-7c75-448c-b019-c9403906ac3b\") " pod="openstack/neutron-db-create-qj625" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.937275 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qj625" Dec 03 07:08:22 crc kubenswrapper[4947]: I1203 07:08:22.944261 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:23 crc kubenswrapper[4947]: E1203 07:08:23.354009 4947 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.196:43306->38.102.83.196:38979: write tcp 38.102.83.196:43306->38.102.83.196:38979: write: broken pipe Dec 03 07:08:23 crc kubenswrapper[4947]: I1203 07:08:23.933875 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"e4a196d70b59a52f0de0490ca863f45fe4efbfa9a908b2cfde82cec941e4b30f"} Dec 03 07:08:23 crc kubenswrapper[4947]: I1203 07:08:23.935030 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"bea9b5a2830e45fb27873f63fdf3c2659562adb3f096b0f77513dea99befbef1"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.049351 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0e6b-account-create-update-bfb9m"] Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.058601 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ms4p8"] Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.076618 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-mxrh4"] Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.087374 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2qbdx"] Dec 03 07:08:24 crc 
kubenswrapper[4947]: I1203 07:08:24.104061 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84gbj-config-qsxnj"] Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.268535 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-896b-account-create-update-vt5xk"] Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.287712 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0363-account-create-update-hgv5w"] Dec 03 07:08:24 crc kubenswrapper[4947]: W1203 07:08:24.288610 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b89c1c2_cea2_4ad2_8594_9d3d1ab240f5.slice/crio-19425ae5de49c8182eefe8978f22c00cdbaa6e2ec401e45c1c478633ea1773e6 WatchSource:0}: Error finding container 19425ae5de49c8182eefe8978f22c00cdbaa6e2ec401e45c1c478633ea1773e6: Status 404 returned error can't find the container with id 19425ae5de49c8182eefe8978f22c00cdbaa6e2ec401e45c1c478633ea1773e6 Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.311482 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qj625"] Dec 03 07:08:24 crc kubenswrapper[4947]: W1203 07:08:24.330795 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0aca2603_7c75_448c_b019_c9403906ac3b.slice/crio-5cb50a0c3982e1ed5024616978a887ed2dfe628da861a6f823571a169aabe398 WatchSource:0}: Error finding container 5cb50a0c3982e1ed5024616978a887ed2dfe628da861a6f823571a169aabe398: Status 404 returned error can't find the container with id 5cb50a0c3982e1ed5024616978a887ed2dfe628da861a6f823571a169aabe398 Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.463583 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84gbj" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" probeResult="failure" 
output=< Dec 03 07:08:24 crc kubenswrapper[4947]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 07:08:24 crc kubenswrapper[4947]: > Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.952595 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7tpzk" event={"ID":"000a9c13-4796-4ef4-ba6e-5e57e567dc57","Type":"ContainerStarted","Data":"12c4f57540cc8159dc9d9868fdd26f7712daf6b1b80c9065470a6becaf4c402b"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.968730 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0363-account-create-update-hgv5w" event={"ID":"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52","Type":"ContainerStarted","Data":"362bc95a1a55a281ea46b642a4e75653da431e70d6b31a1af4b85f03b79ba639"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.968816 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0363-account-create-update-hgv5w" event={"ID":"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52","Type":"ContainerStarted","Data":"5e4e1886b89839307e16d664bfadc91908ff080efbee0b547b48544afd6b83ec"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.978322 4947 generic.go:334] "Generic (PLEG): container finished" podID="6541ef13-a7b0-45f5-886e-ebf6ee0550bb" containerID="7ab390ea5a32098708bae957ee6d754d53e5b9113cc224e30df12f6bde7d7e18" exitCode=0 Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.978393 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2qbdx" event={"ID":"6541ef13-a7b0-45f5-886e-ebf6ee0550bb","Type":"ContainerDied","Data":"7ab390ea5a32098708bae957ee6d754d53e5b9113cc224e30df12f6bde7d7e18"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.978419 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2qbdx" 
event={"ID":"6541ef13-a7b0-45f5-886e-ebf6ee0550bb","Type":"ContainerStarted","Data":"d63d54ea60d3ad8d722b2360074edcbf8523f5eff25fc7b9cae8b65828c90a00"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.979500 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-896b-account-create-update-vt5xk" event={"ID":"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5","Type":"ContainerStarted","Data":"19425ae5de49c8182eefe8978f22c00cdbaa6e2ec401e45c1c478633ea1773e6"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.984831 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-7tpzk" podStartSLOduration=2.923913189 podStartE2EDuration="15.984814241s" podCreationTimestamp="2025-12-03 07:08:09 +0000 UTC" firstStartedPulling="2025-12-03 07:08:10.360794617 +0000 UTC m=+1151.621749043" lastFinishedPulling="2025-12-03 07:08:23.421695669 +0000 UTC m=+1164.682650095" observedRunningTime="2025-12-03 07:08:24.972828818 +0000 UTC m=+1166.233783254" watchObservedRunningTime="2025-12-03 07:08:24.984814241 +0000 UTC m=+1166.245768667" Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.987675 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"f69a782b89a00f114b21b586623c5d8f0e73109af52bd1c6676ae4209fab1573"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.987711 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"d87586511f96368d02332320b8070531b8e4c9823a90eea78876b046f2488da5"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.989597 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj-config-qsxnj" 
event={"ID":"af01dff7-404f-4000-a537-74c62cb2d840","Type":"ContainerStarted","Data":"16255723e7205ad7a536dbf4b69b6b3fcc4882ee357336dbdd9cdf6612ace516"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.989617 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj-config-qsxnj" event={"ID":"af01dff7-404f-4000-a537-74c62cb2d840","Type":"ContainerStarted","Data":"cf124b35289e3602fab891dce7074cca587d621ffc2270f5616b833df318fd77"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.994895 4947 generic.go:334] "Generic (PLEG): container finished" podID="30c40412-9287-4f1a-a221-6f3dc1c2f33b" containerID="eea712e04988a28c6a66e0d143ed1e46cb7f5ad455b6cdf5305180775878b7b0" exitCode=0 Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.995012 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mxrh4" event={"ID":"30c40412-9287-4f1a-a221-6f3dc1c2f33b","Type":"ContainerDied","Data":"eea712e04988a28c6a66e0d143ed1e46cb7f5ad455b6cdf5305180775878b7b0"} Dec 03 07:08:24 crc kubenswrapper[4947]: I1203 07:08:24.995041 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mxrh4" event={"ID":"30c40412-9287-4f1a-a221-6f3dc1c2f33b","Type":"ContainerStarted","Data":"1cf3220c88a88f9affef653fca6f2b97c439205d7713ae3a02507922cbd39dba"} Dec 03 07:08:25 crc kubenswrapper[4947]: I1203 07:08:25.000301 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ms4p8" event={"ID":"14e18665-c444-4a96-962a-cecea35695b1","Type":"ContainerStarted","Data":"8b0208cb1f8770f3632e47fb7a4c4b62be463d558ddccf7d13b2d4ebea3ae081"} Dec 03 07:08:25 crc kubenswrapper[4947]: I1203 07:08:25.002331 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0e6b-account-create-update-bfb9m" event={"ID":"92ba5ee1-269e-4971-895f-2393110c2bcd","Type":"ContainerStarted","Data":"6766d43117c70819e1c6cb4c5fa75c20b20eceebf474e1f3dd284b91392b89bb"} Dec 03 07:08:25 
crc kubenswrapper[4947]: I1203 07:08:25.002356 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0e6b-account-create-update-bfb9m" event={"ID":"92ba5ee1-269e-4971-895f-2393110c2bcd","Type":"ContainerStarted","Data":"07d59be855be1ef6a2b4828dc5d24880a4eb0a42373a58c948a9affd6ac45695"} Dec 03 07:08:25 crc kubenswrapper[4947]: I1203 07:08:25.008276 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qj625" event={"ID":"0aca2603-7c75-448c-b019-c9403906ac3b","Type":"ContainerStarted","Data":"5cb50a0c3982e1ed5024616978a887ed2dfe628da861a6f823571a169aabe398"} Dec 03 07:08:25 crc kubenswrapper[4947]: I1203 07:08:25.018412 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-0363-account-create-update-hgv5w" podStartSLOduration=3.018391146 podStartE2EDuration="3.018391146s" podCreationTimestamp="2025-12-03 07:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:08:24.999772345 +0000 UTC m=+1166.260726771" watchObservedRunningTime="2025-12-03 07:08:25.018391146 +0000 UTC m=+1166.279345572" Dec 03 07:08:25 crc kubenswrapper[4947]: I1203 07:08:25.078399 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-84gbj-config-qsxnj" podStartSLOduration=11.078376964 podStartE2EDuration="11.078376964s" podCreationTimestamp="2025-12-03 07:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:08:25.043684648 +0000 UTC m=+1166.304639074" watchObservedRunningTime="2025-12-03 07:08:25.078376964 +0000 UTC m=+1166.339331390" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.017457 4947 generic.go:334] "Generic (PLEG): container finished" podID="af01dff7-404f-4000-a537-74c62cb2d840" 
containerID="16255723e7205ad7a536dbf4b69b6b3fcc4882ee357336dbdd9cdf6612ace516" exitCode=0 Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.017829 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj-config-qsxnj" event={"ID":"af01dff7-404f-4000-a537-74c62cb2d840","Type":"ContainerDied","Data":"16255723e7205ad7a536dbf4b69b6b3fcc4882ee357336dbdd9cdf6612ace516"} Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.019264 4947 generic.go:334] "Generic (PLEG): container finished" podID="0aca2603-7c75-448c-b019-c9403906ac3b" containerID="5ad3fa21c883f1399362511f39b2a0eba4dd79e18d1bbc686bd886b186157175" exitCode=0 Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.019339 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qj625" event={"ID":"0aca2603-7c75-448c-b019-c9403906ac3b","Type":"ContainerDied","Data":"5ad3fa21c883f1399362511f39b2a0eba4dd79e18d1bbc686bd886b186157175"} Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.020936 4947 generic.go:334] "Generic (PLEG): container finished" podID="b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52" containerID="362bc95a1a55a281ea46b642a4e75653da431e70d6b31a1af4b85f03b79ba639" exitCode=0 Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.021017 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0363-account-create-update-hgv5w" event={"ID":"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52","Type":"ContainerDied","Data":"362bc95a1a55a281ea46b642a4e75653da431e70d6b31a1af4b85f03b79ba639"} Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.022354 4947 generic.go:334] "Generic (PLEG): container finished" podID="92ba5ee1-269e-4971-895f-2393110c2bcd" containerID="6766d43117c70819e1c6cb4c5fa75c20b20eceebf474e1f3dd284b91392b89bb" exitCode=0 Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.022554 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0e6b-account-create-update-bfb9m" 
event={"ID":"92ba5ee1-269e-4971-895f-2393110c2bcd","Type":"ContainerDied","Data":"6766d43117c70819e1c6cb4c5fa75c20b20eceebf474e1f3dd284b91392b89bb"} Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.024351 4947 generic.go:334] "Generic (PLEG): container finished" podID="1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5" containerID="6862eeba03902517d3aa9b7fe73e29b1783f7913945bfd9c6bffd507ee095166" exitCode=0 Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.024408 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-896b-account-create-update-vt5xk" event={"ID":"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5","Type":"ContainerDied","Data":"6862eeba03902517d3aa9b7fe73e29b1783f7913945bfd9c6bffd507ee095166"} Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.042729 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerStarted","Data":"52672ead9b6ba2c835fb5ca4f95054db2a9ce1fa6df424a727875fabb4ce0dbc"} Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.134781 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=45.858998174999996 podStartE2EDuration="51.134757704s" podCreationTimestamp="2025-12-03 07:07:35 +0000 UTC" firstStartedPulling="2025-12-03 07:08:09.057781027 +0000 UTC m=+1150.318735453" lastFinishedPulling="2025-12-03 07:08:14.333540556 +0000 UTC m=+1155.594494982" observedRunningTime="2025-12-03 07:08:26.130419047 +0000 UTC m=+1167.391373473" watchObservedRunningTime="2025-12-03 07:08:26.134757704 +0000 UTC m=+1167.395712130" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.394338 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-g7kq4"] Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.396275 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.400282 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.427676 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-g7kq4"] Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.543773 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-sb\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.543863 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-nb\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.543893 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-svc\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.543945 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-config\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " 
pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.543994 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-swift-storage-0\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.544071 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvbf9\" (UniqueName: \"kubernetes.io/projected/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-kube-api-access-cvbf9\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.645786 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-nb\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.645844 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-svc\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.645882 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-config\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " 
pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.645932 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-swift-storage-0\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.645948 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvbf9\" (UniqueName: \"kubernetes.io/projected/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-kube-api-access-cvbf9\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.645980 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-sb\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.647077 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-sb\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.647119 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-svc\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 
crc kubenswrapper[4947]: I1203 07:08:26.647154 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-nb\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.647724 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-config\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.647761 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-swift-storage-0\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.665679 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvbf9\" (UniqueName: \"kubernetes.io/projected/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-kube-api-access-cvbf9\") pod \"dnsmasq-dns-864b648dc7-g7kq4\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:26 crc kubenswrapper[4947]: I1203 07:08:26.725109 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.400728 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-84gbj" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.490708 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.508289 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qj625" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.535655 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.539566 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2qbdx" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.545099 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.555459 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.573261 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.608041 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aca2603-7c75-448c-b019-c9403906ac3b-operator-scripts\") pod \"0aca2603-7c75-448c-b019-c9403906ac3b\" (UID: \"0aca2603-7c75-448c-b019-c9403906ac3b\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.608242 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-operator-scripts\") pod \"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\" (UID: \"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.608311 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cm92\" (UniqueName: \"kubernetes.io/projected/0aca2603-7c75-448c-b019-c9403906ac3b-kube-api-access-9cm92\") pod \"0aca2603-7c75-448c-b019-c9403906ac3b\" (UID: \"0aca2603-7c75-448c-b019-c9403906ac3b\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.608369 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvw6x\" (UniqueName: \"kubernetes.io/projected/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-kube-api-access-xvw6x\") pod \"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\" (UID: \"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.610613 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5" (UID: "1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.611030 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0aca2603-7c75-448c-b019-c9403906ac3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0aca2603-7c75-448c-b019-c9403906ac3b" (UID: "0aca2603-7c75-448c-b019-c9403906ac3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.623828 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aca2603-7c75-448c-b019-c9403906ac3b-kube-api-access-9cm92" (OuterVolumeSpecName: "kube-api-access-9cm92") pod "0aca2603-7c75-448c-b019-c9403906ac3b" (UID: "0aca2603-7c75-448c-b019-c9403906ac3b"). InnerVolumeSpecName "kube-api-access-9cm92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.625802 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-kube-api-access-xvw6x" (OuterVolumeSpecName: "kube-api-access-xvw6x") pod "1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5" (UID: "1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5"). InnerVolumeSpecName "kube-api-access-xvw6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.709434 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2xwz\" (UniqueName: \"kubernetes.io/projected/92ba5ee1-269e-4971-895f-2393110c2bcd-kube-api-access-z2xwz\") pod \"92ba5ee1-269e-4971-895f-2393110c2bcd\" (UID: \"92ba5ee1-269e-4971-895f-2393110c2bcd\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.709621 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-additional-scripts\") pod \"af01dff7-404f-4000-a537-74c62cb2d840\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.709642 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-operator-scripts\") pod \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\" (UID: \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.710352 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "af01dff7-404f-4000-a537-74c62cb2d840" (UID: "af01dff7-404f-4000-a537-74c62cb2d840"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.710415 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-log-ovn\") pod \"af01dff7-404f-4000-a537-74c62cb2d840\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.710436 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7ch8\" (UniqueName: \"kubernetes.io/projected/af01dff7-404f-4000-a537-74c62cb2d840-kube-api-access-v7ch8\") pod \"af01dff7-404f-4000-a537-74c62cb2d840\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.710412 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52" (UID: "b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.710475 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-scripts\") pod \"af01dff7-404f-4000-a537-74c62cb2d840\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.710475 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "af01dff7-404f-4000-a537-74c62cb2d840" (UID: "af01dff7-404f-4000-a537-74c62cb2d840"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.711170 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-scripts" (OuterVolumeSpecName: "scripts") pod "af01dff7-404f-4000-a537-74c62cb2d840" (UID: "af01dff7-404f-4000-a537-74c62cb2d840"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.711250 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c40412-9287-4f1a-a221-6f3dc1c2f33b-operator-scripts\") pod \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\" (UID: \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.711324 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blpbj\" (UniqueName: \"kubernetes.io/projected/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-kube-api-access-blpbj\") pod \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\" (UID: \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712408 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92ba5ee1-269e-4971-895f-2393110c2bcd-operator-scripts\") pod \"92ba5ee1-269e-4971-895f-2393110c2bcd\" (UID: \"92ba5ee1-269e-4971-895f-2393110c2bcd\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712437 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run\") pod \"af01dff7-404f-4000-a537-74c62cb2d840\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712476 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run-ovn\") pod \"af01dff7-404f-4000-a537-74c62cb2d840\" (UID: \"af01dff7-404f-4000-a537-74c62cb2d840\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712521 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcsnx\" (UniqueName: \"kubernetes.io/projected/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-kube-api-access-vcsnx\") pod \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\" (UID: \"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712550 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r8nb\" (UniqueName: \"kubernetes.io/projected/30c40412-9287-4f1a-a221-6f3dc1c2f33b-kube-api-access-5r8nb\") pod \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\" (UID: \"30c40412-9287-4f1a-a221-6f3dc1c2f33b\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712611 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-operator-scripts\") pod \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\" (UID: \"6541ef13-a7b0-45f5-886e-ebf6ee0550bb\") " Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712215 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ba5ee1-269e-4971-895f-2393110c2bcd-kube-api-access-z2xwz" (OuterVolumeSpecName: "kube-api-access-z2xwz") pod "92ba5ee1-269e-4971-895f-2393110c2bcd" (UID: "92ba5ee1-269e-4971-895f-2393110c2bcd"). InnerVolumeSpecName "kube-api-access-z2xwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712411 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30c40412-9287-4f1a-a221-6f3dc1c2f33b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30c40412-9287-4f1a-a221-6f3dc1c2f33b" (UID: "30c40412-9287-4f1a-a221-6f3dc1c2f33b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712455 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run" (OuterVolumeSpecName: "var-run") pod "af01dff7-404f-4000-a537-74c62cb2d840" (UID: "af01dff7-404f-4000-a537-74c62cb2d840"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712821 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92ba5ee1-269e-4971-895f-2393110c2bcd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92ba5ee1-269e-4971-895f-2393110c2bcd" (UID: "92ba5ee1-269e-4971-895f-2393110c2bcd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.712848 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "af01dff7-404f-4000-a537-74c62cb2d840" (UID: "af01dff7-404f-4000-a537-74c62cb2d840"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713209 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6541ef13-a7b0-45f5-886e-ebf6ee0550bb" (UID: "6541ef13-a7b0-45f5-886e-ebf6ee0550bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713412 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92ba5ee1-269e-4971-895f-2393110c2bcd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713432 4947 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713443 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713453 4947 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713464 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713475 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2xwz\" (UniqueName: 
\"kubernetes.io/projected/92ba5ee1-269e-4971-895f-2393110c2bcd-kube-api-access-z2xwz\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713529 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cm92\" (UniqueName: \"kubernetes.io/projected/0aca2603-7c75-448c-b019-c9403906ac3b-kube-api-access-9cm92\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713542 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvw6x\" (UniqueName: \"kubernetes.io/projected/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5-kube-api-access-xvw6x\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713555 4947 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713572 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713593 4947 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/af01dff7-404f-4000-a537-74c62cb2d840-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713607 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0aca2603-7c75-448c-b019-c9403906ac3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713621 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/af01dff7-404f-4000-a537-74c62cb2d840-scripts\") on node 
\"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.713631 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30c40412-9287-4f1a-a221-6f3dc1c2f33b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.714122 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af01dff7-404f-4000-a537-74c62cb2d840-kube-api-access-v7ch8" (OuterVolumeSpecName: "kube-api-access-v7ch8") pod "af01dff7-404f-4000-a537-74c62cb2d840" (UID: "af01dff7-404f-4000-a537-74c62cb2d840"). InnerVolumeSpecName "kube-api-access-v7ch8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.715832 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30c40412-9287-4f1a-a221-6f3dc1c2f33b-kube-api-access-5r8nb" (OuterVolumeSpecName: "kube-api-access-5r8nb") pod "30c40412-9287-4f1a-a221-6f3dc1c2f33b" (UID: "30c40412-9287-4f1a-a221-6f3dc1c2f33b"). InnerVolumeSpecName "kube-api-access-5r8nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.716420 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-kube-api-access-blpbj" (OuterVolumeSpecName: "kube-api-access-blpbj") pod "6541ef13-a7b0-45f5-886e-ebf6ee0550bb" (UID: "6541ef13-a7b0-45f5-886e-ebf6ee0550bb"). InnerVolumeSpecName "kube-api-access-blpbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.724475 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-kube-api-access-vcsnx" (OuterVolumeSpecName: "kube-api-access-vcsnx") pod "b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52" (UID: "b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52"). InnerVolumeSpecName "kube-api-access-vcsnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.815388 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcsnx\" (UniqueName: \"kubernetes.io/projected/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52-kube-api-access-vcsnx\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.815428 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r8nb\" (UniqueName: \"kubernetes.io/projected/30c40412-9287-4f1a-a221-6f3dc1c2f33b-kube-api-access-5r8nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.815440 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7ch8\" (UniqueName: \"kubernetes.io/projected/af01dff7-404f-4000-a537-74c62cb2d840-kube-api-access-v7ch8\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.815453 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blpbj\" (UniqueName: \"kubernetes.io/projected/6541ef13-a7b0-45f5-886e-ebf6ee0550bb-kube-api-access-blpbj\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:29 crc kubenswrapper[4947]: I1203 07:08:29.847427 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-g7kq4"] Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.079227 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-mxrh4" 
event={"ID":"30c40412-9287-4f1a-a221-6f3dc1c2f33b","Type":"ContainerDied","Data":"1cf3220c88a88f9affef653fca6f2b97c439205d7713ae3a02507922cbd39dba"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.079262 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf3220c88a88f9affef653fca6f2b97c439205d7713ae3a02507922cbd39dba" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.079260 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-mxrh4" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.080824 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qj625" event={"ID":"0aca2603-7c75-448c-b019-c9403906ac3b","Type":"ContainerDied","Data":"5cb50a0c3982e1ed5024616978a887ed2dfe628da861a6f823571a169aabe398"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.080857 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cb50a0c3982e1ed5024616978a887ed2dfe628da861a6f823571a169aabe398" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.080897 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qj625" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.082164 4947 generic.go:334] "Generic (PLEG): container finished" podID="da2cbe42-b6c3-462a-84de-b00fc24d4bd9" containerID="1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35" exitCode=0 Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.082209 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" event={"ID":"da2cbe42-b6c3-462a-84de-b00fc24d4bd9","Type":"ContainerDied","Data":"1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.082225 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" event={"ID":"da2cbe42-b6c3-462a-84de-b00fc24d4bd9","Type":"ContainerStarted","Data":"9368786aa3f88fab2e25a980e6962d81541bad097f937d71f4f3f13caf7299ae"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.084563 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0363-account-create-update-hgv5w" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.084636 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0363-account-create-update-hgv5w" event={"ID":"b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52","Type":"ContainerDied","Data":"5e4e1886b89839307e16d664bfadc91908ff080efbee0b547b48544afd6b83ec"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.085447 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e4e1886b89839307e16d664bfadc91908ff080efbee0b547b48544afd6b83ec" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.104221 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-0e6b-account-create-update-bfb9m" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.104415 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0e6b-account-create-update-bfb9m" event={"ID":"92ba5ee1-269e-4971-895f-2393110c2bcd","Type":"ContainerDied","Data":"07d59be855be1ef6a2b4828dc5d24880a4eb0a42373a58c948a9affd6ac45695"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.104441 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07d59be855be1ef6a2b4828dc5d24880a4eb0a42373a58c948a9affd6ac45695" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.106558 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-896b-account-create-update-vt5xk" event={"ID":"1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5","Type":"ContainerDied","Data":"19425ae5de49c8182eefe8978f22c00cdbaa6e2ec401e45c1c478633ea1773e6"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.106581 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19425ae5de49c8182eefe8978f22c00cdbaa6e2ec401e45c1c478633ea1773e6" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.106651 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-896b-account-create-update-vt5xk" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.119872 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj-config-qsxnj" event={"ID":"af01dff7-404f-4000-a537-74c62cb2d840","Type":"ContainerDied","Data":"cf124b35289e3602fab891dce7074cca587d621ffc2270f5616b833df318fd77"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.119915 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf124b35289e3602fab891dce7074cca587d621ffc2270f5616b833df318fd77" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.119996 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84gbj-config-qsxnj" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.121940 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ms4p8" event={"ID":"14e18665-c444-4a96-962a-cecea35695b1","Type":"ContainerStarted","Data":"9b3c5a6fec243c546a76a99e7c5a8fb0550f8743b0fcd71dd7bf768c01e41571"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.144099 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2qbdx" event={"ID":"6541ef13-a7b0-45f5-886e-ebf6ee0550bb","Type":"ContainerDied","Data":"d63d54ea60d3ad8d722b2360074edcbf8523f5eff25fc7b9cae8b65828c90a00"} Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.144145 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d63d54ea60d3ad8d722b2360074edcbf8523f5eff25fc7b9cae8b65828c90a00" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.144273 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2qbdx" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.152450 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ms4p8" podStartSLOduration=2.882219786 podStartE2EDuration="8.152428214s" podCreationTimestamp="2025-12-03 07:08:22 +0000 UTC" firstStartedPulling="2025-12-03 07:08:24.095547126 +0000 UTC m=+1165.356501552" lastFinishedPulling="2025-12-03 07:08:29.365755564 +0000 UTC m=+1170.626709980" observedRunningTime="2025-12-03 07:08:30.142389223 +0000 UTC m=+1171.403343659" watchObservedRunningTime="2025-12-03 07:08:30.152428214 +0000 UTC m=+1171.413382640" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.729234 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84gbj-config-qsxnj"] Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.741845 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-84gbj-config-qsxnj"] Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.878343 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-84gbj-config-j4tfr"] Dec 03 07:08:30 crc kubenswrapper[4947]: E1203 07:08:30.878880 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0aca2603-7c75-448c-b019-c9403906ac3b" containerName="mariadb-database-create" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.878908 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aca2603-7c75-448c-b019-c9403906ac3b" containerName="mariadb-database-create" Dec 03 07:08:30 crc kubenswrapper[4947]: E1203 07:08:30.878939 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30c40412-9287-4f1a-a221-6f3dc1c2f33b" containerName="mariadb-database-create" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.878952 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="30c40412-9287-4f1a-a221-6f3dc1c2f33b" 
containerName="mariadb-database-create" Dec 03 07:08:30 crc kubenswrapper[4947]: E1203 07:08:30.878985 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af01dff7-404f-4000-a537-74c62cb2d840" containerName="ovn-config" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.878999 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="af01dff7-404f-4000-a537-74c62cb2d840" containerName="ovn-config" Dec 03 07:08:30 crc kubenswrapper[4947]: E1203 07:08:30.879065 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ba5ee1-269e-4971-895f-2393110c2bcd" containerName="mariadb-account-create-update" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879079 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ba5ee1-269e-4971-895f-2393110c2bcd" containerName="mariadb-account-create-update" Dec 03 07:08:30 crc kubenswrapper[4947]: E1203 07:08:30.879107 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52" containerName="mariadb-account-create-update" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879121 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52" containerName="mariadb-account-create-update" Dec 03 07:08:30 crc kubenswrapper[4947]: E1203 07:08:30.879145 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5" containerName="mariadb-account-create-update" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879158 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5" containerName="mariadb-account-create-update" Dec 03 07:08:30 crc kubenswrapper[4947]: E1203 07:08:30.879185 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6541ef13-a7b0-45f5-886e-ebf6ee0550bb" containerName="mariadb-database-create" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879199 4947 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6541ef13-a7b0-45f5-886e-ebf6ee0550bb" containerName="mariadb-database-create" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879523 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="0aca2603-7c75-448c-b019-c9403906ac3b" containerName="mariadb-database-create" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879550 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6541ef13-a7b0-45f5-886e-ebf6ee0550bb" containerName="mariadb-database-create" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879571 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="30c40412-9287-4f1a-a221-6f3dc1c2f33b" containerName="mariadb-database-create" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879594 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ba5ee1-269e-4971-895f-2393110c2bcd" containerName="mariadb-account-create-update" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879612 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52" containerName="mariadb-account-create-update" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879647 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="af01dff7-404f-4000-a537-74c62cb2d840" containerName="ovn-config" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.879670 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5" containerName="mariadb-account-create-update" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.880556 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.884288 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 07:08:30 crc kubenswrapper[4947]: I1203 07:08:30.895243 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84gbj-config-j4tfr"] Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.035123 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-log-ovn\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.035360 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run-ovn\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.035542 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.035594 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-scripts\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") 
" pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.035620 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jngw\" (UniqueName: \"kubernetes.io/projected/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-kube-api-access-2jngw\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.035674 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-additional-scripts\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.097242 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af01dff7-404f-4000-a537-74c62cb2d840" path="/var/lib/kubelet/pods/af01dff7-404f-4000-a537-74c62cb2d840/volumes" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.138050 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run-ovn\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.138209 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 
07:08:31.138295 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-scripts\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.138327 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jngw\" (UniqueName: \"kubernetes.io/projected/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-kube-api-access-2jngw\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.138363 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.138363 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run-ovn\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.139417 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-additional-scripts\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.138414 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-additional-scripts\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.139643 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-log-ovn\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.139805 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-log-ovn\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.140485 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-scripts\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.156210 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" event={"ID":"da2cbe42-b6c3-462a-84de-b00fc24d4bd9","Type":"ContainerStarted","Data":"38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c"} Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.157101 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 
07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.160569 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jngw\" (UniqueName: \"kubernetes.io/projected/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-kube-api-access-2jngw\") pod \"ovn-controller-84gbj-config-j4tfr\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.184333 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" podStartSLOduration=5.184314794 podStartE2EDuration="5.184314794s" podCreationTimestamp="2025-12-03 07:08:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:08:31.173986755 +0000 UTC m=+1172.434941201" watchObservedRunningTime="2025-12-03 07:08:31.184314794 +0000 UTC m=+1172.445269230" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.204310 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:31 crc kubenswrapper[4947]: I1203 07:08:31.743758 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-84gbj-config-j4tfr"] Dec 03 07:08:32 crc kubenswrapper[4947]: I1203 07:08:32.165455 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj-config-j4tfr" event={"ID":"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0","Type":"ContainerStarted","Data":"dd6e54e39769bbb4c64e6c86b56de36b4789c1abff8e87123a2fbcd9dd179bd1"} Dec 03 07:08:32 crc kubenswrapper[4947]: I1203 07:08:32.165892 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj-config-j4tfr" event={"ID":"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0","Type":"ContainerStarted","Data":"8125fdf975a8b5e63d4b506f2814c6ecab3b7bcd086574e53486944a5ac14aed"} Dec 03 07:08:33 crc kubenswrapper[4947]: I1203 07:08:33.173966 4947 generic.go:334] "Generic (PLEG): container finished" podID="a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" containerID="dd6e54e39769bbb4c64e6c86b56de36b4789c1abff8e87123a2fbcd9dd179bd1" exitCode=0 Dec 03 07:08:33 crc kubenswrapper[4947]: I1203 07:08:33.174071 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj-config-j4tfr" event={"ID":"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0","Type":"ContainerDied","Data":"dd6e54e39769bbb4c64e6c86b56de36b4789c1abff8e87123a2fbcd9dd179bd1"} Dec 03 07:08:33 crc kubenswrapper[4947]: I1203 07:08:33.176882 4947 generic.go:334] "Generic (PLEG): container finished" podID="14e18665-c444-4a96-962a-cecea35695b1" containerID="9b3c5a6fec243c546a76a99e7c5a8fb0550f8743b0fcd71dd7bf768c01e41571" exitCode=0 Dec 03 07:08:33 crc kubenswrapper[4947]: I1203 07:08:33.176903 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ms4p8" 
event={"ID":"14e18665-c444-4a96-962a-cecea35695b1","Type":"ContainerDied","Data":"9b3c5a6fec243c546a76a99e7c5a8fb0550f8743b0fcd71dd7bf768c01e41571"} Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.611470 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.616253 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.702857 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-additional-scripts\") pod \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.703236 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run\") pod \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.703442 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-log-ovn\") pod \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.703555 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run-ovn\") pod \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.703650 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jngw\" (UniqueName: \"kubernetes.io/projected/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-kube-api-access-2jngw\") pod \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.703299 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run" (OuterVolumeSpecName: "var-run") pod "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" (UID: "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.703470 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" (UID: "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.703581 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" (UID: "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.703903 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-scripts\") pod \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\" (UID: \"a7414a3e-15ca-4e17-84b5-c8b1a68b66e0\") " Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.704041 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" (UID: "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.704333 4947 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.704403 4947 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.704466 4947 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.704546 4947 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.704892 4947 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-scripts" (OuterVolumeSpecName: "scripts") pod "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" (UID: "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.709622 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-kube-api-access-2jngw" (OuterVolumeSpecName: "kube-api-access-2jngw") pod "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" (UID: "a7414a3e-15ca-4e17-84b5-c8b1a68b66e0"). InnerVolumeSpecName "kube-api-access-2jngw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.805828 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-combined-ca-bundle\") pod \"14e18665-c444-4a96-962a-cecea35695b1\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.805910 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-config-data\") pod \"14e18665-c444-4a96-962a-cecea35695b1\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.805998 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvc79\" (UniqueName: \"kubernetes.io/projected/14e18665-c444-4a96-962a-cecea35695b1-kube-api-access-lvc79\") pod \"14e18665-c444-4a96-962a-cecea35695b1\" (UID: \"14e18665-c444-4a96-962a-cecea35695b1\") " Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.806343 4947 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-2jngw\" (UniqueName: \"kubernetes.io/projected/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-kube-api-access-2jngw\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.806358 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.814827 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84gbj-config-j4tfr"] Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.818773 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14e18665-c444-4a96-962a-cecea35695b1-kube-api-access-lvc79" (OuterVolumeSpecName: "kube-api-access-lvc79") pod "14e18665-c444-4a96-962a-cecea35695b1" (UID: "14e18665-c444-4a96-962a-cecea35695b1"). InnerVolumeSpecName "kube-api-access-lvc79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.823230 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-84gbj-config-j4tfr"] Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.828250 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14e18665-c444-4a96-962a-cecea35695b1" (UID: "14e18665-c444-4a96-962a-cecea35695b1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.863528 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-config-data" (OuterVolumeSpecName: "config-data") pod "14e18665-c444-4a96-962a-cecea35695b1" (UID: "14e18665-c444-4a96-962a-cecea35695b1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.908324 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.908360 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14e18665-c444-4a96-962a-cecea35695b1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:34 crc kubenswrapper[4947]: I1203 07:08:34.908375 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvc79\" (UniqueName: \"kubernetes.io/projected/14e18665-c444-4a96-962a-cecea35695b1-kube-api-access-lvc79\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.095868 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" path="/var/lib/kubelet/pods/a7414a3e-15ca-4e17-84b5-c8b1a68b66e0/volumes" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.200249 4947 scope.go:117] "RemoveContainer" containerID="dd6e54e39769bbb4c64e6c86b56de36b4789c1abff8e87123a2fbcd9dd179bd1" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.200269 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-84gbj-config-j4tfr" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.203356 4947 generic.go:334] "Generic (PLEG): container finished" podID="000a9c13-4796-4ef4-ba6e-5e57e567dc57" containerID="12c4f57540cc8159dc9d9868fdd26f7712daf6b1b80c9065470a6becaf4c402b" exitCode=0 Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.203443 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7tpzk" event={"ID":"000a9c13-4796-4ef4-ba6e-5e57e567dc57","Type":"ContainerDied","Data":"12c4f57540cc8159dc9d9868fdd26f7712daf6b1b80c9065470a6becaf4c402b"} Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.207089 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ms4p8" event={"ID":"14e18665-c444-4a96-962a-cecea35695b1","Type":"ContainerDied","Data":"8b0208cb1f8770f3632e47fb7a4c4b62be463d558ddccf7d13b2d4ebea3ae081"} Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.207164 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b0208cb1f8770f3632e47fb7a4c4b62be463d558ddccf7d13b2d4ebea3ae081" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.207239 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ms4p8" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.477600 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-g7kq4"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.478238 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" podUID="da2cbe42-b6c3-462a-84de-b00fc24d4bd9" containerName="dnsmasq-dns" containerID="cri-o://38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c" gracePeriod=10 Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.483557 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.543767 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-nfp5r"] Dec 03 07:08:35 crc kubenswrapper[4947]: E1203 07:08:35.544161 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e18665-c444-4a96-962a-cecea35695b1" containerName="keystone-db-sync" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.544173 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e18665-c444-4a96-962a-cecea35695b1" containerName="keystone-db-sync" Dec 03 07:08:35 crc kubenswrapper[4947]: E1203 07:08:35.544189 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" containerName="ovn-config" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.544195 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" containerName="ovn-config" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.544382 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e18665-c444-4a96-962a-cecea35695b1" containerName="keystone-db-sync" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.544404 4947 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a7414a3e-15ca-4e17-84b5-c8b1a68b66e0" containerName="ovn-config" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.544985 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.548291 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.548474 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.548617 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.548991 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r75nw" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.549128 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.552618 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qjg8q"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.553903 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.574721 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nfp5r"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.600290 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qjg8q"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.684216 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6bhnb"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.689048 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.693709 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.694398 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nxr4p" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.694645 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.700938 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6bhnb"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.729900 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-scripts\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.729987 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-credential-keys\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730024 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-sb\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730054 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxpm2\" (UniqueName: \"kubernetes.io/projected/f86315fb-2855-460d-b1bb-184f54fa4c26-kube-api-access-fxpm2\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730087 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-fernet-keys\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730152 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-config\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730193 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-combined-ca-bundle\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730224 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-svc\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730259 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-config-data\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730285 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-swift-storage-0\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730327 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-nb\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.730361 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-g5lpn\" (UniqueName: \"kubernetes.io/projected/54dfa127-2209-41a5-94b2-ee9db0005e00-kube-api-access-g5lpn\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.794965 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.796925 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.815840 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.816053 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.829352 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831584 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-nb\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831621 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5lpn\" (UniqueName: \"kubernetes.io/projected/54dfa127-2209-41a5-94b2-ee9db0005e00-kube-api-access-g5lpn\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831653 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-scripts\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831676 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdbpz\" (UniqueName: \"kubernetes.io/projected/af29daa3-e143-4ea0-bfe0-284fd103f8b3-kube-api-access-xdbpz\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831705 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-credential-keys\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831724 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af29daa3-e143-4ea0-bfe0-284fd103f8b3-etc-machine-id\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831745 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-sb\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831762 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-combined-ca-bundle\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831779 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-db-sync-config-data\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831797 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxpm2\" (UniqueName: \"kubernetes.io/projected/f86315fb-2855-460d-b1bb-184f54fa4c26-kube-api-access-fxpm2\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831820 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-fernet-keys\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831854 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-config\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831871 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-scripts\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831888 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-config-data\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831914 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-combined-ca-bundle\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831936 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-svc\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831953 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-config-data\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.831973 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.832302 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-nb\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.832712 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-swift-storage-0\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.833810 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-sb\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.834388 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-config\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.834613 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-svc\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " 
pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.843200 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-config-data\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.843941 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-combined-ca-bundle\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.845199 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-scripts\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.845588 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-fernet-keys\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.847455 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-credential-keys\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.867397 4947 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-g5lpn\" (UniqueName: \"kubernetes.io/projected/54dfa127-2209-41a5-94b2-ee9db0005e00-kube-api-access-g5lpn\") pod \"keystone-bootstrap-nfp5r\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.874425 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.879346 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxpm2\" (UniqueName: \"kubernetes.io/projected/f86315fb-2855-460d-b1bb-184f54fa4c26-kube-api-access-fxpm2\") pod \"dnsmasq-dns-5678f567b5-qjg8q\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.880518 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-gl6zw"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.889943 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.891390 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.900374 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.901104 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.901211 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5z2lj" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.921648 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gl6zw"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934325 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-log-httpd\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934406 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-config-data\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934430 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdbpz\" (UniqueName: \"kubernetes.io/projected/af29daa3-e143-4ea0-bfe0-284fd103f8b3-kube-api-access-xdbpz\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934457 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbsbn\" (UniqueName: \"kubernetes.io/projected/49777040-6a13-4610-a79c-6bc76d73212e-kube-api-access-lbsbn\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934473 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af29daa3-e143-4ea0-bfe0-284fd103f8b3-etc-machine-id\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934505 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-combined-ca-bundle\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934520 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-db-sync-config-data\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934551 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934577 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-scripts\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934602 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-scripts\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934619 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-config-data\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934634 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-run-httpd\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934656 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.934954 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af29daa3-e143-4ea0-bfe0-284fd103f8b3-etc-machine-id\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " 
pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.939050 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-scripts\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.940680 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-config-data\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.942599 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qjg8q"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.955272 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-db-sync-config-data\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.957983 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-combined-ca-bundle\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.958064 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-nv68l"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.965343 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdbpz\" (UniqueName: 
\"kubernetes.io/projected/af29daa3-e143-4ea0-bfe0-284fd103f8b3-kube-api-access-xdbpz\") pod \"cinder-db-sync-6bhnb\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.967283 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nv68l"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.967385 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.971852 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rvvfq" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.972413 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.979008 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-7bxn2"] Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.980098 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.993801 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cmcq8" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.993968 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.994006 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 07:08:35 crc kubenswrapper[4947]: I1203 07:08:35.995091 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7bxn2"] Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.006882 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-8x92t"] Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.008275 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.016365 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.019038 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-8x92t"] Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038672 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-config-data\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038707 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-config-data\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038757 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbsbn\" (UniqueName: \"kubernetes.io/projected/49777040-6a13-4610-a79c-6bc76d73212e-kube-api-access-lbsbn\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038814 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5883886-a832-4366-be78-449b4559e8d2-logs\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038841 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038866 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-scripts\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038890 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-scripts\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038921 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-run-httpd\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038936 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-combined-ca-bundle\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038965 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.038996 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-log-httpd\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.039027 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b64t\" (UniqueName: \"kubernetes.io/projected/e5883886-a832-4366-be78-449b4559e8d2-kube-api-access-4b64t\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.040688 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-run-httpd\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.040801 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-log-httpd\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.043499 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.063123 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.063766 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-scripts\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.075255 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-config-data\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.075269 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbsbn\" (UniqueName: \"kubernetes.io/projected/49777040-6a13-4610-a79c-6bc76d73212e-kube-api-access-lbsbn\") pod \"ceilometer-0\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") " pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.110903 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.148408 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj5g6\" (UniqueName: \"kubernetes.io/projected/7c810f8e-8312-44fb-8de2-63f66e6efbb3-kube-api-access-dj5g6\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.148448 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntzj\" (UniqueName: \"kubernetes.io/projected/a2da706f-4332-49a8-9c85-2d90186bd708-kube-api-access-xntzj\") pod \"neutron-db-sync-7bxn2\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.148682 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b64t\" (UniqueName: \"kubernetes.io/projected/e5883886-a832-4366-be78-449b4559e8d2-kube-api-access-4b64t\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.148721 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-nb\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.148763 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bqcd\" (UniqueName: \"kubernetes.io/projected/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-kube-api-access-9bqcd\") pod 
\"barbican-db-sync-nv68l\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") " pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.148818 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-swift-storage-0\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.148864 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-combined-ca-bundle\") pod \"barbican-db-sync-nv68l\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") " pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.148921 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-config-data\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.148947 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-svc\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.149015 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-sb\") pod 
\"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.149065 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5883886-a832-4366-be78-449b4559e8d2-logs\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.149121 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-db-sync-config-data\") pod \"barbican-db-sync-nv68l\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") " pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.149193 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-scripts\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.149236 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-config\") pod \"neutron-db-sync-7bxn2\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.149255 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-config\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " 
pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.149313 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-combined-ca-bundle\") pod \"neutron-db-sync-7bxn2\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.149343 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-combined-ca-bundle\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.150309 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5883886-a832-4366-be78-449b4559e8d2-logs\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.163511 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-config-data\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.167808 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b64t\" (UniqueName: \"kubernetes.io/projected/e5883886-a832-4366-be78-449b4559e8d2-kube-api-access-4b64t\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.167828 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-scripts\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.171228 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-combined-ca-bundle\") pod \"placement-db-sync-gl6zw\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.194817 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.237562 4947 generic.go:334] "Generic (PLEG): container finished" podID="da2cbe42-b6c3-462a-84de-b00fc24d4bd9" containerID="38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c" exitCode=0 Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.237997 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.238581 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" event={"ID":"da2cbe42-b6c3-462a-84de-b00fc24d4bd9","Type":"ContainerDied","Data":"38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c"} Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.238610 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864b648dc7-g7kq4" event={"ID":"da2cbe42-b6c3-462a-84de-b00fc24d4bd9","Type":"ContainerDied","Data":"9368786aa3f88fab2e25a980e6962d81541bad097f937d71f4f3f13caf7299ae"} Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.238628 4947 scope.go:117] "RemoveContainer" containerID="38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.251310 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bqcd\" (UniqueName: \"kubernetes.io/projected/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-kube-api-access-9bqcd\") pod \"barbican-db-sync-nv68l\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") " pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.251404 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-swift-storage-0\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.251621 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-combined-ca-bundle\") pod \"barbican-db-sync-nv68l\" (UID: 
\"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") " pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.251676 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-svc\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.251732 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-sb\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.253016 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-db-sync-config-data\") pod \"barbican-db-sync-nv68l\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") " pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.253053 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-config\") pod \"neutron-db-sync-7bxn2\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.253072 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-config\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc 
kubenswrapper[4947]: I1203 07:08:36.253099 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-combined-ca-bundle\") pod \"neutron-db-sync-7bxn2\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.253162 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj5g6\" (UniqueName: \"kubernetes.io/projected/7c810f8e-8312-44fb-8de2-63f66e6efbb3-kube-api-access-dj5g6\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.253175 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xntzj\" (UniqueName: \"kubernetes.io/projected/a2da706f-4332-49a8-9c85-2d90186bd708-kube-api-access-xntzj\") pod \"neutron-db-sync-7bxn2\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.253219 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-nb\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.255039 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-swift-storage-0\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.262660 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-config\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.267421 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-svc\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.268408 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-sb\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.303401 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gl6zw" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.309157 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-nb\") pod \"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.309413 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-combined-ca-bundle\") pod \"neutron-db-sync-7bxn2\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.310679 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-config\") pod \"neutron-db-sync-7bxn2\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.318671 4947 scope.go:117] "RemoveContainer" containerID="1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.320005 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-db-sync-config-data\") pod \"barbican-db-sync-nv68l\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") " pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.327173 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj5g6\" (UniqueName: \"kubernetes.io/projected/7c810f8e-8312-44fb-8de2-63f66e6efbb3-kube-api-access-dj5g6\") pod 
\"dnsmasq-dns-74cd4f877c-8x92t\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.328038 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bqcd\" (UniqueName: \"kubernetes.io/projected/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-kube-api-access-9bqcd\") pod \"barbican-db-sync-nv68l\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") " pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.329947 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-combined-ca-bundle\") pod \"barbican-db-sync-nv68l\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") " pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.332762 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntzj\" (UniqueName: \"kubernetes.io/projected/a2da706f-4332-49a8-9c85-2d90186bd708-kube-api-access-xntzj\") pod \"neutron-db-sync-7bxn2\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.336356 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-nv68l" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.354134 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-nb\") pod \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.354293 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-svc\") pod \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.354328 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-config\") pod \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.354386 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-sb\") pod \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.354517 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvbf9\" (UniqueName: \"kubernetes.io/projected/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-kube-api-access-cvbf9\") pod \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.354549 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-swift-storage-0\") pod \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\" (UID: \"da2cbe42-b6c3-462a-84de-b00fc24d4bd9\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.385919 4947 scope.go:117] "RemoveContainer" containerID="38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c" Dec 03 07:08:36 crc kubenswrapper[4947]: E1203 07:08:36.386416 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c\": container with ID starting with 38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c not found: ID does not exist" containerID="38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.386452 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c"} err="failed to get container status \"38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c\": rpc error: code = NotFound desc = could not find container \"38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c\": container with ID starting with 38261bb3d3c9d9b3bd91e4934c240f51380bf617f7bd3e64702a6c5f2ffc4b3c not found: ID does not exist" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.386481 4947 scope.go:117] "RemoveContainer" containerID="1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.386637 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-kube-api-access-cvbf9" (OuterVolumeSpecName: "kube-api-access-cvbf9") pod "da2cbe42-b6c3-462a-84de-b00fc24d4bd9" (UID: "da2cbe42-b6c3-462a-84de-b00fc24d4bd9"). 
InnerVolumeSpecName "kube-api-access-cvbf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:36 crc kubenswrapper[4947]: E1203 07:08:36.386774 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35\": container with ID starting with 1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35 not found: ID does not exist" containerID="1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.386802 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35"} err="failed to get container status \"1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35\": rpc error: code = NotFound desc = could not find container \"1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35\": container with ID starting with 1fcd2394194886f71c7caa04a9fc06df43901eb75d345bb23d5d0e9c19b05e35 not found: ID does not exist" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.396660 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.412857 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.417012 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-config" (OuterVolumeSpecName: "config") pod "da2cbe42-b6c3-462a-84de-b00fc24d4bd9" (UID: "da2cbe42-b6c3-462a-84de-b00fc24d4bd9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.425410 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da2cbe42-b6c3-462a-84de-b00fc24d4bd9" (UID: "da2cbe42-b6c3-462a-84de-b00fc24d4bd9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.449411 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da2cbe42-b6c3-462a-84de-b00fc24d4bd9" (UID: "da2cbe42-b6c3-462a-84de-b00fc24d4bd9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.449979 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da2cbe42-b6c3-462a-84de-b00fc24d4bd9" (UID: "da2cbe42-b6c3-462a-84de-b00fc24d4bd9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.456590 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.457133 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.457207 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.457263 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvbf9\" (UniqueName: \"kubernetes.io/projected/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-kube-api-access-cvbf9\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.457323 4947 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.457157 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da2cbe42-b6c3-462a-84de-b00fc24d4bd9" (UID: "da2cbe42-b6c3-462a-84de-b00fc24d4bd9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.563482 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da2cbe42-b6c3-462a-84de-b00fc24d4bd9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.632585 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qjg8q"] Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.643151 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-g7kq4"] Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.648830 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864b648dc7-g7kq4"] Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.658585 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-nfp5r"] Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.704841 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6bhnb"] Dec 03 07:08:36 crc kubenswrapper[4947]: W1203 07:08:36.751752 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54dfa127_2209_41a5_94b2_ee9db0005e00.slice/crio-a9038c7de05352b66ff8db21a2dc218eededec1bb577425e7180a6d9a70ac09d WatchSource:0}: Error finding container a9038c7de05352b66ff8db21a2dc218eededec1bb577425e7180a6d9a70ac09d: Status 404 returned error can't find the container with id a9038c7de05352b66ff8db21a2dc218eededec1bb577425e7180a6d9a70ac09d Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.847004 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.860900 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.970959 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-combined-ca-bundle\") pod \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.971065 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-config-data\") pod \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.971087 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj8dp\" (UniqueName: \"kubernetes.io/projected/000a9c13-4796-4ef4-ba6e-5e57e567dc57-kube-api-access-vj8dp\") pod \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.971191 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-db-sync-config-data\") pod \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\" (UID: \"000a9c13-4796-4ef4-ba6e-5e57e567dc57\") " Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.978953 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/000a9c13-4796-4ef4-ba6e-5e57e567dc57-kube-api-access-vj8dp" (OuterVolumeSpecName: "kube-api-access-vj8dp") pod "000a9c13-4796-4ef4-ba6e-5e57e567dc57" (UID: 
"000a9c13-4796-4ef4-ba6e-5e57e567dc57"). InnerVolumeSpecName "kube-api-access-vj8dp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:36 crc kubenswrapper[4947]: I1203 07:08:36.982371 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "000a9c13-4796-4ef4-ba6e-5e57e567dc57" (UID: "000a9c13-4796-4ef4-ba6e-5e57e567dc57"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:36 crc kubenswrapper[4947]: E1203 07:08:36.996788 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda2cbe42_b6c3_462a_84de_b00fc24d4bd9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda2cbe42_b6c3_462a_84de_b00fc24d4bd9.slice/crio-9368786aa3f88fab2e25a980e6962d81541bad097f937d71f4f3f13caf7299ae\": RecentStats: unable to find data in memory cache]" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.003651 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "000a9c13-4796-4ef4-ba6e-5e57e567dc57" (UID: "000a9c13-4796-4ef4-ba6e-5e57e567dc57"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.023479 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gl6zw"] Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.057714 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-config-data" (OuterVolumeSpecName: "config-data") pod "000a9c13-4796-4ef4-ba6e-5e57e567dc57" (UID: "000a9c13-4796-4ef4-ba6e-5e57e567dc57"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.074024 4947 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.074062 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.074074 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/000a9c13-4796-4ef4-ba6e-5e57e567dc57-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.074086 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj8dp\" (UniqueName: \"kubernetes.io/projected/000a9c13-4796-4ef4-ba6e-5e57e567dc57-kube-api-access-vj8dp\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.101807 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da2cbe42-b6c3-462a-84de-b00fc24d4bd9" path="/var/lib/kubelet/pods/da2cbe42-b6c3-462a-84de-b00fc24d4bd9/volumes" Dec 03 07:08:37 crc 
kubenswrapper[4947]: I1203 07:08:37.122839 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-7bxn2"] Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.135618 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nv68l"] Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.143850 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-8x92t"] Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.257281 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" event={"ID":"7c810f8e-8312-44fb-8de2-63f66e6efbb3","Type":"ContainerStarted","Data":"ffdc61138eebd5d42c83d1a2472f5d947ea6c5899f7d31712594cbb3a1c87999"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.258465 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nv68l" event={"ID":"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c","Type":"ContainerStarted","Data":"74821503fb3da8c0535a3bda45eba773af832e1a2416c5fd140798aaadfdf8aa"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.260266 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" event={"ID":"f86315fb-2855-460d-b1bb-184f54fa4c26","Type":"ContainerStarted","Data":"bb816660d345a03985fcad7f33dbeccf0329c265285ea9fe626aa4d7c58e5f81"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.260353 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" event={"ID":"f86315fb-2855-460d-b1bb-184f54fa4c26","Type":"ContainerStarted","Data":"a2438f1e6773ddf54d9a02463a86ab58715591c101cffecefb4c7f0671ed6532"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.261411 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7bxn2" 
event={"ID":"a2da706f-4332-49a8-9c85-2d90186bd708","Type":"ContainerStarted","Data":"9456371e2961855296de158d6690d8ecc7a90b9e73e94e866585c608eb4a30d4"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.265045 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-7tpzk" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.266843 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-7tpzk" event={"ID":"000a9c13-4796-4ef4-ba6e-5e57e567dc57","Type":"ContainerDied","Data":"1bbac8da2a59fbe20c18ba7d206ede392755099bf0186213f299d74cba03ef50"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.266885 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbac8da2a59fbe20c18ba7d206ede392755099bf0186213f299d74cba03ef50" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.267028 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6bhnb" event={"ID":"af29daa3-e143-4ea0-bfe0-284fd103f8b3","Type":"ContainerStarted","Data":"bba5f5a73e19365681938bbd4e092d8cdc0204f119ec19a609fedc4b6ea5ac19"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.269160 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gl6zw" event={"ID":"e5883886-a832-4366-be78-449b4559e8d2","Type":"ContainerStarted","Data":"1c34eff082ecac97e7577ef5b2515e4828c3c5fd9f384a9f9969429481ca3ad6"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.270647 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfp5r" event={"ID":"54dfa127-2209-41a5-94b2-ee9db0005e00","Type":"ContainerStarted","Data":"a78540289946534a350ef445e5a1ab10a4c50c6691c8ab7e7bd31dc482d3b51d"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.270668 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfp5r" 
event={"ID":"54dfa127-2209-41a5-94b2-ee9db0005e00","Type":"ContainerStarted","Data":"a9038c7de05352b66ff8db21a2dc218eededec1bb577425e7180a6d9a70ac09d"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.274573 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerStarted","Data":"2979b22cc72ca2acc4aaa550f995388569d6f735dbc490ac2820f45d4ed4d67b"} Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.310920 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-nfp5r" podStartSLOduration=2.310901921 podStartE2EDuration="2.310901921s" podCreationTimestamp="2025-12-03 07:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:08:37.291719264 +0000 UTC m=+1178.552673690" watchObservedRunningTime="2025-12-03 07:08:37.310901921 +0000 UTC m=+1178.571856347" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.599425 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-8x92t"] Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.633341 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-4vsbx"] Dec 03 07:08:37 crc kubenswrapper[4947]: E1203 07:08:37.633689 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2cbe42-b6c3-462a-84de-b00fc24d4bd9" containerName="init" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.633700 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2cbe42-b6c3-462a-84de-b00fc24d4bd9" containerName="init" Dec 03 07:08:37 crc kubenswrapper[4947]: E1203 07:08:37.633723 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2cbe42-b6c3-462a-84de-b00fc24d4bd9" containerName="dnsmasq-dns" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.633729 4947 
state_mem.go:107] "Deleted CPUSet assignment" podUID="da2cbe42-b6c3-462a-84de-b00fc24d4bd9" containerName="dnsmasq-dns" Dec 03 07:08:37 crc kubenswrapper[4947]: E1203 07:08:37.633750 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="000a9c13-4796-4ef4-ba6e-5e57e567dc57" containerName="glance-db-sync" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.633756 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="000a9c13-4796-4ef4-ba6e-5e57e567dc57" containerName="glance-db-sync" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.633918 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2cbe42-b6c3-462a-84de-b00fc24d4bd9" containerName="dnsmasq-dns" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.633937 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="000a9c13-4796-4ef4-ba6e-5e57e567dc57" containerName="glance-db-sync" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.634981 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.657127 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-4vsbx"] Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.694795 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-nb\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.694842 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-config\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.694870 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsxjs\" (UniqueName: \"kubernetes.io/projected/f44a9445-7982-4d53-aadc-02677a421f34-kube-api-access-dsxjs\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.694978 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-swift-storage-0\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.695036 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-svc\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.695097 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-sb\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.797030 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-nb\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.798294 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-config\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.798437 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsxjs\" (UniqueName: \"kubernetes.io/projected/f44a9445-7982-4d53-aadc-02677a421f34-kube-api-access-dsxjs\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.798477 4947 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-swift-storage-0\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.798542 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-svc\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.798026 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-nb\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.798592 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-sb\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.799011 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-config\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.799350 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-svc\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.799864 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-swift-storage-0\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.800543 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-sb\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.816274 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsxjs\" (UniqueName: \"kubernetes.io/projected/f44a9445-7982-4d53-aadc-02677a421f34-kube-api-access-dsxjs\") pod \"dnsmasq-dns-74fd8b655f-4vsbx\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:37 crc kubenswrapper[4947]: I1203 07:08:37.953048 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.289157 4947 generic.go:334] "Generic (PLEG): container finished" podID="f86315fb-2855-460d-b1bb-184f54fa4c26" containerID="bb816660d345a03985fcad7f33dbeccf0329c265285ea9fe626aa4d7c58e5f81" exitCode=0 Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.289331 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" event={"ID":"f86315fb-2855-460d-b1bb-184f54fa4c26","Type":"ContainerDied","Data":"bb816660d345a03985fcad7f33dbeccf0329c265285ea9fe626aa4d7c58e5f81"} Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.291556 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7bxn2" event={"ID":"a2da706f-4332-49a8-9c85-2d90186bd708","Type":"ContainerStarted","Data":"ff60d3a64764b7b0a8521fbe7ced724e9b3c275421d8e741d28b55a0fee93a3a"} Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.300998 4947 generic.go:334] "Generic (PLEG): container finished" podID="7c810f8e-8312-44fb-8de2-63f66e6efbb3" containerID="8b3e39987177a1ed71c2704c69c93a2a0d6678a2e79c823fc7794b72400599c2" exitCode=0 Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.301182 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" event={"ID":"7c810f8e-8312-44fb-8de2-63f66e6efbb3","Type":"ContainerDied","Data":"8b3e39987177a1ed71c2704c69c93a2a0d6678a2e79c823fc7794b72400599c2"} Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.347839 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-7bxn2" podStartSLOduration=3.347816156 podStartE2EDuration="3.347816156s" podCreationTimestamp="2025-12-03 07:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:08:38.344077745 +0000 UTC m=+1179.605032171" 
watchObservedRunningTime="2025-12-03 07:08:38.347816156 +0000 UTC m=+1179.608770582" Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.468075 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-4vsbx"] Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.620421 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.631122 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.636902 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.637041 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cgkdm" Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.637143 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 07:08:38 crc kubenswrapper[4947]: I1203 07:08:38.645732 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.731176 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.731232 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.731279 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzpmd\" (UniqueName: \"kubernetes.io/projected/dd9c951b-a551-470c-b2bf-900b0552b31b-kube-api-access-zzpmd\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.731322 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.731354 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-logs\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.731424 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-scripts\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.731443 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.766457 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.767944 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.772217 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.819578 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:08:39 crc kubenswrapper[4947]: E1203 07:08:38.820171 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-zzpmd logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="dd9c951b-a551-470c-b2bf-900b0552b31b" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.836787 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.836833 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 
07:08:38.836877 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzpmd\" (UniqueName: \"kubernetes.io/projected/dd9c951b-a551-470c-b2bf-900b0552b31b-kube-api-access-zzpmd\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.836906 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.836948 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-logs\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.837005 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-scripts\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.837025 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-config-data\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.838172 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.844404 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.844738 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.844749 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-logs\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.860733 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-scripts\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.867070 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-config-data\") 
pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.901978 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.923271 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.924194 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzpmd\" (UniqueName: \"kubernetes.io/projected/dd9c951b-a551-470c-b2bf-900b0552b31b-kube-api-access-zzpmd\") pod \"glance-default-external-api-0\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.932448 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:08:39 crc kubenswrapper[4947]: E1203 07:08:38.933059 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data glance httpd-run kube-api-access-mmlh2 logs scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-internal-api-0" podUID="46c7aad5-c9f4-4a36-876d-f3a805c97d3e" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.942013 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.942094 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.942119 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-logs\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.942159 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.942204 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.942223 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmlh2\" (UniqueName: \"kubernetes.io/projected/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-kube-api-access-mmlh2\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.942249 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:38.957052 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.024825 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.045854 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj5g6\" (UniqueName: \"kubernetes.io/projected/7c810f8e-8312-44fb-8de2-63f66e6efbb3-kube-api-access-dj5g6\") pod \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.045900 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-sb\") pod \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.045938 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-nb\") pod \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046012 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-config\") pod \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046098 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-svc\") pod \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046204 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-swift-storage-0\") pod \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\" (UID: \"7c810f8e-8312-44fb-8de2-63f66e6efbb3\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046475 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046530 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-logs\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046575 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046615 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046631 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmlh2\" (UniqueName: \"kubernetes.io/projected/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-kube-api-access-mmlh2\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046655 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.046680 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.050912 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") device mount path 
\"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.053730 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-logs\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.053957 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.056455 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.057645 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c810f8e-8312-44fb-8de2-63f66e6efbb3-kube-api-access-dj5g6" (OuterVolumeSpecName: "kube-api-access-dj5g6") pod "7c810f8e-8312-44fb-8de2-63f66e6efbb3" (UID: "7c810f8e-8312-44fb-8de2-63f66e6efbb3"). InnerVolumeSpecName "kube-api-access-dj5g6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.058872 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.064642 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.100657 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.101690 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7c810f8e-8312-44fb-8de2-63f66e6efbb3" (UID: "7c810f8e-8312-44fb-8de2-63f66e6efbb3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.114825 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmlh2\" (UniqueName: \"kubernetes.io/projected/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-kube-api-access-mmlh2\") pod \"glance-default-internal-api-0\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.148309 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-sb\") pod \"f86315fb-2855-460d-b1bb-184f54fa4c26\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.148470 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-svc\") pod \"f86315fb-2855-460d-b1bb-184f54fa4c26\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.148774 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-nb\") pod \"f86315fb-2855-460d-b1bb-184f54fa4c26\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.148802 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-config\") pod \"f86315fb-2855-460d-b1bb-184f54fa4c26\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.148866 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-swift-storage-0\") pod \"f86315fb-2855-460d-b1bb-184f54fa4c26\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.148986 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxpm2\" (UniqueName: \"kubernetes.io/projected/f86315fb-2855-460d-b1bb-184f54fa4c26-kube-api-access-fxpm2\") pod \"f86315fb-2855-460d-b1bb-184f54fa4c26\" (UID: \"f86315fb-2855-460d-b1bb-184f54fa4c26\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.151178 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-config" (OuterVolumeSpecName: "config") pod "7c810f8e-8312-44fb-8de2-63f66e6efbb3" (UID: "7c810f8e-8312-44fb-8de2-63f66e6efbb3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.154191 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.154217 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj5g6\" (UniqueName: \"kubernetes.io/projected/7c810f8e-8312-44fb-8de2-63f66e6efbb3-kube-api-access-dj5g6\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.154228 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.160729 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7c810f8e-8312-44fb-8de2-63f66e6efbb3" (UID: "7c810f8e-8312-44fb-8de2-63f66e6efbb3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.161015 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7c810f8e-8312-44fb-8de2-63f66e6efbb3" (UID: "7c810f8e-8312-44fb-8de2-63f66e6efbb3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.162337 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f86315fb-2855-460d-b1bb-184f54fa4c26-kube-api-access-fxpm2" (OuterVolumeSpecName: "kube-api-access-fxpm2") pod "f86315fb-2855-460d-b1bb-184f54fa4c26" (UID: "f86315fb-2855-460d-b1bb-184f54fa4c26"). InnerVolumeSpecName "kube-api-access-fxpm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.163669 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7c810f8e-8312-44fb-8de2-63f66e6efbb3" (UID: "7c810f8e-8312-44fb-8de2-63f66e6efbb3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.184007 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f86315fb-2855-460d-b1bb-184f54fa4c26" (UID: "f86315fb-2855-460d-b1bb-184f54fa4c26"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.188261 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f86315fb-2855-460d-b1bb-184f54fa4c26" (UID: "f86315fb-2855-460d-b1bb-184f54fa4c26"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.189630 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-config" (OuterVolumeSpecName: "config") pod "f86315fb-2855-460d-b1bb-184f54fa4c26" (UID: "f86315fb-2855-460d-b1bb-184f54fa4c26"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.199159 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f86315fb-2855-460d-b1bb-184f54fa4c26" (UID: "f86315fb-2855-460d-b1bb-184f54fa4c26"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.207091 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f86315fb-2855-460d-b1bb-184f54fa4c26" (UID: "f86315fb-2855-460d-b1bb-184f54fa4c26"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.256369 4947 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.256402 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.256413 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.256421 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.256430 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.256439 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.256448 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7c810f8e-8312-44fb-8de2-63f66e6efbb3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.256456 4947 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f86315fb-2855-460d-b1bb-184f54fa4c26-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.256464 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxpm2\" (UniqueName: \"kubernetes.io/projected/f86315fb-2855-460d-b1bb-184f54fa4c26-kube-api-access-fxpm2\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.328009 4947 generic.go:334] "Generic (PLEG): container finished" podID="f44a9445-7982-4d53-aadc-02677a421f34" containerID="5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7" exitCode=0 Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.328576 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" event={"ID":"f44a9445-7982-4d53-aadc-02677a421f34","Type":"ContainerDied","Data":"5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7"} Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.328618 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" event={"ID":"f44a9445-7982-4d53-aadc-02677a421f34","Type":"ContainerStarted","Data":"90b4ab76d2e135e5edbf6f28a269210393b57168fabe335ab61446fad866d698"} Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.339093 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" 
event={"ID":"f86315fb-2855-460d-b1bb-184f54fa4c26","Type":"ContainerDied","Data":"a2438f1e6773ddf54d9a02463a86ab58715591c101cffecefb4c7f0671ed6532"} Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.339167 4947 scope.go:117] "RemoveContainer" containerID="bb816660d345a03985fcad7f33dbeccf0329c265285ea9fe626aa4d7c58e5f81" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.339129 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5678f567b5-qjg8q" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.345806 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.346381 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.353718 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74cd4f877c-8x92t" event={"ID":"7c810f8e-8312-44fb-8de2-63f66e6efbb3","Type":"ContainerDied","Data":"ffdc61138eebd5d42c83d1a2472f5d947ea6c5899f7d31712594cbb3a1c87999"} Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.358601 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.388895 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.395011 4947 scope.go:117] "RemoveContainer" containerID="8b3e39987177a1ed71c2704c69c93a2a0d6678a2e79c823fc7794b72400599c2" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.420838 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.457924 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qjg8q"] Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.462149 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmlh2\" (UniqueName: \"kubernetes.io/projected/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-kube-api-access-mmlh2\") pod \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.462199 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-combined-ca-bundle\") pod \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.462228 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-config-data\") pod \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.462347 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-logs\") pod \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.462404 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-httpd-run\") pod \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " Dec 03 
07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.462436 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.462451 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-scripts\") pod \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\" (UID: \"46c7aad5-c9f4-4a36-876d-f3a805c97d3e\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.465482 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "46c7aad5-c9f4-4a36-876d-f3a805c97d3e" (UID: "46c7aad5-c9f4-4a36-876d-f3a805c97d3e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.466042 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-logs" (OuterVolumeSpecName: "logs") pod "46c7aad5-c9f4-4a36-876d-f3a805c97d3e" (UID: "46c7aad5-c9f4-4a36-876d-f3a805c97d3e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.474950 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-kube-api-access-mmlh2" (OuterVolumeSpecName: "kube-api-access-mmlh2") pod "46c7aad5-c9f4-4a36-876d-f3a805c97d3e" (UID: "46c7aad5-c9f4-4a36-876d-f3a805c97d3e"). InnerVolumeSpecName "kube-api-access-mmlh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.475059 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46c7aad5-c9f4-4a36-876d-f3a805c97d3e" (UID: "46c7aad5-c9f4-4a36-876d-f3a805c97d3e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.477069 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-config-data" (OuterVolumeSpecName: "config-data") pod "46c7aad5-c9f4-4a36-876d-f3a805c97d3e" (UID: "46c7aad5-c9f4-4a36-876d-f3a805c97d3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.479515 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "46c7aad5-c9f4-4a36-876d-f3a805c97d3e" (UID: "46c7aad5-c9f4-4a36-876d-f3a805c97d3e"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.491215 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-scripts" (OuterVolumeSpecName: "scripts") pod "46c7aad5-c9f4-4a36-876d-f3a805c97d3e" (UID: "46c7aad5-c9f4-4a36-876d-f3a805c97d3e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.503323 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5678f567b5-qjg8q"] Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.537199 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-8x92t"] Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.548398 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74cd4f877c-8x92t"] Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.564992 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-logs\") pod \"dd9c951b-a551-470c-b2bf-900b0552b31b\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565040 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-config-data\") pod \"dd9c951b-a551-470c-b2bf-900b0552b31b\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565127 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzpmd\" (UniqueName: \"kubernetes.io/projected/dd9c951b-a551-470c-b2bf-900b0552b31b-kube-api-access-zzpmd\") pod \"dd9c951b-a551-470c-b2bf-900b0552b31b\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565143 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-scripts\") pod \"dd9c951b-a551-470c-b2bf-900b0552b31b\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565219 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-httpd-run\") pod \"dd9c951b-a551-470c-b2bf-900b0552b31b\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565244 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-combined-ca-bundle\") pod \"dd9c951b-a551-470c-b2bf-900b0552b31b\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565298 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"dd9c951b-a551-470c-b2bf-900b0552b31b\" (UID: \"dd9c951b-a551-470c-b2bf-900b0552b31b\") " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565610 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmlh2\" (UniqueName: \"kubernetes.io/projected/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-kube-api-access-mmlh2\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565621 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565629 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565637 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-logs\") on node \"crc\" 
DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565645 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565662 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.565671 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46c7aad5-c9f4-4a36-876d-f3a805c97d3e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.567284 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-logs" (OuterVolumeSpecName: "logs") pod "dd9c951b-a551-470c-b2bf-900b0552b31b" (UID: "dd9c951b-a551-470c-b2bf-900b0552b31b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.569537 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dd9c951b-a551-470c-b2bf-900b0552b31b" (UID: "dd9c951b-a551-470c-b2bf-900b0552b31b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.580350 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-scripts" (OuterVolumeSpecName: "scripts") pod "dd9c951b-a551-470c-b2bf-900b0552b31b" (UID: "dd9c951b-a551-470c-b2bf-900b0552b31b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.595201 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-config-data" (OuterVolumeSpecName: "config-data") pod "dd9c951b-a551-470c-b2bf-900b0552b31b" (UID: "dd9c951b-a551-470c-b2bf-900b0552b31b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.601077 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd9c951b-a551-470c-b2bf-900b0552b31b" (UID: "dd9c951b-a551-470c-b2bf-900b0552b31b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.601968 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd9c951b-a551-470c-b2bf-900b0552b31b-kube-api-access-zzpmd" (OuterVolumeSpecName: "kube-api-access-zzpmd") pod "dd9c951b-a551-470c-b2bf-900b0552b31b" (UID: "dd9c951b-a551-470c-b2bf-900b0552b31b"). InnerVolumeSpecName "kube-api-access-zzpmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.613857 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.617628 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "dd9c951b-a551-470c-b2bf-900b0552b31b" (UID: "dd9c951b-a551-470c-b2bf-900b0552b31b"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.667255 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.667277 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.667312 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.667321 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd9c951b-a551-470c-b2bf-900b0552b31b-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.667333 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.667341 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzpmd\" (UniqueName: \"kubernetes.io/projected/dd9c951b-a551-470c-b2bf-900b0552b31b-kube-api-access-zzpmd\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.667351 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd9c951b-a551-470c-b2bf-900b0552b31b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.667363 4947 reconciler_common.go:293] "Volume detached for 
volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.684056 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 07:08:39 crc kubenswrapper[4947]: I1203 07:08:39.768778 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.068829 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.358245 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" event={"ID":"f44a9445-7982-4d53-aadc-02677a421f34","Type":"ContainerStarted","Data":"13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296"} Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.358951 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.360651 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.367390 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.388689 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" podStartSLOduration=3.38866703 podStartE2EDuration="3.38866703s" podCreationTimestamp="2025-12-03 07:08:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:08:40.384123227 +0000 UTC m=+1181.645077673" watchObservedRunningTime="2025-12-03 07:08:40.38866703 +0000 UTC m=+1181.649621456" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.437656 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.450152 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.457934 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:08:40 crc kubenswrapper[4947]: E1203 07:08:40.458412 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c810f8e-8312-44fb-8de2-63f66e6efbb3" containerName="init" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.458435 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c810f8e-8312-44fb-8de2-63f66e6efbb3" containerName="init" Dec 03 07:08:40 crc kubenswrapper[4947]: E1203 07:08:40.458460 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f86315fb-2855-460d-b1bb-184f54fa4c26" containerName="init" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.458468 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f86315fb-2855-460d-b1bb-184f54fa4c26" containerName="init" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.458727 4947 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="7c810f8e-8312-44fb-8de2-63f66e6efbb3" containerName="init" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.458754 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f86315fb-2855-460d-b1bb-184f54fa4c26" containerName="init" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.475597 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.492771 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.493073 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-cgkdm" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.493238 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.564567 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.573539 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.583366 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.583429 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.583449 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-logs\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.583599 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggxq8\" (UniqueName: \"kubernetes.io/projected/63877bed-285e-4d4b-b28c-f05ed9de57a5-kube-api-access-ggxq8\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.583686 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.583795 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.583852 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.591180 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.598810 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.600143 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.603908 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.610861 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685265 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685311 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685337 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685376 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-logs\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685401 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685428 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685464 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-config-data\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685824 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685870 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l6nw\" (UniqueName: \"kubernetes.io/projected/135270e8-e939-40bd-b5ae-dcce0b765e82-kube-api-access-5l6nw\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685899 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685922 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-logs\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685940 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685962 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.685999 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggxq8\" (UniqueName: \"kubernetes.io/projected/63877bed-285e-4d4b-b28c-f05ed9de57a5-kube-api-access-ggxq8\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.686581 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.692182 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.692412 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-logs\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.693567 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.694651 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-scripts\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.698024 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.702950 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggxq8\" (UniqueName: \"kubernetes.io/projected/63877bed-285e-4d4b-b28c-f05ed9de57a5-kube-api-access-ggxq8\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.716639 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.789402 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l6nw\" (UniqueName: \"kubernetes.io/projected/135270e8-e939-40bd-b5ae-dcce0b765e82-kube-api-access-5l6nw\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.789448 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.789476 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.789576 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.789621 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.789648 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: 
I1203 07:08:40.789684 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-logs\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.789709 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.790406 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.790443 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-logs\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.795154 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-config-data\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.795747 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-scripts\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.798419 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.803882 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.805162 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l6nw\" (UniqueName: \"kubernetes.io/projected/135270e8-e939-40bd-b5ae-dcce0b765e82-kube-api-access-5l6nw\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.824692 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:08:40 crc kubenswrapper[4947]: I1203 07:08:40.931596 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:08:41 crc kubenswrapper[4947]: I1203 07:08:41.096482 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c7aad5-c9f4-4a36-876d-f3a805c97d3e" path="/var/lib/kubelet/pods/46c7aad5-c9f4-4a36-876d-f3a805c97d3e/volumes" Dec 03 07:08:41 crc kubenswrapper[4947]: I1203 07:08:41.096887 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c810f8e-8312-44fb-8de2-63f66e6efbb3" path="/var/lib/kubelet/pods/7c810f8e-8312-44fb-8de2-63f66e6efbb3/volumes" Dec 03 07:08:41 crc kubenswrapper[4947]: I1203 07:08:41.097667 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd9c951b-a551-470c-b2bf-900b0552b31b" path="/var/lib/kubelet/pods/dd9c951b-a551-470c-b2bf-900b0552b31b/volumes" Dec 03 07:08:41 crc kubenswrapper[4947]: I1203 07:08:41.098036 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f86315fb-2855-460d-b1bb-184f54fa4c26" path="/var/lib/kubelet/pods/f86315fb-2855-460d-b1bb-184f54fa4c26/volumes" Dec 03 07:08:41 crc kubenswrapper[4947]: I1203 07:08:41.358030 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:08:42 crc kubenswrapper[4947]: I1203 07:08:42.384638 4947 generic.go:334] "Generic (PLEG): container finished" podID="54dfa127-2209-41a5-94b2-ee9db0005e00" containerID="a78540289946534a350ef445e5a1ab10a4c50c6691c8ab7e7bd31dc482d3b51d" exitCode=0 Dec 03 07:08:42 crc kubenswrapper[4947]: I1203 07:08:42.384739 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfp5r" event={"ID":"54dfa127-2209-41a5-94b2-ee9db0005e00","Type":"ContainerDied","Data":"a78540289946534a350ef445e5a1ab10a4c50c6691c8ab7e7bd31dc482d3b51d"} Dec 03 07:08:45 crc kubenswrapper[4947]: I1203 07:08:45.947978 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:08:46 crc 
kubenswrapper[4947]: I1203 07:08:46.069362 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.184304 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.305418 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-combined-ca-bundle\") pod \"54dfa127-2209-41a5-94b2-ee9db0005e00\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.305476 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-fernet-keys\") pod \"54dfa127-2209-41a5-94b2-ee9db0005e00\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.305532 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-credential-keys\") pod \"54dfa127-2209-41a5-94b2-ee9db0005e00\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.305630 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-config-data\") pod \"54dfa127-2209-41a5-94b2-ee9db0005e00\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.305706 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5lpn\" (UniqueName: 
\"kubernetes.io/projected/54dfa127-2209-41a5-94b2-ee9db0005e00-kube-api-access-g5lpn\") pod \"54dfa127-2209-41a5-94b2-ee9db0005e00\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.305787 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-scripts\") pod \"54dfa127-2209-41a5-94b2-ee9db0005e00\" (UID: \"54dfa127-2209-41a5-94b2-ee9db0005e00\") " Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.311087 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-scripts" (OuterVolumeSpecName: "scripts") pod "54dfa127-2209-41a5-94b2-ee9db0005e00" (UID: "54dfa127-2209-41a5-94b2-ee9db0005e00"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.311644 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "54dfa127-2209-41a5-94b2-ee9db0005e00" (UID: "54dfa127-2209-41a5-94b2-ee9db0005e00"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.312289 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54dfa127-2209-41a5-94b2-ee9db0005e00-kube-api-access-g5lpn" (OuterVolumeSpecName: "kube-api-access-g5lpn") pod "54dfa127-2209-41a5-94b2-ee9db0005e00" (UID: "54dfa127-2209-41a5-94b2-ee9db0005e00"). InnerVolumeSpecName "kube-api-access-g5lpn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.312393 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "54dfa127-2209-41a5-94b2-ee9db0005e00" (UID: "54dfa127-2209-41a5-94b2-ee9db0005e00"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.329975 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54dfa127-2209-41a5-94b2-ee9db0005e00" (UID: "54dfa127-2209-41a5-94b2-ee9db0005e00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.337527 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-config-data" (OuterVolumeSpecName: "config-data") pod "54dfa127-2209-41a5-94b2-ee9db0005e00" (UID: "54dfa127-2209-41a5-94b2-ee9db0005e00"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.408066 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5lpn\" (UniqueName: \"kubernetes.io/projected/54dfa127-2209-41a5-94b2-ee9db0005e00-kube-api-access-g5lpn\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.408099 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.408110 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.408118 4947 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.408126 4947 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.408134 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54dfa127-2209-41a5-94b2-ee9db0005e00-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.441948 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-nfp5r" event={"ID":"54dfa127-2209-41a5-94b2-ee9db0005e00","Type":"ContainerDied","Data":"a9038c7de05352b66ff8db21a2dc218eededec1bb577425e7180a6d9a70ac09d"} Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 
07:08:47.441987 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9038c7de05352b66ff8db21a2dc218eededec1bb577425e7180a6d9a70ac09d" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.442004 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-nfp5r" Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.443777 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63877bed-285e-4d4b-b28c-f05ed9de57a5","Type":"ContainerStarted","Data":"aa439c36b9149353ebc9e304e177b53dc7e649743161f68963ea3459bf82603b"} Dec 03 07:08:47 crc kubenswrapper[4947]: I1203 07:08:47.954693 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.004208 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-74bxs"] Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.004853 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" podUID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerName="dnsmasq-dns" containerID="cri-o://b3c68005731767ce6aebd52c56c18a6a1b8835a9b47f24276345bcc23bbcc5be" gracePeriod=10 Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.290170 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-nfp5r"] Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.297642 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-nfp5r"] Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.381470 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6lcmv"] Dec 03 07:08:48 crc kubenswrapper[4947]: E1203 07:08:48.381999 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="54dfa127-2209-41a5-94b2-ee9db0005e00" containerName="keystone-bootstrap" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.382021 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="54dfa127-2209-41a5-94b2-ee9db0005e00" containerName="keystone-bootstrap" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.382277 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="54dfa127-2209-41a5-94b2-ee9db0005e00" containerName="keystone-bootstrap" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.382969 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.387040 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.387090 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.387438 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r75nw" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.387705 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.389464 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.398105 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6lcmv"] Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.426341 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-fernet-keys\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " 
pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.426398 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm6hn\" (UniqueName: \"kubernetes.io/projected/883ad48f-95eb-48b1-932b-98e145d203ab-kube-api-access-rm6hn\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.426441 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-config-data\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.426481 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-combined-ca-bundle\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.426597 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-scripts\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.426619 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-credential-keys\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " 
pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.455694 4947 generic.go:334] "Generic (PLEG): container finished" podID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerID="b3c68005731767ce6aebd52c56c18a6a1b8835a9b47f24276345bcc23bbcc5be" exitCode=0 Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.455746 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" event={"ID":"2d17d252-d33f-4846-a443-2d99e5d3464c","Type":"ContainerDied","Data":"b3c68005731767ce6aebd52c56c18a6a1b8835a9b47f24276345bcc23bbcc5be"} Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.528001 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-fernet-keys\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.528050 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm6hn\" (UniqueName: \"kubernetes.io/projected/883ad48f-95eb-48b1-932b-98e145d203ab-kube-api-access-rm6hn\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.528084 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-config-data\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.528111 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-combined-ca-bundle\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.528180 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-scripts\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.528197 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-credential-keys\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.532677 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-credential-keys\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.532688 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-fernet-keys\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.532857 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-combined-ca-bundle\") pod \"keystone-bootstrap-6lcmv\" (UID: 
\"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.534727 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-scripts\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.535041 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-config-data\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.547805 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm6hn\" (UniqueName: \"kubernetes.io/projected/883ad48f-95eb-48b1-932b-98e145d203ab-kube-api-access-rm6hn\") pod \"keystone-bootstrap-6lcmv\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:48 crc kubenswrapper[4947]: I1203 07:08:48.700945 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:08:49 crc kubenswrapper[4947]: I1203 07:08:49.102628 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54dfa127-2209-41a5-94b2-ee9db0005e00" path="/var/lib/kubelet/pods/54dfa127-2209-41a5-94b2-ee9db0005e00/volumes" Dec 03 07:08:55 crc kubenswrapper[4947]: I1203 07:08:55.559576 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" podUID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 03 07:08:57 crc kubenswrapper[4947]: E1203 07:08:57.083599 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f" Dec 03 07:08:57 crc kubenswrapper[4947]: E1203 07:08:57.084035 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9bqcd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-nv68l_openstack(5abc234f-80ac-4e2e-a43d-2a6fe3453f8c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:08:57 crc kubenswrapper[4947]: E1203 07:08:57.085319 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-nv68l" 
podUID="5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.199523 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.288158 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2k9v\" (UniqueName: \"kubernetes.io/projected/2d17d252-d33f-4846-a443-2d99e5d3464c-kube-api-access-h2k9v\") pod \"2d17d252-d33f-4846-a443-2d99e5d3464c\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.288304 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-sb\") pod \"2d17d252-d33f-4846-a443-2d99e5d3464c\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.288330 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-nb\") pod \"2d17d252-d33f-4846-a443-2d99e5d3464c\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.288360 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-config\") pod \"2d17d252-d33f-4846-a443-2d99e5d3464c\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.288388 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-dns-svc\") pod \"2d17d252-d33f-4846-a443-2d99e5d3464c\" (UID: \"2d17d252-d33f-4846-a443-2d99e5d3464c\") " Dec 03 
07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.295337 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d17d252-d33f-4846-a443-2d99e5d3464c-kube-api-access-h2k9v" (OuterVolumeSpecName: "kube-api-access-h2k9v") pod "2d17d252-d33f-4846-a443-2d99e5d3464c" (UID: "2d17d252-d33f-4846-a443-2d99e5d3464c"). InnerVolumeSpecName "kube-api-access-h2k9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.336127 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d17d252-d33f-4846-a443-2d99e5d3464c" (UID: "2d17d252-d33f-4846-a443-2d99e5d3464c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.337911 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d17d252-d33f-4846-a443-2d99e5d3464c" (UID: "2d17d252-d33f-4846-a443-2d99e5d3464c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.341227 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-config" (OuterVolumeSpecName: "config") pod "2d17d252-d33f-4846-a443-2d99e5d3464c" (UID: "2d17d252-d33f-4846-a443-2d99e5d3464c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.343649 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d17d252-d33f-4846-a443-2d99e5d3464c" (UID: "2d17d252-d33f-4846-a443-2d99e5d3464c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.390023 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.390107 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.390118 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2k9v\" (UniqueName: \"kubernetes.io/projected/2d17d252-d33f-4846-a443-2d99e5d3464c-kube-api-access-h2k9v\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.390128 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.390136 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d17d252-d33f-4846-a443-2d99e5d3464c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.537033 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.537029 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" event={"ID":"2d17d252-d33f-4846-a443-2d99e5d3464c","Type":"ContainerDied","Data":"a2350a3008d4e7f690a842d4e8da78d89f9124b7656b7b3f3bd5358bbe989323"} Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.537481 4947 scope.go:117] "RemoveContainer" containerID="b3c68005731767ce6aebd52c56c18a6a1b8835a9b47f24276345bcc23bbcc5be" Dec 03 07:08:57 crc kubenswrapper[4947]: E1203 07:08:57.538785 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:82006b9c64d4c5f80483cda262d960ce6be4813665158ef1a53ea7734bbe431f\\\"\"" pod="openstack/barbican-db-sync-nv68l" podUID="5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.574698 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-74bxs"] Dec 03 07:08:57 crc kubenswrapper[4947]: I1203 07:08:57.581150 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5fbdd8c-74bxs"] Dec 03 07:08:58 crc kubenswrapper[4947]: I1203 07:08:58.199589 4947 scope.go:117] "RemoveContainer" containerID="4cfb0364aa14bf4135568be9bf4895f749e4233b4b26e4b93e7d9deb40ddf25f" Dec 03 07:08:58 crc kubenswrapper[4947]: E1203 07:08:58.244654 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2" Dec 03 07:08:58 crc kubenswrapper[4947]: E1203 07:08:58.245093 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdbpz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Ca
pabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6bhnb_openstack(af29daa3-e143-4ea0-bfe0-284fd103f8b3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:08:58 crc kubenswrapper[4947]: E1203 07:08:58.246312 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6bhnb" podUID="af29daa3-e143-4ea0-bfe0-284fd103f8b3" Dec 03 07:08:58 crc kubenswrapper[4947]: I1203 07:08:58.550507 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gl6zw" event={"ID":"e5883886-a832-4366-be78-449b4559e8d2","Type":"ContainerStarted","Data":"80650da93ac6c7dbe9d3efbba4ab0980626758431c3f9a1e26df29dbb6b4dd90"} Dec 03 07:08:58 crc kubenswrapper[4947]: I1203 07:08:58.555508 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerStarted","Data":"5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf"} Dec 03 07:08:58 crc kubenswrapper[4947]: E1203 07:08:58.556901 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:7a2056615520e272bae43ec3f34e2ba7a92c1d364b8d9106b53bd694619fc9c2\\\"\"" pod="openstack/cinder-db-sync-6bhnb" 
podUID="af29daa3-e143-4ea0-bfe0-284fd103f8b3" Dec 03 07:08:58 crc kubenswrapper[4947]: I1203 07:08:58.567412 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-gl6zw" podStartSLOduration=2.424597115 podStartE2EDuration="23.567394221s" podCreationTimestamp="2025-12-03 07:08:35 +0000 UTC" firstStartedPulling="2025-12-03 07:08:37.023584014 +0000 UTC m=+1178.284538440" lastFinishedPulling="2025-12-03 07:08:58.16638112 +0000 UTC m=+1199.427335546" observedRunningTime="2025-12-03 07:08:58.566622771 +0000 UTC m=+1199.827577217" watchObservedRunningTime="2025-12-03 07:08:58.567394221 +0000 UTC m=+1199.828348657" Dec 03 07:08:58 crc kubenswrapper[4947]: I1203 07:08:58.683324 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6lcmv"] Dec 03 07:08:58 crc kubenswrapper[4947]: W1203 07:08:58.693530 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod883ad48f_95eb_48b1_932b_98e145d203ab.slice/crio-4e7b36f28f211dad441ba6d0b5fa3fb24f2de355006e59ebb9b6a75032f9239f WatchSource:0}: Error finding container 4e7b36f28f211dad441ba6d0b5fa3fb24f2de355006e59ebb9b6a75032f9239f: Status 404 returned error can't find the container with id 4e7b36f28f211dad441ba6d0b5fa3fb24f2de355006e59ebb9b6a75032f9239f Dec 03 07:08:58 crc kubenswrapper[4947]: I1203 07:08:58.763777 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:08:58 crc kubenswrapper[4947]: W1203 07:08:58.768375 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod135270e8_e939_40bd_b5ae_dcce0b765e82.slice/crio-2cecc3ce6de4f7a6df308b1fa3212cddec7923687d11894ee074daf7452f2e0f WatchSource:0}: Error finding container 2cecc3ce6de4f7a6df308b1fa3212cddec7923687d11894ee074daf7452f2e0f: Status 404 returned error can't find the 
container with id 2cecc3ce6de4f7a6df308b1fa3212cddec7923687d11894ee074daf7452f2e0f Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.097978 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d17d252-d33f-4846-a443-2d99e5d3464c" path="/var/lib/kubelet/pods/2d17d252-d33f-4846-a443-2d99e5d3464c/volumes" Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.564809 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6lcmv" event={"ID":"883ad48f-95eb-48b1-932b-98e145d203ab","Type":"ContainerStarted","Data":"c9a719a484374ac820c11d3eb970965b2a4af8d1426143c047b5501f316ae86a"} Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.565112 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6lcmv" event={"ID":"883ad48f-95eb-48b1-932b-98e145d203ab","Type":"ContainerStarted","Data":"4e7b36f28f211dad441ba6d0b5fa3fb24f2de355006e59ebb9b6a75032f9239f"} Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.570075 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63877bed-285e-4d4b-b28c-f05ed9de57a5","Type":"ContainerStarted","Data":"79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0"} Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.570141 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63877bed-285e-4d4b-b28c-f05ed9de57a5","Type":"ContainerStarted","Data":"ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf"} Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.570155 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerName="glance-log" containerID="cri-o://ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf" gracePeriod=30 Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 
07:08:59.570155 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerName="glance-httpd" containerID="cri-o://79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0" gracePeriod=30 Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.577721 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"135270e8-e939-40bd-b5ae-dcce0b765e82","Type":"ContainerStarted","Data":"25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315"} Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.577771 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"135270e8-e939-40bd-b5ae-dcce0b765e82","Type":"ContainerStarted","Data":"2cecc3ce6de4f7a6df308b1fa3212cddec7923687d11894ee074daf7452f2e0f"} Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.614308 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6lcmv" podStartSLOduration=11.614289486 podStartE2EDuration="11.614289486s" podCreationTimestamp="2025-12-03 07:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:08:59.581842882 +0000 UTC m=+1200.842797308" watchObservedRunningTime="2025-12-03 07:08:59.614289486 +0000 UTC m=+1200.875243912" Dec 03 07:08:59 crc kubenswrapper[4947]: I1203 07:08:59.624274 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.624253395 podStartE2EDuration="19.624253395s" podCreationTimestamp="2025-12-03 07:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:08:59.612918479 +0000 UTC 
m=+1200.873872905" watchObservedRunningTime="2025-12-03 07:08:59.624253395 +0000 UTC m=+1200.885207821" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.467249 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.560721 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59d5fbdd8c-74bxs" podUID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563075 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-logs\") pod \"63877bed-285e-4d4b-b28c-f05ed9de57a5\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563194 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-combined-ca-bundle\") pod \"63877bed-285e-4d4b-b28c-f05ed9de57a5\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563275 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-scripts\") pod \"63877bed-285e-4d4b-b28c-f05ed9de57a5\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563314 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggxq8\" (UniqueName: \"kubernetes.io/projected/63877bed-285e-4d4b-b28c-f05ed9de57a5-kube-api-access-ggxq8\") pod \"63877bed-285e-4d4b-b28c-f05ed9de57a5\" (UID: 
\"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563339 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"63877bed-285e-4d4b-b28c-f05ed9de57a5\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563366 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-config-data\") pod \"63877bed-285e-4d4b-b28c-f05ed9de57a5\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563400 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-httpd-run\") pod \"63877bed-285e-4d4b-b28c-f05ed9de57a5\" (UID: \"63877bed-285e-4d4b-b28c-f05ed9de57a5\") " Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563559 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-logs" (OuterVolumeSpecName: "logs") pod "63877bed-285e-4d4b-b28c-f05ed9de57a5" (UID: "63877bed-285e-4d4b-b28c-f05ed9de57a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563823 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "63877bed-285e-4d4b-b28c-f05ed9de57a5" (UID: "63877bed-285e-4d4b-b28c-f05ed9de57a5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563933 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.563943 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63877bed-285e-4d4b-b28c-f05ed9de57a5-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.571834 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-scripts" (OuterVolumeSpecName: "scripts") pod "63877bed-285e-4d4b-b28c-f05ed9de57a5" (UID: "63877bed-285e-4d4b-b28c-f05ed9de57a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.571842 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "63877bed-285e-4d4b-b28c-f05ed9de57a5" (UID: "63877bed-285e-4d4b-b28c-f05ed9de57a5"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.573001 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63877bed-285e-4d4b-b28c-f05ed9de57a5-kube-api-access-ggxq8" (OuterVolumeSpecName: "kube-api-access-ggxq8") pod "63877bed-285e-4d4b-b28c-f05ed9de57a5" (UID: "63877bed-285e-4d4b-b28c-f05ed9de57a5"). InnerVolumeSpecName "kube-api-access-ggxq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.588604 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerStarted","Data":"006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0"} Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.590792 4947 generic.go:334] "Generic (PLEG): container finished" podID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerID="79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0" exitCode=0 Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.590832 4947 generic.go:334] "Generic (PLEG): container finished" podID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerID="ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf" exitCode=143 Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.590847 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63877bed-285e-4d4b-b28c-f05ed9de57a5","Type":"ContainerDied","Data":"79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0"} Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.590881 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63877bed-285e-4d4b-b28c-f05ed9de57a5","Type":"ContainerDied","Data":"ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf"} Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.590892 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"63877bed-285e-4d4b-b28c-f05ed9de57a5","Type":"ContainerDied","Data":"aa439c36b9149353ebc9e304e177b53dc7e649743161f68963ea3459bf82603b"} Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.590913 4947 scope.go:117] "RemoveContainer" containerID="79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0" Dec 03 
07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.590834 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.592603 4947 generic.go:334] "Generic (PLEG): container finished" podID="e5883886-a832-4366-be78-449b4559e8d2" containerID="80650da93ac6c7dbe9d3efbba4ab0980626758431c3f9a1e26df29dbb6b4dd90" exitCode=0 Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.592691 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gl6zw" event={"ID":"e5883886-a832-4366-be78-449b4559e8d2","Type":"ContainerDied","Data":"80650da93ac6c7dbe9d3efbba4ab0980626758431c3f9a1e26df29dbb6b4dd90"} Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.597573 4947 generic.go:334] "Generic (PLEG): container finished" podID="a2da706f-4332-49a8-9c85-2d90186bd708" containerID="ff60d3a64764b7b0a8521fbe7ced724e9b3c275421d8e741d28b55a0fee93a3a" exitCode=0 Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.597697 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7bxn2" event={"ID":"a2da706f-4332-49a8-9c85-2d90186bd708","Type":"ContainerDied","Data":"ff60d3a64764b7b0a8521fbe7ced724e9b3c275421d8e741d28b55a0fee93a3a"} Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.609057 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63877bed-285e-4d4b-b28c-f05ed9de57a5" (UID: "63877bed-285e-4d4b-b28c-f05ed9de57a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.631719 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-config-data" (OuterVolumeSpecName: "config-data") pod "63877bed-285e-4d4b-b28c-f05ed9de57a5" (UID: "63877bed-285e-4d4b-b28c-f05ed9de57a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.667669 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.667696 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.667705 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggxq8\" (UniqueName: \"kubernetes.io/projected/63877bed-285e-4d4b-b28c-f05ed9de57a5-kube-api-access-ggxq8\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.667732 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.667743 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63877bed-285e-4d4b-b28c-f05ed9de57a5-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.668069 4947 scope.go:117] "RemoveContainer" containerID="ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf" Dec 03 07:09:00 crc 
kubenswrapper[4947]: I1203 07:09:00.687834 4947 scope.go:117] "RemoveContainer" containerID="79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.688204 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 07:09:00 crc kubenswrapper[4947]: E1203 07:09:00.688218 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0\": container with ID starting with 79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0 not found: ID does not exist" containerID="79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.688253 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0"} err="failed to get container status \"79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0\": rpc error: code = NotFound desc = could not find container \"79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0\": container with ID starting with 79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0 not found: ID does not exist" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.688281 4947 scope.go:117] "RemoveContainer" containerID="ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf" Dec 03 07:09:00 crc kubenswrapper[4947]: E1203 07:09:00.688587 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf\": container with ID starting with ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf not found: ID 
does not exist" containerID="ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.688725 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf"} err="failed to get container status \"ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf\": rpc error: code = NotFound desc = could not find container \"ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf\": container with ID starting with ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf not found: ID does not exist" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.688810 4947 scope.go:117] "RemoveContainer" containerID="79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.689119 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0"} err="failed to get container status \"79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0\": rpc error: code = NotFound desc = could not find container \"79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0\": container with ID starting with 79fec1007ab23633e2b3b9ded86b2c9e2f1888455ff085de9b59fd2ce634d8d0 not found: ID does not exist" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.689141 4947 scope.go:117] "RemoveContainer" containerID="ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.689400 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf"} err="failed to get container status \"ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf\": rpc error: code = 
NotFound desc = could not find container \"ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf\": container with ID starting with ee2754cf5daf0a715c37e5354df5f0ea770c343fff03bfa4e46560f4c7abfcdf not found: ID does not exist" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.768868 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.927668 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.933831 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.959052 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:00 crc kubenswrapper[4947]: E1203 07:09:00.959388 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerName="glance-log" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.959404 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerName="glance-log" Dec 03 07:09:00 crc kubenswrapper[4947]: E1203 07:09:00.959416 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerName="dnsmasq-dns" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.959422 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerName="dnsmasq-dns" Dec 03 07:09:00 crc kubenswrapper[4947]: E1203 07:09:00.959436 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerName="init" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 
07:09:00.959442 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerName="init" Dec 03 07:09:00 crc kubenswrapper[4947]: E1203 07:09:00.959455 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerName="glance-httpd" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.959461 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerName="glance-httpd" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.959629 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d17d252-d33f-4846-a443-2d99e5d3464c" containerName="dnsmasq-dns" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.959642 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerName="glance-httpd" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.959656 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="63877bed-285e-4d4b-b28c-f05ed9de57a5" containerName="glance-log" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.960474 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.963235 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 07:09:00 crc kubenswrapper[4947]: I1203 07:09:00.963574 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.022425 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.080866 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-scripts\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.080911 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-config-data\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.080951 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gc8h\" (UniqueName: \"kubernetes.io/projected/f251bdf4-6211-48a9-8578-5acd9fb17c15-kube-api-access-9gc8h\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.081091 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.081216 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.081251 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.081290 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.081413 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-logs\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.106793 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63877bed-285e-4d4b-b28c-f05ed9de57a5" 
path="/var/lib/kubelet/pods/63877bed-285e-4d4b-b28c-f05ed9de57a5/volumes" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.183801 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.183867 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.183903 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.183948 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-logs\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.184376 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") device mount path \"/mnt/openstack/pv05\"" 
pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.184545 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-scripts\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.184585 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-config-data\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.184639 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gc8h\" (UniqueName: \"kubernetes.io/projected/f251bdf4-6211-48a9-8578-5acd9fb17c15-kube-api-access-9gc8h\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.184680 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.185173 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-logs\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 
07:09:01.185234 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.187954 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.190735 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-config-data\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.199799 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-scripts\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.200616 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.209800 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gc8h\" 
(UniqueName: \"kubernetes.io/projected/f251bdf4-6211-48a9-8578-5acd9fb17c15-kube-api-access-9gc8h\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.218113 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.288910 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.611458 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerName="glance-log" containerID="cri-o://25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315" gracePeriod=30 Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.612051 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"135270e8-e939-40bd-b5ae-dcce0b765e82","Type":"ContainerStarted","Data":"37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa"} Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.612931 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerName="glance-httpd" containerID="cri-o://37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa" gracePeriod=30 Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.646801 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" 
podStartSLOduration=21.646782934 podStartE2EDuration="21.646782934s" podCreationTimestamp="2025-12-03 07:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:01.637179505 +0000 UTC m=+1202.898133931" watchObservedRunningTime="2025-12-03 07:09:01.646782934 +0000 UTC m=+1202.907737360" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.886590 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.922981 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.975286 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gl6zw" Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.997792 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-combined-ca-bundle\") pod \"e5883886-a832-4366-be78-449b4559e8d2\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.997827 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-config-data\") pod \"e5883886-a832-4366-be78-449b4559e8d2\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.997863 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xntzj\" (UniqueName: \"kubernetes.io/projected/a2da706f-4332-49a8-9c85-2d90186bd708-kube-api-access-xntzj\") pod \"a2da706f-4332-49a8-9c85-2d90186bd708\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " Dec 
03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.997914 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-config\") pod \"a2da706f-4332-49a8-9c85-2d90186bd708\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.997937 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b64t\" (UniqueName: \"kubernetes.io/projected/e5883886-a832-4366-be78-449b4559e8d2-kube-api-access-4b64t\") pod \"e5883886-a832-4366-be78-449b4559e8d2\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.997997 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-scripts\") pod \"e5883886-a832-4366-be78-449b4559e8d2\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.998018 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-combined-ca-bundle\") pod \"a2da706f-4332-49a8-9c85-2d90186bd708\" (UID: \"a2da706f-4332-49a8-9c85-2d90186bd708\") " Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.998115 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5883886-a832-4366-be78-449b4559e8d2-logs\") pod \"e5883886-a832-4366-be78-449b4559e8d2\" (UID: \"e5883886-a832-4366-be78-449b4559e8d2\") " Dec 03 07:09:01 crc kubenswrapper[4947]: I1203 07:09:01.998724 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5883886-a832-4366-be78-449b4559e8d2-logs" (OuterVolumeSpecName: "logs") pod 
"e5883886-a832-4366-be78-449b4559e8d2" (UID: "e5883886-a832-4366-be78-449b4559e8d2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.005215 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2da706f-4332-49a8-9c85-2d90186bd708-kube-api-access-xntzj" (OuterVolumeSpecName: "kube-api-access-xntzj") pod "a2da706f-4332-49a8-9c85-2d90186bd708" (UID: "a2da706f-4332-49a8-9c85-2d90186bd708"). InnerVolumeSpecName "kube-api-access-xntzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.009081 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5883886-a832-4366-be78-449b4559e8d2-kube-api-access-4b64t" (OuterVolumeSpecName: "kube-api-access-4b64t") pod "e5883886-a832-4366-be78-449b4559e8d2" (UID: "e5883886-a832-4366-be78-449b4559e8d2"). InnerVolumeSpecName "kube-api-access-4b64t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.013716 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-scripts" (OuterVolumeSpecName: "scripts") pod "e5883886-a832-4366-be78-449b4559e8d2" (UID: "e5883886-a832-4366-be78-449b4559e8d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.029940 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-config-data" (OuterVolumeSpecName: "config-data") pod "e5883886-a832-4366-be78-449b4559e8d2" (UID: "e5883886-a832-4366-be78-449b4559e8d2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.034824 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5883886-a832-4366-be78-449b4559e8d2" (UID: "e5883886-a832-4366-be78-449b4559e8d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.039744 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-config" (OuterVolumeSpecName: "config") pod "a2da706f-4332-49a8-9c85-2d90186bd708" (UID: "a2da706f-4332-49a8-9c85-2d90186bd708"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.041450 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2da706f-4332-49a8-9c85-2d90186bd708" (UID: "a2da706f-4332-49a8-9c85-2d90186bd708"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.100599 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5883886-a832-4366-be78-449b4559e8d2-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.100641 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.100656 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.100669 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xntzj\" (UniqueName: \"kubernetes.io/projected/a2da706f-4332-49a8-9c85-2d90186bd708-kube-api-access-xntzj\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.100683 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.100695 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b64t\" (UniqueName: \"kubernetes.io/projected/e5883886-a832-4366-be78-449b4559e8d2-kube-api-access-4b64t\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.100706 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5883886-a832-4366-be78-449b4559e8d2-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.100717 4947 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2da706f-4332-49a8-9c85-2d90186bd708-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.273747 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.303755 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l6nw\" (UniqueName: \"kubernetes.io/projected/135270e8-e939-40bd-b5ae-dcce0b765e82-kube-api-access-5l6nw\") pod \"135270e8-e939-40bd-b5ae-dcce0b765e82\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.303839 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-config-data\") pod \"135270e8-e939-40bd-b5ae-dcce0b765e82\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.303937 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-combined-ca-bundle\") pod \"135270e8-e939-40bd-b5ae-dcce0b765e82\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.303960 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-scripts\") pod \"135270e8-e939-40bd-b5ae-dcce0b765e82\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.303985 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-httpd-run\") pod \"135270e8-e939-40bd-b5ae-dcce0b765e82\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.304016 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"135270e8-e939-40bd-b5ae-dcce0b765e82\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.304074 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-logs\") pod \"135270e8-e939-40bd-b5ae-dcce0b765e82\" (UID: \"135270e8-e939-40bd-b5ae-dcce0b765e82\") " Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.304598 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-logs" (OuterVolumeSpecName: "logs") pod "135270e8-e939-40bd-b5ae-dcce0b765e82" (UID: "135270e8-e939-40bd-b5ae-dcce0b765e82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.305515 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "135270e8-e939-40bd-b5ae-dcce0b765e82" (UID: "135270e8-e939-40bd-b5ae-dcce0b765e82"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.305689 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.305707 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/135270e8-e939-40bd-b5ae-dcce0b765e82-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.309935 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-scripts" (OuterVolumeSpecName: "scripts") pod "135270e8-e939-40bd-b5ae-dcce0b765e82" (UID: "135270e8-e939-40bd-b5ae-dcce0b765e82"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.309974 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/135270e8-e939-40bd-b5ae-dcce0b765e82-kube-api-access-5l6nw" (OuterVolumeSpecName: "kube-api-access-5l6nw") pod "135270e8-e939-40bd-b5ae-dcce0b765e82" (UID: "135270e8-e939-40bd-b5ae-dcce0b765e82"). InnerVolumeSpecName "kube-api-access-5l6nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.310167 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "135270e8-e939-40bd-b5ae-dcce0b765e82" (UID: "135270e8-e939-40bd-b5ae-dcce0b765e82"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.331533 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "135270e8-e939-40bd-b5ae-dcce0b765e82" (UID: "135270e8-e939-40bd-b5ae-dcce0b765e82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.353469 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-config-data" (OuterVolumeSpecName: "config-data") pod "135270e8-e939-40bd-b5ae-dcce0b765e82" (UID: "135270e8-e939-40bd-b5ae-dcce0b765e82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.407286 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.407529 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.407563 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.407655 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l6nw\" (UniqueName: \"kubernetes.io/projected/135270e8-e939-40bd-b5ae-dcce0b765e82-kube-api-access-5l6nw\") on node \"crc\" DevicePath \"\"" Dec 03 
07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.407670 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/135270e8-e939-40bd-b5ae-dcce0b765e82-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.441044 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.509148 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.622343 4947 generic.go:334] "Generic (PLEG): container finished" podID="883ad48f-95eb-48b1-932b-98e145d203ab" containerID="c9a719a484374ac820c11d3eb970965b2a4af8d1426143c047b5501f316ae86a" exitCode=0 Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.622423 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6lcmv" event={"ID":"883ad48f-95eb-48b1-932b-98e145d203ab","Type":"ContainerDied","Data":"c9a719a484374ac820c11d3eb970965b2a4af8d1426143c047b5501f316ae86a"} Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.624979 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gl6zw" event={"ID":"e5883886-a832-4366-be78-449b4559e8d2","Type":"ContainerDied","Data":"1c34eff082ecac97e7577ef5b2515e4828c3c5fd9f384a9f9969429481ca3ad6"} Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.625000 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-gl6zw" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.625010 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c34eff082ecac97e7577ef5b2515e4828c3c5fd9f384a9f9969429481ca3ad6" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.626515 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f251bdf4-6211-48a9-8578-5acd9fb17c15","Type":"ContainerStarted","Data":"e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66"} Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.626537 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f251bdf4-6211-48a9-8578-5acd9fb17c15","Type":"ContainerStarted","Data":"350fe646218587bd19d9b7b59b7e9133ac8857e3b1713daf1e8d6e8c61de1d40"} Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.630277 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-7bxn2" event={"ID":"a2da706f-4332-49a8-9c85-2d90186bd708","Type":"ContainerDied","Data":"9456371e2961855296de158d6690d8ecc7a90b9e73e94e866585c608eb4a30d4"} Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.630310 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9456371e2961855296de158d6690d8ecc7a90b9e73e94e866585c608eb4a30d4" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.630362 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-7bxn2" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.643365 4947 generic.go:334] "Generic (PLEG): container finished" podID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerID="37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa" exitCode=0 Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.643397 4947 generic.go:334] "Generic (PLEG): container finished" podID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerID="25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315" exitCode=143 Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.643421 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"135270e8-e939-40bd-b5ae-dcce0b765e82","Type":"ContainerDied","Data":"37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa"} Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.643448 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"135270e8-e939-40bd-b5ae-dcce0b765e82","Type":"ContainerDied","Data":"25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315"} Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.643460 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"135270e8-e939-40bd-b5ae-dcce0b765e82","Type":"ContainerDied","Data":"2cecc3ce6de4f7a6df308b1fa3212cddec7923687d11894ee074daf7452f2e0f"} Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.643481 4947 scope.go:117] "RemoveContainer" containerID="37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.643640 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.720710 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.733550 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.751560 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-774d6d4878-7x6tj"] Dec 03 07:09:02 crc kubenswrapper[4947]: E1203 07:09:02.752031 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5883886-a832-4366-be78-449b4559e8d2" containerName="placement-db-sync" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.752047 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5883886-a832-4366-be78-449b4559e8d2" containerName="placement-db-sync" Dec 03 07:09:02 crc kubenswrapper[4947]: E1203 07:09:02.752064 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerName="glance-log" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.752072 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerName="glance-log" Dec 03 07:09:02 crc kubenswrapper[4947]: E1203 07:09:02.752090 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2da706f-4332-49a8-9c85-2d90186bd708" containerName="neutron-db-sync" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.752099 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2da706f-4332-49a8-9c85-2d90186bd708" containerName="neutron-db-sync" Dec 03 07:09:02 crc kubenswrapper[4947]: E1203 07:09:02.752135 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerName="glance-httpd" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 
07:09:02.752143 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerName="glance-httpd" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.752355 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2da706f-4332-49a8-9c85-2d90186bd708" containerName="neutron-db-sync" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.752372 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5883886-a832-4366-be78-449b4559e8d2" containerName="placement-db-sync" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.752396 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerName="glance-log" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.752413 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="135270e8-e939-40bd-b5ae-dcce0b765e82" containerName="glance-httpd" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.753649 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.757514 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-774d6d4878-7x6tj"] Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.758235 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.758336 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.758302 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.758528 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-5z2lj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.758703 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.782718 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.784124 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.788166 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.788583 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.800878 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814293 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814377 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-combined-ca-bundle\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814428 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814452 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-logs\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814479 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw9mr\" (UniqueName: \"kubernetes.io/projected/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-kube-api-access-xw9mr\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814525 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814557 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-config-data\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814579 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814656 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzwvr\" 
(UniqueName: \"kubernetes.io/projected/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-kube-api-access-qzwvr\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814732 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814892 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814942 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-scripts\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.814968 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-public-tls-certs\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.815006 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-internal-tls-certs\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.815029 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.860613 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-p27ds"] Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.862155 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.871167 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-p27ds"] Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916235 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-config\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916290 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916316 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2hl\" (UniqueName: \"kubernetes.io/projected/381b76ac-1ce2-4745-8e27-1398b83ee86a-kube-api-access-rr2hl\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916336 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-logs\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916354 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw9mr\" (UniqueName: \"kubernetes.io/projected/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-kube-api-access-xw9mr\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916373 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916395 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-config-data\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916410 4947 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916430 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzwvr\" (UniqueName: \"kubernetes.io/projected/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-kube-api-access-qzwvr\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916458 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916510 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916529 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-scripts\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916547 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-nb\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916564 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-public-tls-certs\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916583 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-internal-tls-certs\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916598 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-swift-storage-0\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916615 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916629 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-sb\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916652 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-svc\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916669 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.916707 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-combined-ca-bundle\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.918188 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-logs\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.918720 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-logs\") pod 
\"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.918799 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.922318 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-internal-tls-certs\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.922332 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-combined-ca-bundle\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.922981 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-config-data\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.923367 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") device mount path \"/mnt/openstack/pv11\"" 
pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.925139 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-public-tls-certs\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.925266 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.925543 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.925568 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.927868 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 
07:09:02.929268 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-scripts\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.934247 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw9mr\" (UniqueName: \"kubernetes.io/projected/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-kube-api-access-xw9mr\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.938838 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzwvr\" (UniqueName: \"kubernetes.io/projected/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-kube-api-access-qzwvr\") pod \"placement-774d6d4878-7x6tj\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.963255 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.996422 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5f849bbbf6-tf4mc"] Dec 03 07:09:02 crc kubenswrapper[4947]: I1203 07:09:02.997942 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.001447 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-cmcq8" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.001888 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.002031 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.006660 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.008785 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f849bbbf6-tf4mc"] Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018714 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-nb\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018759 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-swift-storage-0\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018780 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-sb\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: 
\"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018811 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-httpd-config\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018838 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-svc\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018859 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-combined-ca-bundle\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018886 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbpf8\" (UniqueName: \"kubernetes.io/projected/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-kube-api-access-qbpf8\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018920 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-config\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: 
\"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018936 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-config\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.018962 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr2hl\" (UniqueName: \"kubernetes.io/projected/381b76ac-1ce2-4745-8e27-1398b83ee86a-kube-api-access-rr2hl\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.019034 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-ovndb-tls-certs\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.026582 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-swift-storage-0\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.027386 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-sb\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " 
pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.027428 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-config\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.027932 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-svc\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.030472 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-nb\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.035997 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2hl\" (UniqueName: \"kubernetes.io/projected/381b76ac-1ce2-4745-8e27-1398b83ee86a-kube-api-access-rr2hl\") pod \"dnsmasq-dns-849ff95dc5-p27ds\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.083024 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.094735 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="135270e8-e939-40bd-b5ae-dcce0b765e82" path="/var/lib/kubelet/pods/135270e8-e939-40bd-b5ae-dcce0b765e82/volumes" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.105097 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.120945 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-ovndb-tls-certs\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.121027 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-httpd-config\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.121073 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-combined-ca-bundle\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.121119 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbpf8\" (UniqueName: \"kubernetes.io/projected/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-kube-api-access-qbpf8\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " 
pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.121273 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-config\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.125182 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-config\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.125542 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-httpd-config\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.126602 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-ovndb-tls-certs\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.129353 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-combined-ca-bundle\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.138820 4947 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-qbpf8\" (UniqueName: \"kubernetes.io/projected/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-kube-api-access-qbpf8\") pod \"neutron-5f849bbbf6-tf4mc\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.183321 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:03 crc kubenswrapper[4947]: I1203 07:09:03.327315 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.157045 4947 scope.go:117] "RemoveContainer" containerID="25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.204313 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.265300 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm6hn\" (UniqueName: \"kubernetes.io/projected/883ad48f-95eb-48b1-932b-98e145d203ab-kube-api-access-rm6hn\") pod \"883ad48f-95eb-48b1-932b-98e145d203ab\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.265388 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-credential-keys\") pod \"883ad48f-95eb-48b1-932b-98e145d203ab\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.265436 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-fernet-keys\") pod 
\"883ad48f-95eb-48b1-932b-98e145d203ab\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.265471 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-scripts\") pod \"883ad48f-95eb-48b1-932b-98e145d203ab\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.265509 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-config-data\") pod \"883ad48f-95eb-48b1-932b-98e145d203ab\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.265534 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-combined-ca-bundle\") pod \"883ad48f-95eb-48b1-932b-98e145d203ab\" (UID: \"883ad48f-95eb-48b1-932b-98e145d203ab\") " Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.270610 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "883ad48f-95eb-48b1-932b-98e145d203ab" (UID: "883ad48f-95eb-48b1-932b-98e145d203ab"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.270940 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/883ad48f-95eb-48b1-932b-98e145d203ab-kube-api-access-rm6hn" (OuterVolumeSpecName: "kube-api-access-rm6hn") pod "883ad48f-95eb-48b1-932b-98e145d203ab" (UID: "883ad48f-95eb-48b1-932b-98e145d203ab"). InnerVolumeSpecName "kube-api-access-rm6hn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.270957 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-scripts" (OuterVolumeSpecName: "scripts") pod "883ad48f-95eb-48b1-932b-98e145d203ab" (UID: "883ad48f-95eb-48b1-932b-98e145d203ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.288097 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "883ad48f-95eb-48b1-932b-98e145d203ab" (UID: "883ad48f-95eb-48b1-932b-98e145d203ab"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.289314 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-config-data" (OuterVolumeSpecName: "config-data") pod "883ad48f-95eb-48b1-932b-98e145d203ab" (UID: "883ad48f-95eb-48b1-932b-98e145d203ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.292197 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "883ad48f-95eb-48b1-932b-98e145d203ab" (UID: "883ad48f-95eb-48b1-932b-98e145d203ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.368255 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm6hn\" (UniqueName: \"kubernetes.io/projected/883ad48f-95eb-48b1-932b-98e145d203ab-kube-api-access-rm6hn\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.368290 4947 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.368304 4947 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.368316 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.368329 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.368348 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/883ad48f-95eb-48b1-932b-98e145d203ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.685034 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6lcmv" event={"ID":"883ad48f-95eb-48b1-932b-98e145d203ab","Type":"ContainerDied","Data":"4e7b36f28f211dad441ba6d0b5fa3fb24f2de355006e59ebb9b6a75032f9239f"} Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 
07:09:04.685322 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e7b36f28f211dad441ba6d0b5fa3fb24f2de355006e59ebb9b6a75032f9239f" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.685093 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6lcmv" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.779065 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c779fbf97-r9gnn"] Dec 03 07:09:04 crc kubenswrapper[4947]: E1203 07:09:04.779531 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="883ad48f-95eb-48b1-932b-98e145d203ab" containerName="keystone-bootstrap" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.779552 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="883ad48f-95eb-48b1-932b-98e145d203ab" containerName="keystone-bootstrap" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.779762 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="883ad48f-95eb-48b1-932b-98e145d203ab" containerName="keystone-bootstrap" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.780506 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.785579 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.786002 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-r75nw" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.786786 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.787787 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.789119 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.789937 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.805226 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c779fbf97-r9gnn"] Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.877622 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-scripts\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.877674 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-credential-keys\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " 
pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.877709 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-config-data\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.877863 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-combined-ca-bundle\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.877915 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-public-tls-certs\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.878267 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwrkw\" (UniqueName: \"kubernetes.io/projected/acfeae50-8757-4baf-a16a-c33ae100fdf2-kube-api-access-gwrkw\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.878322 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-fernet-keys\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") 
" pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.878347 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-internal-tls-certs\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.979909 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-fernet-keys\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.979950 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-internal-tls-certs\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.980010 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-scripts\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.980029 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-credential-keys\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 
07:09:04.980050 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-config-data\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.980076 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-combined-ca-bundle\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.980094 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-public-tls-certs\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.980164 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwrkw\" (UniqueName: \"kubernetes.io/projected/acfeae50-8757-4baf-a16a-c33ae100fdf2-kube-api-access-gwrkw\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.983790 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-scripts\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.984144 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-combined-ca-bundle\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.984236 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-fernet-keys\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.984672 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-credential-keys\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.985520 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-internal-tls-certs\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:04 crc kubenswrapper[4947]: I1203 07:09:04.989764 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-public-tls-certs\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.003180 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwrkw\" (UniqueName: \"kubernetes.io/projected/acfeae50-8757-4baf-a16a-c33ae100fdf2-kube-api-access-gwrkw\") pod 
\"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.003437 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-config-data\") pod \"keystone-c779fbf97-r9gnn\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.103043 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.271903 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7dd7c69-hhnlg"] Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.273239 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.275389 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.275717 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.282873 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dd7c69-hhnlg"] Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.390209 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-public-tls-certs\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.390253 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-combined-ca-bundle\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.390283 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-config\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.390321 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-internal-tls-certs\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.390396 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-httpd-config\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.390434 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfnq2\" (UniqueName: \"kubernetes.io/projected/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-kube-api-access-xfnq2\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.390471 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-ovndb-tls-certs\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.493042 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-public-tls-certs\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.493104 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-combined-ca-bundle\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.493137 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-config\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.493185 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-internal-tls-certs\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.493227 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-httpd-config\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.493275 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfnq2\" (UniqueName: \"kubernetes.io/projected/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-kube-api-access-xfnq2\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.493316 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-ovndb-tls-certs\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.501291 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-public-tls-certs\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.502106 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-ovndb-tls-certs\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.513084 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-httpd-config\") pod \"neutron-7dd7c69-hhnlg\" (UID: 
\"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.513168 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-config\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.513612 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-combined-ca-bundle\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.514078 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-internal-tls-certs\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.517058 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfnq2\" (UniqueName: \"kubernetes.io/projected/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-kube-api-access-xfnq2\") pod \"neutron-7dd7c69-hhnlg\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:05 crc kubenswrapper[4947]: I1203 07:09:05.599280 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.262361 4947 scope.go:117] "RemoveContainer" containerID="37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa" Dec 03 07:09:06 crc kubenswrapper[4947]: E1203 07:09:06.263138 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa\": container with ID starting with 37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa not found: ID does not exist" containerID="37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa" Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.263166 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa"} err="failed to get container status \"37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa\": rpc error: code = NotFound desc = could not find container \"37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa\": container with ID starting with 37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa not found: ID does not exist" Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.263184 4947 scope.go:117] "RemoveContainer" containerID="25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315" Dec 03 07:09:06 crc kubenswrapper[4947]: E1203 07:09:06.263563 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315\": container with ID starting with 25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315 not found: ID does not exist" containerID="25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315" Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 
07:09:06.263589 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315"} err="failed to get container status \"25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315\": rpc error: code = NotFound desc = could not find container \"25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315\": container with ID starting with 25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315 not found: ID does not exist" Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.263605 4947 scope.go:117] "RemoveContainer" containerID="37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa" Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.263839 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa"} err="failed to get container status \"37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa\": rpc error: code = NotFound desc = could not find container \"37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa\": container with ID starting with 37a61c3b48b6f17b470b5bf98b88cccb67b367cdaa13aa27a2bc8ff1e8895ffa not found: ID does not exist" Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.263852 4947 scope.go:117] "RemoveContainer" containerID="25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315" Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.264275 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315"} err="failed to get container status \"25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315\": rpc error: code = NotFound desc = could not find container \"25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315\": container with ID starting with 
25b8ac5d33c5f9cf80d600bea126b30f3d31f227386e3b6152be24a8c840d315 not found: ID does not exist" Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.708643 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerStarted","Data":"b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac"} Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.770856 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:06 crc kubenswrapper[4947]: W1203 07:09:06.788178 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ebe993b_68e0_4d68_8aa7_7895d7b6629a.slice/crio-ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0 WatchSource:0}: Error finding container ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0: Status 404 returned error can't find the container with id ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0 Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.858261 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-774d6d4878-7x6tj"] Dec 03 07:09:06 crc kubenswrapper[4947]: I1203 07:09:06.870613 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-p27ds"] Dec 03 07:09:06 crc kubenswrapper[4947]: W1203 07:09:06.875549 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod381b76ac_1ce2_4745_8e27_1398b83ee86a.slice/crio-cb15f5f6a2656eff041a9539c65bab6633cd94f7f5330074b88fe20c1ab9a4f7 WatchSource:0}: Error finding container cb15f5f6a2656eff041a9539c65bab6633cd94f7f5330074b88fe20c1ab9a4f7: Status 404 returned error can't find the container with id cb15f5f6a2656eff041a9539c65bab6633cd94f7f5330074b88fe20c1ab9a4f7 Dec 03 07:09:06 crc 
kubenswrapper[4947]: I1203 07:09:06.990448 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5f849bbbf6-tf4mc"] Dec 03 07:09:07 crc kubenswrapper[4947]: I1203 07:09:07.044616 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c779fbf97-r9gnn"] Dec 03 07:09:07 crc kubenswrapper[4947]: W1203 07:09:07.050800 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacfeae50_8757_4baf_a16a_c33ae100fdf2.slice/crio-3fe9364c45274343b67fd57c7b371f66c669a4dc9cf7d36ea06a92221b7427f3 WatchSource:0}: Error finding container 3fe9364c45274343b67fd57c7b371f66c669a4dc9cf7d36ea06a92221b7427f3: Status 404 returned error can't find the container with id 3fe9364c45274343b67fd57c7b371f66c669a4dc9cf7d36ea06a92221b7427f3 Dec 03 07:09:07 crc kubenswrapper[4947]: I1203 07:09:07.127938 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7dd7c69-hhnlg"] Dec 03 07:09:07 crc kubenswrapper[4947]: W1203 07:09:07.138831 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccb28a2e_a946_4407_be07_6ac8eaad8ab1.slice/crio-1f84d80c8ff24b901e58b30eec441c6cdb8f1da326d393aabcc8fd0cdc22457e WatchSource:0}: Error finding container 1f84d80c8ff24b901e58b30eec441c6cdb8f1da326d393aabcc8fd0cdc22457e: Status 404 returned error can't find the container with id 1f84d80c8ff24b901e58b30eec441c6cdb8f1da326d393aabcc8fd0cdc22457e Dec 03 07:09:07 crc kubenswrapper[4947]: I1203 07:09:07.721048 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dd7c69-hhnlg" event={"ID":"ccb28a2e-a946-4407-be07-6ac8eaad8ab1","Type":"ContainerStarted","Data":"1f84d80c8ff24b901e58b30eec441c6cdb8f1da326d393aabcc8fd0cdc22457e"} Dec 03 07:09:07 crc kubenswrapper[4947]: I1203 07:09:07.723464 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-c779fbf97-r9gnn" event={"ID":"acfeae50-8757-4baf-a16a-c33ae100fdf2","Type":"ContainerStarted","Data":"3fe9364c45274343b67fd57c7b371f66c669a4dc9cf7d36ea06a92221b7427f3"} Dec 03 07:09:07 crc kubenswrapper[4947]: I1203 07:09:07.725654 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" event={"ID":"381b76ac-1ce2-4745-8e27-1398b83ee86a","Type":"ContainerStarted","Data":"cb15f5f6a2656eff041a9539c65bab6633cd94f7f5330074b88fe20c1ab9a4f7"} Dec 03 07:09:07 crc kubenswrapper[4947]: I1203 07:09:07.727028 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774d6d4878-7x6tj" event={"ID":"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998","Type":"ContainerStarted","Data":"745d4286ac2a70e6f05d7d5b44c411083a41f923a1acaec569d1c69ef6935d05"} Dec 03 07:09:07 crc kubenswrapper[4947]: I1203 07:09:07.728658 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ebe993b-68e0-4d68-8aa7-7895d7b6629a","Type":"ContainerStarted","Data":"ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0"} Dec 03 07:09:07 crc kubenswrapper[4947]: I1203 07:09:07.730153 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f849bbbf6-tf4mc" event={"ID":"d2e58a7d-365d-4296-b18f-5d1bfe90b61f","Type":"ContainerStarted","Data":"a4aec8de6845f3239f5c93499c3d652accbe96dd8e72975e08a2818495982e3a"} Dec 03 07:09:08 crc kubenswrapper[4947]: I1203 07:09:08.747739 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f251bdf4-6211-48a9-8578-5acd9fb17c15","Type":"ContainerStarted","Data":"ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd"} Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.760900 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dd7c69-hhnlg" 
event={"ID":"ccb28a2e-a946-4407-be07-6ac8eaad8ab1","Type":"ContainerStarted","Data":"4f26987218545ce4b3ac012485238ac52ce1b620c3025dd4686d6e203a028593"} Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.764603 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c779fbf97-r9gnn" event={"ID":"acfeae50-8757-4baf-a16a-c33ae100fdf2","Type":"ContainerStarted","Data":"36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668"} Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.765341 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.768646 4947 generic.go:334] "Generic (PLEG): container finished" podID="381b76ac-1ce2-4745-8e27-1398b83ee86a" containerID="c64ef254c2f2db085d3154cdb62f028dac0d5035511cd98cbc2fa5c0f19f09f9" exitCode=0 Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.768707 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" event={"ID":"381b76ac-1ce2-4745-8e27-1398b83ee86a","Type":"ContainerDied","Data":"c64ef254c2f2db085d3154cdb62f028dac0d5035511cd98cbc2fa5c0f19f09f9"} Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.776453 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774d6d4878-7x6tj" event={"ID":"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998","Type":"ContainerStarted","Data":"845cd1a32bd6e9532c9423654e785193c73bf6e3b1aea2e7e18eadd318301cec"} Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.780715 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ebe993b-68e0-4d68-8aa7-7895d7b6629a","Type":"ContainerStarted","Data":"d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649"} Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.793863 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/keystone-c779fbf97-r9gnn" podStartSLOduration=5.793841084 podStartE2EDuration="5.793841084s" podCreationTimestamp="2025-12-03 07:09:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:09.784969295 +0000 UTC m=+1211.045923751" watchObservedRunningTime="2025-12-03 07:09:09.793841084 +0000 UTC m=+1211.054795530" Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.805636 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f849bbbf6-tf4mc" event={"ID":"d2e58a7d-365d-4296-b18f-5d1bfe90b61f","Type":"ContainerStarted","Data":"b5da53d34c11626bb7af6025e1d48b7788af57875c4c0af890dd758ea0e40aca"} Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.805923 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f849bbbf6-tf4mc" event={"ID":"d2e58a7d-365d-4296-b18f-5d1bfe90b61f","Type":"ContainerStarted","Data":"4e1056f9dc24c16b8391f6730abe97c8c00219da69df68bd522af5ece0d4a512"} Dec 03 07:09:09 crc kubenswrapper[4947]: I1203 07:09:09.835583 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.835562439 podStartE2EDuration="9.835562439s" podCreationTimestamp="2025-12-03 07:09:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:09.831070718 +0000 UTC m=+1211.092025144" watchObservedRunningTime="2025-12-03 07:09:09.835562439 +0000 UTC m=+1211.096516865" Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.814696 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774d6d4878-7x6tj" event={"ID":"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998","Type":"ContainerStarted","Data":"ce7f8de092c5631abd0150a07a948c24b407959e654aa1d9ce964ed828bb05fa"} Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.815014 
4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.818870 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ebe993b-68e0-4d68-8aa7-7895d7b6629a","Type":"ContainerStarted","Data":"7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5"} Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.823698 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dd7c69-hhnlg" event={"ID":"ccb28a2e-a946-4407-be07-6ac8eaad8ab1","Type":"ContainerStarted","Data":"d1c453acbdb699e13cb35f456f04fa20cb22c76c9d9304778639baf320f5cf98"} Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.824318 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.831309 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" event={"ID":"381b76ac-1ce2-4745-8e27-1398b83ee86a","Type":"ContainerStarted","Data":"40b240e4fbf74a9ff20f6c10c9d36360e68597fec6bdc52297632f6094c8a1e0"} Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.831873 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.831998 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.849224 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-774d6d4878-7x6tj" podStartSLOduration=8.849197286999999 podStartE2EDuration="8.849197287s" podCreationTimestamp="2025-12-03 07:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 07:09:10.835807266 +0000 UTC m=+1212.096761692" watchObservedRunningTime="2025-12-03 07:09:10.849197287 +0000 UTC m=+1212.110151713" Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.865470 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5f849bbbf6-tf4mc" podStartSLOduration=8.865448676 podStartE2EDuration="8.865448676s" podCreationTimestamp="2025-12-03 07:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:10.850440191 +0000 UTC m=+1212.111394627" watchObservedRunningTime="2025-12-03 07:09:10.865448676 +0000 UTC m=+1212.126403102" Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.886775 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.88675259 podStartE2EDuration="8.88675259s" podCreationTimestamp="2025-12-03 07:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:10.871020226 +0000 UTC m=+1212.131974662" watchObservedRunningTime="2025-12-03 07:09:10.88675259 +0000 UTC m=+1212.147707016" Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.895836 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" podStartSLOduration=8.895818714 podStartE2EDuration="8.895818714s" podCreationTimestamp="2025-12-03 07:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:10.888020394 +0000 UTC m=+1212.148974820" watchObservedRunningTime="2025-12-03 07:09:10.895818714 +0000 UTC m=+1212.156773140" Dec 03 07:09:10 crc kubenswrapper[4947]: I1203 07:09:10.917425 4947 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/neutron-7dd7c69-hhnlg" podStartSLOduration=5.917404756 podStartE2EDuration="5.917404756s" podCreationTimestamp="2025-12-03 07:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:10.907447568 +0000 UTC m=+1212.168402004" watchObservedRunningTime="2025-12-03 07:09:10.917404756 +0000 UTC m=+1212.178359192" Dec 03 07:09:11 crc kubenswrapper[4947]: I1203 07:09:11.290070 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 07:09:11 crc kubenswrapper[4947]: I1203 07:09:11.290119 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 07:09:11 crc kubenswrapper[4947]: I1203 07:09:11.356151 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 07:09:11 crc kubenswrapper[4947]: I1203 07:09:11.365997 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 07:09:11 crc kubenswrapper[4947]: I1203 07:09:11.838837 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:11 crc kubenswrapper[4947]: I1203 07:09:11.838874 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 07:09:11 crc kubenswrapper[4947]: I1203 07:09:11.839410 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 07:09:13 crc kubenswrapper[4947]: I1203 07:09:13.106521 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:13 crc kubenswrapper[4947]: I1203 07:09:13.106578 4947 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:13 crc kubenswrapper[4947]: I1203 07:09:13.146121 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:13 crc kubenswrapper[4947]: I1203 07:09:13.149770 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:13 crc kubenswrapper[4947]: I1203 07:09:13.794991 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 07:09:13 crc kubenswrapper[4947]: I1203 07:09:13.866869 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:13 crc kubenswrapper[4947]: I1203 07:09:13.866914 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:15 crc kubenswrapper[4947]: I1203 07:09:15.665591 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 07:09:15 crc kubenswrapper[4947]: I1203 07:09:15.744277 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:16 crc kubenswrapper[4947]: I1203 07:09:16.910154 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerStarted","Data":"3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817"} Dec 03 07:09:16 crc kubenswrapper[4947]: I1203 07:09:16.910455 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="ceilometer-notification-agent" containerID="cri-o://006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0" 
gracePeriod=30 Dec 03 07:09:16 crc kubenswrapper[4947]: I1203 07:09:16.910411 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="proxy-httpd" containerID="cri-o://3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817" gracePeriod=30 Dec 03 07:09:16 crc kubenswrapper[4947]: I1203 07:09:16.910434 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="sg-core" containerID="cri-o://b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac" gracePeriod=30 Dec 03 07:09:16 crc kubenswrapper[4947]: I1203 07:09:16.910344 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="ceilometer-central-agent" containerID="cri-o://5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf" gracePeriod=30 Dec 03 07:09:16 crc kubenswrapper[4947]: I1203 07:09:16.910767 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:09:16 crc kubenswrapper[4947]: I1203 07:09:16.915417 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nv68l" event={"ID":"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c","Type":"ContainerStarted","Data":"87dd90cc8a08b7bf0637010541350c2b593540197fa2c3a2b4f64f7953ae225f"} Dec 03 07:09:16 crc kubenswrapper[4947]: I1203 07:09:16.956137 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.315920545 podStartE2EDuration="41.956120094s" podCreationTimestamp="2025-12-03 07:08:35 +0000 UTC" firstStartedPulling="2025-12-03 07:08:36.890121146 +0000 UTC m=+1178.151075572" lastFinishedPulling="2025-12-03 07:09:16.530320695 +0000 UTC m=+1217.791275121" 
observedRunningTime="2025-12-03 07:09:16.934027608 +0000 UTC m=+1218.194982054" watchObservedRunningTime="2025-12-03 07:09:16.956120094 +0000 UTC m=+1218.217074530" Dec 03 07:09:16 crc kubenswrapper[4947]: I1203 07:09:16.961895 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-nv68l" podStartSLOduration=2.587397065 podStartE2EDuration="41.961877789s" podCreationTimestamp="2025-12-03 07:08:35 +0000 UTC" firstStartedPulling="2025-12-03 07:08:37.142588213 +0000 UTC m=+1178.403542639" lastFinishedPulling="2025-12-03 07:09:16.517068937 +0000 UTC m=+1217.778023363" observedRunningTime="2025-12-03 07:09:16.959145326 +0000 UTC m=+1218.220099762" watchObservedRunningTime="2025-12-03 07:09:16.961877789 +0000 UTC m=+1218.222832215" Dec 03 07:09:17 crc kubenswrapper[4947]: I1203 07:09:17.857480 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:17 crc kubenswrapper[4947]: I1203 07:09:17.936783 4947 generic.go:334] "Generic (PLEG): container finished" podID="49777040-6a13-4610-a79c-6bc76d73212e" containerID="3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817" exitCode=0 Dec 03 07:09:17 crc kubenswrapper[4947]: I1203 07:09:17.936823 4947 generic.go:334] "Generic (PLEG): container finished" podID="49777040-6a13-4610-a79c-6bc76d73212e" containerID="b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac" exitCode=2 Dec 03 07:09:17 crc kubenswrapper[4947]: I1203 07:09:17.936833 4947 generic.go:334] "Generic (PLEG): container finished" podID="49777040-6a13-4610-a79c-6bc76d73212e" containerID="5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf" exitCode=0 Dec 03 07:09:17 crc kubenswrapper[4947]: I1203 07:09:17.936860 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerDied","Data":"3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817"} Dec 03 07:09:17 crc kubenswrapper[4947]: I1203 07:09:17.936914 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerDied","Data":"b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac"} Dec 03 07:09:17 crc kubenswrapper[4947]: I1203 07:09:17.936933 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerDied","Data":"5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf"} Dec 03 07:09:17 crc kubenswrapper[4947]: I1203 07:09:17.938699 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6bhnb" event={"ID":"af29daa3-e143-4ea0-bfe0-284fd103f8b3","Type":"ContainerStarted","Data":"d890527d7be95a78ab0069a1cc568b31239f334eaea64b0a493969424e024807"} Dec 03 07:09:17 crc kubenswrapper[4947]: I1203 07:09:17.962354 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6bhnb" podStartSLOduration=3.221290245 podStartE2EDuration="42.962334242s" podCreationTimestamp="2025-12-03 07:08:35 +0000 UTC" firstStartedPulling="2025-12-03 07:08:36.778397174 +0000 UTC m=+1178.039351600" lastFinishedPulling="2025-12-03 07:09:16.519441171 +0000 UTC m=+1217.780395597" observedRunningTime="2025-12-03 07:09:17.95630292 +0000 UTC m=+1219.217257356" watchObservedRunningTime="2025-12-03 07:09:17.962334242 +0000 UTC m=+1219.223288668" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.184826 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.279168 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-74fd8b655f-4vsbx"] Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.279438 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" podUID="f44a9445-7982-4d53-aadc-02677a421f34" containerName="dnsmasq-dns" containerID="cri-o://13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296" gracePeriod=10 Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.732501 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.804674 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-config\") pod \"f44a9445-7982-4d53-aadc-02677a421f34\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.804745 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsxjs\" (UniqueName: \"kubernetes.io/projected/f44a9445-7982-4d53-aadc-02677a421f34-kube-api-access-dsxjs\") pod \"f44a9445-7982-4d53-aadc-02677a421f34\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.804780 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-swift-storage-0\") pod \"f44a9445-7982-4d53-aadc-02677a421f34\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.804853 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-sb\") pod \"f44a9445-7982-4d53-aadc-02677a421f34\" (UID: 
\"f44a9445-7982-4d53-aadc-02677a421f34\") " Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.804922 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-nb\") pod \"f44a9445-7982-4d53-aadc-02677a421f34\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.804946 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-svc\") pod \"f44a9445-7982-4d53-aadc-02677a421f34\" (UID: \"f44a9445-7982-4d53-aadc-02677a421f34\") " Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.817691 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f44a9445-7982-4d53-aadc-02677a421f34-kube-api-access-dsxjs" (OuterVolumeSpecName: "kube-api-access-dsxjs") pod "f44a9445-7982-4d53-aadc-02677a421f34" (UID: "f44a9445-7982-4d53-aadc-02677a421f34"). InnerVolumeSpecName "kube-api-access-dsxjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.855644 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f44a9445-7982-4d53-aadc-02677a421f34" (UID: "f44a9445-7982-4d53-aadc-02677a421f34"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.857019 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f44a9445-7982-4d53-aadc-02677a421f34" (UID: "f44a9445-7982-4d53-aadc-02677a421f34"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.866546 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f44a9445-7982-4d53-aadc-02677a421f34" (UID: "f44a9445-7982-4d53-aadc-02677a421f34"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.875745 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f44a9445-7982-4d53-aadc-02677a421f34" (UID: "f44a9445-7982-4d53-aadc-02677a421f34"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.882940 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-config" (OuterVolumeSpecName: "config") pod "f44a9445-7982-4d53-aadc-02677a421f34" (UID: "f44a9445-7982-4d53-aadc-02677a421f34"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.906389 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.906424 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.906436 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.906448 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.906461 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsxjs\" (UniqueName: \"kubernetes.io/projected/f44a9445-7982-4d53-aadc-02677a421f34-kube-api-access-dsxjs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.906475 4947 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f44a9445-7982-4d53-aadc-02677a421f34-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.947582 4947 generic.go:334] "Generic (PLEG): container finished" podID="f44a9445-7982-4d53-aadc-02677a421f34" containerID="13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296" exitCode=0 Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.947626 4947 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" event={"ID":"f44a9445-7982-4d53-aadc-02677a421f34","Type":"ContainerDied","Data":"13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296"} Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.947652 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" event={"ID":"f44a9445-7982-4d53-aadc-02677a421f34","Type":"ContainerDied","Data":"90b4ab76d2e135e5edbf6f28a269210393b57168fabe335ab61446fad866d698"} Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.947649 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74fd8b655f-4vsbx" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.947668 4947 scope.go:117] "RemoveContainer" containerID="13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.980234 4947 scope.go:117] "RemoveContainer" containerID="5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7" Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.982683 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-4vsbx"] Dec 03 07:09:18 crc kubenswrapper[4947]: I1203 07:09:18.991160 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74fd8b655f-4vsbx"] Dec 03 07:09:19 crc kubenswrapper[4947]: I1203 07:09:19.006335 4947 scope.go:117] "RemoveContainer" containerID="13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296" Dec 03 07:09:19 crc kubenswrapper[4947]: E1203 07:09:19.007663 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296\": container with ID starting with 13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296 not found: ID does not exist" 
containerID="13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296"
Dec 03 07:09:19 crc kubenswrapper[4947]: I1203 07:09:19.007693 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296"} err="failed to get container status \"13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296\": rpc error: code = NotFound desc = could not find container \"13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296\": container with ID starting with 13d5cceeb083c0a368fc683e65524466564ea34b1723cbd175b245b25c702296 not found: ID does not exist"
Dec 03 07:09:19 crc kubenswrapper[4947]: I1203 07:09:19.007715 4947 scope.go:117] "RemoveContainer" containerID="5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7"
Dec 03 07:09:19 crc kubenswrapper[4947]: E1203 07:09:19.008075 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7\": container with ID starting with 5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7 not found: ID does not exist" containerID="5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7"
Dec 03 07:09:19 crc kubenswrapper[4947]: I1203 07:09:19.008119 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7"} err="failed to get container status \"5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7\": rpc error: code = NotFound desc = could not find container \"5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7\": container with ID starting with 5c301b986d98fe8bb750064a1f7c325a3fad41154601504bfc8546352ee968f7 not found: ID does not exist"
Dec 03 07:09:19 crc kubenswrapper[4947]: I1203 07:09:19.093933 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f44a9445-7982-4d53-aadc-02677a421f34" path="/var/lib/kubelet/pods/f44a9445-7982-4d53-aadc-02677a421f34/volumes"
Dec 03 07:09:19 crc kubenswrapper[4947]: I1203 07:09:19.960529 4947 generic.go:334] "Generic (PLEG): container finished" podID="5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" containerID="87dd90cc8a08b7bf0637010541350c2b593540197fa2c3a2b4f64f7953ae225f" exitCode=0
Dec 03 07:09:19 crc kubenswrapper[4947]: I1203 07:09:19.960612 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nv68l" event={"ID":"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c","Type":"ContainerDied","Data":"87dd90cc8a08b7bf0637010541350c2b593540197fa2c3a2b4f64f7953ae225f"}
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.372808 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.533814 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-sg-core-conf-yaml\") pod \"49777040-6a13-4610-a79c-6bc76d73212e\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") "
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.534376 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-scripts\") pod \"49777040-6a13-4610-a79c-6bc76d73212e\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") "
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.534455 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-run-httpd\") pod \"49777040-6a13-4610-a79c-6bc76d73212e\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") "
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.534569 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-config-data\") pod \"49777040-6a13-4610-a79c-6bc76d73212e\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") "
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.534667 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-log-httpd\") pod \"49777040-6a13-4610-a79c-6bc76d73212e\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") "
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.534695 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-combined-ca-bundle\") pod \"49777040-6a13-4610-a79c-6bc76d73212e\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") "
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.534883 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbsbn\" (UniqueName: \"kubernetes.io/projected/49777040-6a13-4610-a79c-6bc76d73212e-kube-api-access-lbsbn\") pod \"49777040-6a13-4610-a79c-6bc76d73212e\" (UID: \"49777040-6a13-4610-a79c-6bc76d73212e\") "
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.534959 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "49777040-6a13-4610-a79c-6bc76d73212e" (UID: "49777040-6a13-4610-a79c-6bc76d73212e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.535564 4947 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.535734 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "49777040-6a13-4610-a79c-6bc76d73212e" (UID: "49777040-6a13-4610-a79c-6bc76d73212e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.549934 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-scripts" (OuterVolumeSpecName: "scripts") pod "49777040-6a13-4610-a79c-6bc76d73212e" (UID: "49777040-6a13-4610-a79c-6bc76d73212e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.549991 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49777040-6a13-4610-a79c-6bc76d73212e-kube-api-access-lbsbn" (OuterVolumeSpecName: "kube-api-access-lbsbn") pod "49777040-6a13-4610-a79c-6bc76d73212e" (UID: "49777040-6a13-4610-a79c-6bc76d73212e"). InnerVolumeSpecName "kube-api-access-lbsbn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.571164 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "49777040-6a13-4610-a79c-6bc76d73212e" (UID: "49777040-6a13-4610-a79c-6bc76d73212e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.631837 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49777040-6a13-4610-a79c-6bc76d73212e" (UID: "49777040-6a13-4610-a79c-6bc76d73212e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.637891 4947 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/49777040-6a13-4610-a79c-6bc76d73212e-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.637930 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.637944 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbsbn\" (UniqueName: \"kubernetes.io/projected/49777040-6a13-4610-a79c-6bc76d73212e-kube-api-access-lbsbn\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.637956 4947 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.637967 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.645038 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-config-data" (OuterVolumeSpecName: "config-data") pod "49777040-6a13-4610-a79c-6bc76d73212e" (UID: "49777040-6a13-4610-a79c-6bc76d73212e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.739485 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49777040-6a13-4610-a79c-6bc76d73212e-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.978525 4947 generic.go:334] "Generic (PLEG): container finished" podID="49777040-6a13-4610-a79c-6bc76d73212e" containerID="006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0" exitCode=0
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.978745 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.979045 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerDied","Data":"006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0"}
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.979104 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"49777040-6a13-4610-a79c-6bc76d73212e","Type":"ContainerDied","Data":"2979b22cc72ca2acc4aaa550f995388569d6f735dbc490ac2820f45d4ed4d67b"}
Dec 03 07:09:20 crc kubenswrapper[4947]: I1203 07:09:20.979133 4947 scope.go:117] "RemoveContainer" containerID="3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.026603 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.031006 4947 scope.go:117] "RemoveContainer" containerID="b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.053684 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.067542 4947 scope.go:117] "RemoveContainer" containerID="006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.068919 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.069464 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="ceilometer-notification-agent"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.069518 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="ceilometer-notification-agent"
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.069571 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="ceilometer-central-agent"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.069586 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="ceilometer-central-agent"
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.069614 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44a9445-7982-4d53-aadc-02677a421f34" containerName="init"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.069628 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44a9445-7982-4d53-aadc-02677a421f34" containerName="init"
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.069690 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="proxy-httpd"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.069705 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="proxy-httpd"
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.069726 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f44a9445-7982-4d53-aadc-02677a421f34" containerName="dnsmasq-dns"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.069738 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f44a9445-7982-4d53-aadc-02677a421f34" containerName="dnsmasq-dns"
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.069765 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="sg-core"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.069777 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="sg-core"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.070063 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="proxy-httpd"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.070113 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="ceilometer-notification-agent"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.070135 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="ceilometer-central-agent"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.070162 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f44a9445-7982-4d53-aadc-02677a421f34" containerName="dnsmasq-dns"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.070177 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="49777040-6a13-4610-a79c-6bc76d73212e" containerName="sg-core"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.073311 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.077231 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.078535 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.079906 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.115646 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49777040-6a13-4610-a79c-6bc76d73212e" path="/var/lib/kubelet/pods/49777040-6a13-4610-a79c-6bc76d73212e/volumes"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.134384 4947 scope.go:117] "RemoveContainer" containerID="5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.147438 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-config-data\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.147477 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-run-httpd\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.147512 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.147564 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-scripts\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.147587 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-log-httpd\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.147616 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.147631 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgcd\" (UniqueName: \"kubernetes.io/projected/79f30107-717d-402f-b501-8911cede9cc1-kube-api-access-ddgcd\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.167821 4947 scope.go:117] "RemoveContainer" containerID="3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817"
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.168242 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817\": container with ID starting with 3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817 not found: ID does not exist" containerID="3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.168282 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817"} err="failed to get container status \"3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817\": rpc error: code = NotFound desc = could not find container \"3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817\": container with ID starting with 3622952c0d3c5869eaca3c3975f6263e7918784b4aa1bf6edb03f50379104817 not found: ID does not exist"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.168313 4947 scope.go:117] "RemoveContainer" containerID="b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac"
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.169187 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac\": container with ID starting with b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac not found: ID does not exist" containerID="b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.169232 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac"} err="failed to get container status \"b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac\": rpc error: code = NotFound desc = could not find container \"b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac\": container with ID starting with b119918308b9d42ec42ec890149a800784aa71c350b7f14abeab51c9801e58ac not found: ID does not exist"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.169258 4947 scope.go:117] "RemoveContainer" containerID="006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0"
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.169579 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0\": container with ID starting with 006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0 not found: ID does not exist" containerID="006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.169651 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0"} err="failed to get container status \"006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0\": rpc error: code = NotFound desc = could not find container \"006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0\": container with ID starting with 006949d1f57355ceaf7778805bb8d8a92436c8a6ddc791cb68e2623b4694a8d0 not found: ID does not exist"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.169668 4947 scope.go:117] "RemoveContainer" containerID="5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf"
Dec 03 07:09:21 crc kubenswrapper[4947]: E1203 07:09:21.169922 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf\": container with ID starting with 5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf not found: ID does not exist" containerID="5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.169950 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf"} err="failed to get container status \"5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf\": rpc error: code = NotFound desc = could not find container \"5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf\": container with ID starting with 5ee9d6e696e18d98a587678503f266198e00644707799900f8303a0136c8f6bf not found: ID does not exist"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.252997 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-config-data\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.253302 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-run-httpd\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.253345 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.253427 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-scripts\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.253477 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-log-httpd\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.253570 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.253598 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgcd\" (UniqueName: \"kubernetes.io/projected/79f30107-717d-402f-b501-8911cede9cc1-kube-api-access-ddgcd\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.255236 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-run-httpd\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.259140 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.259308 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-config-data\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.262461 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-log-httpd\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.262775 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-scripts\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.275380 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.276069 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgcd\" (UniqueName: \"kubernetes.io/projected/79f30107-717d-402f-b501-8911cede9cc1-kube-api-access-ddgcd\") pod \"ceilometer-0\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.374787 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nv68l"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.437410 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.456279 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-combined-ca-bundle\") pod \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") "
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.456371 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-db-sync-config-data\") pod \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") "
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.456606 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bqcd\" (UniqueName: \"kubernetes.io/projected/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-kube-api-access-9bqcd\") pod \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\" (UID: \"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c\") "
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.461601 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" (UID: "5abc234f-80ac-4e2e-a43d-2a6fe3453f8c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.462074 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-kube-api-access-9bqcd" (OuterVolumeSpecName: "kube-api-access-9bqcd") pod "5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" (UID: "5abc234f-80ac-4e2e-a43d-2a6fe3453f8c"). InnerVolumeSpecName "kube-api-access-9bqcd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.480103 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" (UID: "5abc234f-80ac-4e2e-a43d-2a6fe3453f8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.558789 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.558826 4947 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.558838 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bqcd\" (UniqueName: \"kubernetes.io/projected/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c-kube-api-access-9bqcd\") on node \"crc\" DevicePath \"\""
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.907316 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 07:09:21 crc kubenswrapper[4947]: W1203 07:09:21.913530 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f30107_717d_402f_b501_8911cede9cc1.slice/crio-f49ec8273910e28ffc99f6296b8020eedfe647ec2d713fb72d56e9960fcf312d WatchSource:0}: Error finding container f49ec8273910e28ffc99f6296b8020eedfe647ec2d713fb72d56e9960fcf312d: Status 404 returned error can't find the container with id f49ec8273910e28ffc99f6296b8020eedfe647ec2d713fb72d56e9960fcf312d
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.990746 4947 generic.go:334] "Generic (PLEG): container finished" podID="af29daa3-e143-4ea0-bfe0-284fd103f8b3" containerID="d890527d7be95a78ab0069a1cc568b31239f334eaea64b0a493969424e024807" exitCode=0
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.990811 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6bhnb" event={"ID":"af29daa3-e143-4ea0-bfe0-284fd103f8b3","Type":"ContainerDied","Data":"d890527d7be95a78ab0069a1cc568b31239f334eaea64b0a493969424e024807"}
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.991926 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerStarted","Data":"f49ec8273910e28ffc99f6296b8020eedfe647ec2d713fb72d56e9960fcf312d"}
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.993107 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nv68l" event={"ID":"5abc234f-80ac-4e2e-a43d-2a6fe3453f8c","Type":"ContainerDied","Data":"74821503fb3da8c0535a3bda45eba773af832e1a2416c5fd140798aaadfdf8aa"}
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.993138 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74821503fb3da8c0535a3bda45eba773af832e1a2416c5fd140798aaadfdf8aa"
Dec 03 07:09:21 crc kubenswrapper[4947]: I1203 07:09:21.993196 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nv68l"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.303242 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5db58dc49-hq6px"]
Dec 03 07:09:22 crc kubenswrapper[4947]: E1203 07:09:22.304313 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" containerName="barbican-db-sync"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.304340 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" containerName="barbican-db-sync"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.304694 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" containerName="barbican-db-sync"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.305870 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5db58dc49-hq6px"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.315120 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6c489f678-crqhz"]
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.318698 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c489f678-crqhz"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.329402 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rvvfq"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.329724 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.330237 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.330242 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.377554 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c489f678-crqhz"]
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.385581 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5db58dc49-hq6px"]
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.416837 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65dd957765-625xd"]
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.418462 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-625xd"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.462441 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-625xd"]
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.482721 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-combined-ca-bundle\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.482761 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-combined-ca-bundle\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.482788 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqvl4\" (UniqueName: \"kubernetes.io/projected/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-kube-api-access-mqvl4\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.482811 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px"
Dec 03 07:09:22 crc kubenswrapper[4947]: I1203
07:09:22.482827 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hff2z\" (UniqueName: \"kubernetes.io/projected/ed654f44-78c3-4118-83d5-e2a5d917c4f4-kube-api-access-hff2z\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.482843 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data-custom\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.482867 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data-custom\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.482900 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.482916 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-logs\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: 
\"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.482946 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed654f44-78c3-4118-83d5-e2a5d917c4f4-logs\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588016 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-swift-storage-0\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588062 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zqs\" (UniqueName: \"kubernetes.io/projected/81dcf640-a47a-4436-b76f-113616974810-kube-api-access-n5zqs\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588095 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-svc\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588152 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-combined-ca-bundle\") pod 
\"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588170 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-combined-ca-bundle\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588193 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqvl4\" (UniqueName: \"kubernetes.io/projected/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-kube-api-access-mqvl4\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588215 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588233 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hff2z\" (UniqueName: \"kubernetes.io/projected/ed654f44-78c3-4118-83d5-e2a5d917c4f4-kube-api-access-hff2z\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588249 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data-custom\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588274 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data-custom\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588309 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588323 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-logs\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588346 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-nb\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588365 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed654f44-78c3-4118-83d5-e2a5d917c4f4-logs\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588386 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-sb\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.588407 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-config\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.589410 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed654f44-78c3-4118-83d5-e2a5d917c4f4-logs\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.592258 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-logs\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.594263 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-combined-ca-bundle\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.602213 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data-custom\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.602379 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.602747 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-combined-ca-bundle\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.603302 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data-custom\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.607177 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.621082 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqvl4\" (UniqueName: \"kubernetes.io/projected/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-kube-api-access-mqvl4\") pod \"barbican-keystone-listener-6c489f678-crqhz\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.624537 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f76bbf57b-xqff8"] Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.625916 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.633815 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hff2z\" (UniqueName: \"kubernetes.io/projected/ed654f44-78c3-4118-83d5-e2a5d917c4f4-kube-api-access-hff2z\") pod \"barbican-worker-5db58dc49-hq6px\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.634269 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.642521 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.652549 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.667702 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f76bbf57b-xqff8"] Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.690204 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-swift-storage-0\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.690251 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zqs\" (UniqueName: \"kubernetes.io/projected/81dcf640-a47a-4436-b76f-113616974810-kube-api-access-n5zqs\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.690282 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-svc\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.690364 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-nb\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.690389 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-sb\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.690410 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-config\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.691068 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-swift-storage-0\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.691094 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-config\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.691639 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-nb\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.691717 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-svc\") pod \"dnsmasq-dns-65dd957765-625xd\" 
(UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.692082 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-sb\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.708753 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zqs\" (UniqueName: \"kubernetes.io/projected/81dcf640-a47a-4436-b76f-113616974810-kube-api-access-n5zqs\") pod \"dnsmasq-dns-65dd957765-625xd\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.749582 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.791629 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.791702 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h62cg\" (UniqueName: \"kubernetes.io/projected/44491e74-cc03-4920-9a13-dd45fdc80a72-kube-api-access-h62cg\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.791731 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data-custom\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.791799 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44491e74-cc03-4920-9a13-dd45fdc80a72-logs\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.791816 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-combined-ca-bundle\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.895086 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h62cg\" (UniqueName: \"kubernetes.io/projected/44491e74-cc03-4920-9a13-dd45fdc80a72-kube-api-access-h62cg\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.895281 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data-custom\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.904219 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44491e74-cc03-4920-9a13-dd45fdc80a72-logs\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.904278 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-combined-ca-bundle\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.904471 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.904836 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44491e74-cc03-4920-9a13-dd45fdc80a72-logs\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.912587 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data-custom\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.915202 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-combined-ca-bundle\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.916195 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:22 crc kubenswrapper[4947]: I1203 07:09:22.924851 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h62cg\" (UniqueName: \"kubernetes.io/projected/44491e74-cc03-4920-9a13-dd45fdc80a72-kube-api-access-h62cg\") pod \"barbican-api-7f76bbf57b-xqff8\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.002549 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerStarted","Data":"c160cfd5c71b2291c7347c163364a20cda48c438a7c41328117d8d8cb217a442"} Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.131684 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5db58dc49-hq6px"] Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.180915 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.202397 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c489f678-crqhz"] Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.309021 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.310935 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-625xd"] Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.413474 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af29daa3-e143-4ea0-bfe0-284fd103f8b3-etc-machine-id\") pod \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.413560 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-scripts\") pod \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.413637 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af29daa3-e143-4ea0-bfe0-284fd103f8b3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "af29daa3-e143-4ea0-bfe0-284fd103f8b3" (UID: "af29daa3-e143-4ea0-bfe0-284fd103f8b3"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.413679 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdbpz\" (UniqueName: \"kubernetes.io/projected/af29daa3-e143-4ea0-bfe0-284fd103f8b3-kube-api-access-xdbpz\") pod \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.413708 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-db-sync-config-data\") pod \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.413757 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-config-data\") pod \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.413829 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-combined-ca-bundle\") pod \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\" (UID: \"af29daa3-e143-4ea0-bfe0-284fd103f8b3\") " Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.414609 4947 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/af29daa3-e143-4ea0-bfe0-284fd103f8b3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.418893 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-scripts" (OuterVolumeSpecName: "scripts") 
pod "af29daa3-e143-4ea0-bfe0-284fd103f8b3" (UID: "af29daa3-e143-4ea0-bfe0-284fd103f8b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.420938 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "af29daa3-e143-4ea0-bfe0-284fd103f8b3" (UID: "af29daa3-e143-4ea0-bfe0-284fd103f8b3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.429102 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af29daa3-e143-4ea0-bfe0-284fd103f8b3-kube-api-access-xdbpz" (OuterVolumeSpecName: "kube-api-access-xdbpz") pod "af29daa3-e143-4ea0-bfe0-284fd103f8b3" (UID: "af29daa3-e143-4ea0-bfe0-284fd103f8b3"). InnerVolumeSpecName "kube-api-access-xdbpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.471718 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af29daa3-e143-4ea0-bfe0-284fd103f8b3" (UID: "af29daa3-e143-4ea0-bfe0-284fd103f8b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.485111 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-config-data" (OuterVolumeSpecName: "config-data") pod "af29daa3-e143-4ea0-bfe0-284fd103f8b3" (UID: "af29daa3-e143-4ea0-bfe0-284fd103f8b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.516095 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdbpz\" (UniqueName: \"kubernetes.io/projected/af29daa3-e143-4ea0-bfe0-284fd103f8b3-kube-api-access-xdbpz\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.516395 4947 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.516468 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.516550 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.516607 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af29daa3-e143-4ea0-bfe0-284fd103f8b3-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:23 crc kubenswrapper[4947]: I1203 07:09:23.681480 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f76bbf57b-xqff8"] Dec 03 07:09:23 crc kubenswrapper[4947]: W1203 07:09:23.682633 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44491e74_cc03_4920_9a13_dd45fdc80a72.slice/crio-146a5f784ba6512a4b2d9d2cfe7cde91876292056a4d3f93a47a3ef343d4a360 WatchSource:0}: Error finding container 146a5f784ba6512a4b2d9d2cfe7cde91876292056a4d3f93a47a3ef343d4a360: Status 404 returned 
error can't find the container with id 146a5f784ba6512a4b2d9d2cfe7cde91876292056a4d3f93a47a3ef343d4a360 Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.014619 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerStarted","Data":"f301cef4448a9a3bc7351f13982546d0b0e78a6d82a42cf4eee6ea3729b5da0b"} Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.021782 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db58dc49-hq6px" event={"ID":"ed654f44-78c3-4118-83d5-e2a5d917c4f4","Type":"ContainerStarted","Data":"72a7f211b5a75b4a185ba853cf9914e224dc0742880a659bebae781e9648aaf2"} Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.026288 4947 generic.go:334] "Generic (PLEG): container finished" podID="81dcf640-a47a-4436-b76f-113616974810" containerID="49197484814ed3ec9db61da46408fd0cef1ca0bd94925afacfeff68264f590ad" exitCode=0 Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.026382 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-625xd" event={"ID":"81dcf640-a47a-4436-b76f-113616974810","Type":"ContainerDied","Data":"49197484814ed3ec9db61da46408fd0cef1ca0bd94925afacfeff68264f590ad"} Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.026414 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-625xd" event={"ID":"81dcf640-a47a-4436-b76f-113616974810","Type":"ContainerStarted","Data":"48fa75378c478cc09e0b7fbdb67b7449ecf56193f116b7b27827a611822a4e66"} Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.047362 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" event={"ID":"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3","Type":"ContainerStarted","Data":"8b7773436f1b032bb60a82d8c350c484575d97ff3f911f3e0eaad7489225d38f"} Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.056602 
4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f76bbf57b-xqff8" event={"ID":"44491e74-cc03-4920-9a13-dd45fdc80a72","Type":"ContainerStarted","Data":"c0da3ad03e90ebc0e594700ba50b9c8a222500604c0d1efcb5b7f081efbdace6"} Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.056657 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f76bbf57b-xqff8" event={"ID":"44491e74-cc03-4920-9a13-dd45fdc80a72","Type":"ContainerStarted","Data":"146a5f784ba6512a4b2d9d2cfe7cde91876292056a4d3f93a47a3ef343d4a360"} Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.059483 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6bhnb" event={"ID":"af29daa3-e143-4ea0-bfe0-284fd103f8b3","Type":"ContainerDied","Data":"bba5f5a73e19365681938bbd4e092d8cdc0204f119ec19a609fedc4b6ea5ac19"} Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.059536 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bba5f5a73e19365681938bbd4e092d8cdc0204f119ec19a609fedc4b6ea5ac19" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.059628 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-6bhnb" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.230599 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:09:24 crc kubenswrapper[4947]: E1203 07:09:24.231300 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af29daa3-e143-4ea0-bfe0-284fd103f8b3" containerName="cinder-db-sync" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.231323 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="af29daa3-e143-4ea0-bfe0-284fd103f8b3" containerName="cinder-db-sync" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.231592 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="af29daa3-e143-4ea0-bfe0-284fd103f8b3" containerName="cinder-db-sync" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.232745 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.236219 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.236408 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nxr4p" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.236569 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.239930 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.270888 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.334083 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/18a8b60d-0ea3-40ce-be39-269c1f62080c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.334121 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-scripts\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.334141 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.334187 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfrhh\" (UniqueName: \"kubernetes.io/projected/18a8b60d-0ea3-40ce-be39-269c1f62080c-kube-api-access-nfrhh\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.334225 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.334272 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.353694 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-625xd"] Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.371061 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-s2lt2"] Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.373245 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.384385 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-s2lt2"] Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439590 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-config\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439646 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-sb\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439683 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-swift-storage-0\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: 
\"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439740 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18a8b60d-0ea3-40ce-be39-269c1f62080c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439756 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-scripts\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439773 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439814 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-svc\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439837 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfrhh\" (UniqueName: \"kubernetes.io/projected/18a8b60d-0ea3-40ce-be39-269c1f62080c-kube-api-access-nfrhh\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc 
kubenswrapper[4947]: I1203 07:09:24.439839 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18a8b60d-0ea3-40ce-be39-269c1f62080c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439853 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-nb\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.439912 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.440378 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.440444 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncthr\" (UniqueName: \"kubernetes.io/projected/64a14fe8-90be-4d6e-890f-a42fd3ea6894-kube-api-access-ncthr\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.450264 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.473180 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-scripts\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.473243 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfrhh\" (UniqueName: \"kubernetes.io/projected/18a8b60d-0ea3-40ce-be39-269c1f62080c-kube-api-access-nfrhh\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.473268 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.473728 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data\") pod \"cinder-scheduler-0\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.512157 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.513947 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.519398 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.531638 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.543054 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-nb\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.543150 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncthr\" (UniqueName: \"kubernetes.io/projected/64a14fe8-90be-4d6e-890f-a42fd3ea6894-kube-api-access-ncthr\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.543244 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-config\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.544124 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-nb\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.544716 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-config\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.544760 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-sb\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.544823 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-swift-storage-0\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.544873 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-sb\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.544941 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-svc\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.546460 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-swift-storage-0\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.548173 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-svc\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.551774 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.563144 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncthr\" (UniqueName: \"kubernetes.io/projected/64a14fe8-90be-4d6e-890f-a42fd3ea6894-kube-api-access-ncthr\") pod \"dnsmasq-dns-5c77d8b67c-s2lt2\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.647550 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.648039 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data-custom\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 
07:09:24.648072 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.648120 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5590bf3-6e25-4dde-836a-bf6fe8aec722-logs\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.648144 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wprnr\" (UniqueName: \"kubernetes.io/projected/e5590bf3-6e25-4dde-836a-bf6fe8aec722-kube-api-access-wprnr\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.648169 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5590bf3-6e25-4dde-836a-bf6fe8aec722-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.648220 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-scripts\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.718013 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.749325 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-scripts\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.749396 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.749421 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data-custom\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.749444 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.749501 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5590bf3-6e25-4dde-836a-bf6fe8aec722-logs\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.749523 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wprnr\" 
(UniqueName: \"kubernetes.io/projected/e5590bf3-6e25-4dde-836a-bf6fe8aec722-kube-api-access-wprnr\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.749549 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5590bf3-6e25-4dde-836a-bf6fe8aec722-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.749652 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5590bf3-6e25-4dde-836a-bf6fe8aec722-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.751010 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5590bf3-6e25-4dde-836a-bf6fe8aec722-logs\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.755738 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-scripts\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.756405 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.760614 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.767412 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data-custom\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.774097 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wprnr\" (UniqueName: \"kubernetes.io/projected/e5590bf3-6e25-4dde-836a-bf6fe8aec722-kube-api-access-wprnr\") pod \"cinder-api-0\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " pod="openstack/cinder-api-0" Dec 03 07:09:24 crc kubenswrapper[4947]: I1203 07:09:24.840019 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.067206 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.073200 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f76bbf57b-xqff8" event={"ID":"44491e74-cc03-4920-9a13-dd45fdc80a72","Type":"ContainerStarted","Data":"acf01c9bf5d5aba52071c93afce6a2c5fd26f396ce1a3fab1750d45a6798e5c3"} Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.073343 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.079639 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-625xd" event={"ID":"81dcf640-a47a-4436-b76f-113616974810","Type":"ContainerStarted","Data":"3642460d41a834c63dbd2439c55e7a5337fb4287d7ad88ec7bb86a273aa50fac"} Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.079904 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65dd957765-625xd" podUID="81dcf640-a47a-4436-b76f-113616974810" containerName="dnsmasq-dns" containerID="cri-o://3642460d41a834c63dbd2439c55e7a5337fb4287d7ad88ec7bb86a273aa50fac" gracePeriod=10 Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.081359 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.101216 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f76bbf57b-xqff8" podStartSLOduration=3.101188471 podStartE2EDuration="3.101188471s" podCreationTimestamp="2025-12-03 07:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
07:09:25.093918445 +0000 UTC m=+1226.354872891" watchObservedRunningTime="2025-12-03 07:09:25.101188471 +0000 UTC m=+1226.362142887" Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.114711 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65dd957765-625xd" podStartSLOduration=3.114691415 podStartE2EDuration="3.114691415s" podCreationTimestamp="2025-12-03 07:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:25.110028099 +0000 UTC m=+1226.370982545" watchObservedRunningTime="2025-12-03 07:09:25.114691415 +0000 UTC m=+1226.375645921" Dec 03 07:09:25 crc kubenswrapper[4947]: W1203 07:09:25.377046 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64a14fe8_90be_4d6e_890f_a42fd3ea6894.slice/crio-0cd8242708aaa2f64241bde15baff96c45c872ddae2768e869246a42ff93030a WatchSource:0}: Error finding container 0cd8242708aaa2f64241bde15baff96c45c872ddae2768e869246a42ff93030a: Status 404 returned error can't find the container with id 0cd8242708aaa2f64241bde15baff96c45c872ddae2768e869246a42ff93030a Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.377061 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-s2lt2"] Dec 03 07:09:25 crc kubenswrapper[4947]: I1203 07:09:25.469234 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:09:25 crc kubenswrapper[4947]: W1203 07:09:25.478153 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5590bf3_6e25_4dde_836a_bf6fe8aec722.slice/crio-ec359dde238b7b484b8d425c97e2ec2372738f91a97d53aeb2acfe3ec9f8a527 WatchSource:0}: Error finding container ec359dde238b7b484b8d425c97e2ec2372738f91a97d53aeb2acfe3ec9f8a527: Status 404 returned error 
can't find the container with id ec359dde238b7b484b8d425c97e2ec2372738f91a97d53aeb2acfe3ec9f8a527 Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.111537 4947 generic.go:334] "Generic (PLEG): container finished" podID="64a14fe8-90be-4d6e-890f-a42fd3ea6894" containerID="85c3379202dda47b0fbc87de0172fe9a6295c490b27d81791913aeafe5f28ddb" exitCode=0 Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.111952 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" event={"ID":"64a14fe8-90be-4d6e-890f-a42fd3ea6894","Type":"ContainerDied","Data":"85c3379202dda47b0fbc87de0172fe9a6295c490b27d81791913aeafe5f28ddb"} Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.111986 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" event={"ID":"64a14fe8-90be-4d6e-890f-a42fd3ea6894","Type":"ContainerStarted","Data":"0cd8242708aaa2f64241bde15baff96c45c872ddae2768e869246a42ff93030a"} Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.116454 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerStarted","Data":"f667c287f4884fc39a6408c862e2612ec32c1352d5f89338f61c8b71168745ef"} Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.119167 4947 generic.go:334] "Generic (PLEG): container finished" podID="81dcf640-a47a-4436-b76f-113616974810" containerID="3642460d41a834c63dbd2439c55e7a5337fb4287d7ad88ec7bb86a273aa50fac" exitCode=0 Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.119209 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-625xd" event={"ID":"81dcf640-a47a-4436-b76f-113616974810","Type":"ContainerDied","Data":"3642460d41a834c63dbd2439c55e7a5337fb4287d7ad88ec7bb86a273aa50fac"} Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.120763 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"e5590bf3-6e25-4dde-836a-bf6fe8aec722","Type":"ContainerStarted","Data":"4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af"} Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.120785 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e5590bf3-6e25-4dde-836a-bf6fe8aec722","Type":"ContainerStarted","Data":"ec359dde238b7b484b8d425c97e2ec2372738f91a97d53aeb2acfe3ec9f8a527"} Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.128356 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"18a8b60d-0ea3-40ce-be39-269c1f62080c","Type":"ContainerStarted","Data":"d8fbeeb275c514c957528815059558dbddf329e5e5a6a5b7d9ea7ebb8e5b6057"} Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.128809 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:26 crc kubenswrapper[4947]: I1203 07:09:26.874580 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.006515 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-svc\") pod \"81dcf640-a47a-4436-b76f-113616974810\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.006829 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-nb\") pod \"81dcf640-a47a-4436-b76f-113616974810\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.006959 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5zqs\" (UniqueName: \"kubernetes.io/projected/81dcf640-a47a-4436-b76f-113616974810-kube-api-access-n5zqs\") pod \"81dcf640-a47a-4436-b76f-113616974810\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.007060 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-sb\") pod \"81dcf640-a47a-4436-b76f-113616974810\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.007362 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-swift-storage-0\") pod \"81dcf640-a47a-4436-b76f-113616974810\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.007711 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-config\") pod \"81dcf640-a47a-4436-b76f-113616974810\" (UID: \"81dcf640-a47a-4436-b76f-113616974810\") " Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.017434 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81dcf640-a47a-4436-b76f-113616974810-kube-api-access-n5zqs" (OuterVolumeSpecName: "kube-api-access-n5zqs") pod "81dcf640-a47a-4436-b76f-113616974810" (UID: "81dcf640-a47a-4436-b76f-113616974810"). InnerVolumeSpecName "kube-api-access-n5zqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.059196 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "81dcf640-a47a-4436-b76f-113616974810" (UID: "81dcf640-a47a-4436-b76f-113616974810"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.059244 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "81dcf640-a47a-4436-b76f-113616974810" (UID: "81dcf640-a47a-4436-b76f-113616974810"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.068753 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "81dcf640-a47a-4436-b76f-113616974810" (UID: "81dcf640-a47a-4436-b76f-113616974810"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.069703 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81dcf640-a47a-4436-b76f-113616974810" (UID: "81dcf640-a47a-4436-b76f-113616974810"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.079795 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-config" (OuterVolumeSpecName: "config") pod "81dcf640-a47a-4436-b76f-113616974810" (UID: "81dcf640-a47a-4436-b76f-113616974810"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.110512 4947 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.110549 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.110561 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.110574 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 
07:09:27.110586 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5zqs\" (UniqueName: \"kubernetes.io/projected/81dcf640-a47a-4436-b76f-113616974810-kube-api-access-n5zqs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.110598 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/81dcf640-a47a-4436-b76f-113616974810-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.152317 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65dd957765-625xd" event={"ID":"81dcf640-a47a-4436-b76f-113616974810","Type":"ContainerDied","Data":"48fa75378c478cc09e0b7fbdb67b7449ecf56193f116b7b27827a611822a4e66"} Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.152356 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65dd957765-625xd" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.152375 4947 scope.go:117] "RemoveContainer" containerID="3642460d41a834c63dbd2439c55e7a5337fb4287d7ad88ec7bb86a273aa50fac" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.190108 4947 scope.go:117] "RemoveContainer" containerID="49197484814ed3ec9db61da46408fd0cef1ca0bd94925afacfeff68264f590ad" Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.195803 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-625xd"] Dec 03 07:09:27 crc kubenswrapper[4947]: I1203 07:09:27.205321 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65dd957765-625xd"] Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.173001 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"18a8b60d-0ea3-40ce-be39-269c1f62080c","Type":"ContainerStarted","Data":"709315361d917f44e800602797bbf94a88212d69d3aeae1f71cc1a6209fadd62"} Dec 
03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.175790 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" event={"ID":"64a14fe8-90be-4d6e-890f-a42fd3ea6894","Type":"ContainerStarted","Data":"812f24422eebf2b9f2ac39a2da773fc81fcbeacd0d393438e6eb4d93eb191a7e"} Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.175940 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.180689 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db58dc49-hq6px" event={"ID":"ed654f44-78c3-4118-83d5-e2a5d917c4f4","Type":"ContainerStarted","Data":"e0dfb3c4add7777ef4ada7896099eaab6442a879387a5254d9c5547a18dee051"} Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.180731 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db58dc49-hq6px" event={"ID":"ed654f44-78c3-4118-83d5-e2a5d917c4f4","Type":"ContainerStarted","Data":"73b7a501be5ba2d71f1c1dd2ceb4c2ccc770dcda00e913d4135212d4d5b47c39"} Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.195435 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e5590bf3-6e25-4dde-836a-bf6fe8aec722","Type":"ContainerStarted","Data":"0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c"} Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.196294 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.207079 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" event={"ID":"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3","Type":"ContainerStarted","Data":"a809b23cd1a249dbc9331e612526f60ed5bdd18fa6472bcbf28960c85be485be"} Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.207121 4947 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" event={"ID":"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3","Type":"ContainerStarted","Data":"aac3920ae7cd9e0b2c5f777e7aa7e5d8fbbea4f5ca8c93fe99023a44910f66a8"} Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.208874 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" podStartSLOduration=4.208860925 podStartE2EDuration="4.208860925s" podCreationTimestamp="2025-12-03 07:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:28.198707962 +0000 UTC m=+1229.459662388" watchObservedRunningTime="2025-12-03 07:09:28.208860925 +0000 UTC m=+1229.469815351" Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.232577 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5db58dc49-hq6px" podStartSLOduration=2.425180865 podStartE2EDuration="6.232554104s" podCreationTimestamp="2025-12-03 07:09:22 +0000 UTC" firstStartedPulling="2025-12-03 07:09:23.147715814 +0000 UTC m=+1224.408670250" lastFinishedPulling="2025-12-03 07:09:26.955089053 +0000 UTC m=+1228.216043489" observedRunningTime="2025-12-03 07:09:28.228696281 +0000 UTC m=+1229.489650707" watchObservedRunningTime="2025-12-03 07:09:28.232554104 +0000 UTC m=+1229.493508530" Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.257844 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.257823645 podStartE2EDuration="4.257823645s" podCreationTimestamp="2025-12-03 07:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:28.246387617 +0000 UTC m=+1229.507342043" watchObservedRunningTime="2025-12-03 07:09:28.257823645 +0000 UTC m=+1229.518778071" 
Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.267552 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" podStartSLOduration=2.591727545 podStartE2EDuration="6.267533387s" podCreationTimestamp="2025-12-03 07:09:22 +0000 UTC" firstStartedPulling="2025-12-03 07:09:23.227480065 +0000 UTC m=+1224.488434491" lastFinishedPulling="2025-12-03 07:09:26.903285907 +0000 UTC m=+1228.164240333" observedRunningTime="2025-12-03 07:09:28.264921448 +0000 UTC m=+1229.525875874" watchObservedRunningTime="2025-12-03 07:09:28.267533387 +0000 UTC m=+1229.528487813" Dec 03 07:09:28 crc kubenswrapper[4947]: I1203 07:09:28.769108 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.095150 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81dcf640-a47a-4436-b76f-113616974810" path="/var/lib/kubelet/pods/81dcf640-a47a-4436-b76f-113616974810/volumes" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.216502 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerStarted","Data":"df76336b36c90d537c03130eb2ed5f3b7320b066919dea5df4667e3191f5e1db"} Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.217825 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.220241 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"18a8b60d-0ea3-40ce-be39-269c1f62080c","Type":"ContainerStarted","Data":"ddb9ef11798c34d77e0d97ef44acce8f8c0246b9349e43220f9c73f9895075aa"} Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.250008 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.383766888 
podStartE2EDuration="8.249990565s" podCreationTimestamp="2025-12-03 07:09:21 +0000 UTC" firstStartedPulling="2025-12-03 07:09:21.917897327 +0000 UTC m=+1223.178851763" lastFinishedPulling="2025-12-03 07:09:27.784121004 +0000 UTC m=+1229.045075440" observedRunningTime="2025-12-03 07:09:29.240090168 +0000 UTC m=+1230.501044594" watchObservedRunningTime="2025-12-03 07:09:29.249990565 +0000 UTC m=+1230.510944981" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.275706 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.4136424659999998 podStartE2EDuration="5.275689708s" podCreationTimestamp="2025-12-03 07:09:24 +0000 UTC" firstStartedPulling="2025-12-03 07:09:25.093119093 +0000 UTC m=+1226.354073519" lastFinishedPulling="2025-12-03 07:09:26.955166325 +0000 UTC m=+1228.216120761" observedRunningTime="2025-12-03 07:09:29.271275549 +0000 UTC m=+1230.532229975" watchObservedRunningTime="2025-12-03 07:09:29.275689708 +0000 UTC m=+1230.536644134" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.470131 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59d779d8d8-6jl5c"] Dec 03 07:09:29 crc kubenswrapper[4947]: E1203 07:09:29.473839 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dcf640-a47a-4436-b76f-113616974810" containerName="dnsmasq-dns" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.473857 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dcf640-a47a-4436-b76f-113616974810" containerName="dnsmasq-dns" Dec 03 07:09:29 crc kubenswrapper[4947]: E1203 07:09:29.473875 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81dcf640-a47a-4436-b76f-113616974810" containerName="init" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.473881 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="81dcf640-a47a-4436-b76f-113616974810" containerName="init" Dec 03 07:09:29 crc 
kubenswrapper[4947]: I1203 07:09:29.474070 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="81dcf640-a47a-4436-b76f-113616974810" containerName="dnsmasq-dns" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.474954 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.512598 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.512769 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.524599 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59d779d8d8-6jl5c"] Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.552808 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.561963 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-internal-tls-certs\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.562011 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e9d6dc-e814-485d-842b-9266732c7924-logs\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.562052 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-combined-ca-bundle\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.562134 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.562241 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-public-tls-certs\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.562291 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data-custom\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.562336 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqk9b\" (UniqueName: \"kubernetes.io/projected/e4e9d6dc-e814-485d-842b-9266732c7924-kube-api-access-lqk9b\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.664308 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-public-tls-certs\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.664388 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data-custom\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.664444 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqk9b\" (UniqueName: \"kubernetes.io/projected/e4e9d6dc-e814-485d-842b-9266732c7924-kube-api-access-lqk9b\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.664524 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e9d6dc-e814-485d-842b-9266732c7924-logs\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.664548 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-internal-tls-certs\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.664574 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-combined-ca-bundle\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.664611 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.665131 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e9d6dc-e814-485d-842b-9266732c7924-logs\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.671925 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-internal-tls-certs\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.672171 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data-custom\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.672359 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-combined-ca-bundle\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.672462 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.676857 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-public-tls-certs\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.684981 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqk9b\" (UniqueName: \"kubernetes.io/projected/e4e9d6dc-e814-485d-842b-9266732c7924-kube-api-access-lqk9b\") pod \"barbican-api-59d779d8d8-6jl5c\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:29 crc kubenswrapper[4947]: I1203 07:09:29.826624 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:30 crc kubenswrapper[4947]: I1203 07:09:30.235262 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerName="cinder-api-log" containerID="cri-o://4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af" gracePeriod=30 Dec 03 07:09:30 crc kubenswrapper[4947]: I1203 07:09:30.235426 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerName="cinder-api" containerID="cri-o://0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c" gracePeriod=30 Dec 03 07:09:30 crc kubenswrapper[4947]: I1203 07:09:30.319114 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:30 crc kubenswrapper[4947]: W1203 07:09:30.326067 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4e9d6dc_e814_485d_842b_9266732c7924.slice/crio-61e0344adc9512703ce97344b0be8ee9daf603e20b56691300046d1f21604dc5 WatchSource:0}: Error finding container 61e0344adc9512703ce97344b0be8ee9daf603e20b56691300046d1f21604dc5: Status 404 returned error can't find the container with id 61e0344adc9512703ce97344b0be8ee9daf603e20b56691300046d1f21604dc5 Dec 03 07:09:30 crc kubenswrapper[4947]: I1203 07:09:30.327633 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59d779d8d8-6jl5c"] Dec 03 07:09:30 crc kubenswrapper[4947]: I1203 07:09:30.990677 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.216321 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data\") pod \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.216369 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5590bf3-6e25-4dde-836a-bf6fe8aec722-etc-machine-id\") pod \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.216408 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-combined-ca-bundle\") pod \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.216503 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5590bf3-6e25-4dde-836a-bf6fe8aec722-logs\") pod \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.216513 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e5590bf3-6e25-4dde-836a-bf6fe8aec722-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e5590bf3-6e25-4dde-836a-bf6fe8aec722" (UID: "e5590bf3-6e25-4dde-836a-bf6fe8aec722"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.216538 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-scripts\") pod \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.216700 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wprnr\" (UniqueName: \"kubernetes.io/projected/e5590bf3-6e25-4dde-836a-bf6fe8aec722-kube-api-access-wprnr\") pod \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.216804 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data-custom\") pod \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\" (UID: \"e5590bf3-6e25-4dde-836a-bf6fe8aec722\") " Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.217030 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5590bf3-6e25-4dde-836a-bf6fe8aec722-logs" (OuterVolumeSpecName: "logs") pod "e5590bf3-6e25-4dde-836a-bf6fe8aec722" (UID: "e5590bf3-6e25-4dde-836a-bf6fe8aec722"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.217579 4947 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e5590bf3-6e25-4dde-836a-bf6fe8aec722-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.217603 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5590bf3-6e25-4dde-836a-bf6fe8aec722-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.221713 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5590bf3-6e25-4dde-836a-bf6fe8aec722-kube-api-access-wprnr" (OuterVolumeSpecName: "kube-api-access-wprnr") pod "e5590bf3-6e25-4dde-836a-bf6fe8aec722" (UID: "e5590bf3-6e25-4dde-836a-bf6fe8aec722"). InnerVolumeSpecName "kube-api-access-wprnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.222705 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e5590bf3-6e25-4dde-836a-bf6fe8aec722" (UID: "e5590bf3-6e25-4dde-836a-bf6fe8aec722"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.224636 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-scripts" (OuterVolumeSpecName: "scripts") pod "e5590bf3-6e25-4dde-836a-bf6fe8aec722" (UID: "e5590bf3-6e25-4dde-836a-bf6fe8aec722"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.250912 4947 generic.go:334] "Generic (PLEG): container finished" podID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerID="0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c" exitCode=0 Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.250954 4947 generic.go:334] "Generic (PLEG): container finished" podID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerID="4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af" exitCode=143 Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.251004 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e5590bf3-6e25-4dde-836a-bf6fe8aec722","Type":"ContainerDied","Data":"0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c"} Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.251040 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e5590bf3-6e25-4dde-836a-bf6fe8aec722","Type":"ContainerDied","Data":"4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af"} Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.251092 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e5590bf3-6e25-4dde-836a-bf6fe8aec722","Type":"ContainerDied","Data":"ec359dde238b7b484b8d425c97e2ec2372738f91a97d53aeb2acfe3ec9f8a527"} Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.251138 4947 scope.go:117] "RemoveContainer" containerID="0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.251347 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.258872 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5590bf3-6e25-4dde-836a-bf6fe8aec722" (UID: "e5590bf3-6e25-4dde-836a-bf6fe8aec722"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.259640 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d779d8d8-6jl5c" event={"ID":"e4e9d6dc-e814-485d-842b-9266732c7924","Type":"ContainerStarted","Data":"c60ae6e938e54908332dbaa6ddd689e443ba56a7fa23dbd3d3249373512ca680"} Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.259693 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d779d8d8-6jl5c" event={"ID":"e4e9d6dc-e814-485d-842b-9266732c7924","Type":"ContainerStarted","Data":"0df9670d5726b37e2f523e33898d50880cf6beeea48f329f234fc69c23009757"} Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.259707 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d779d8d8-6jl5c" event={"ID":"e4e9d6dc-e814-485d-842b-9266732c7924","Type":"ContainerStarted","Data":"61e0344adc9512703ce97344b0be8ee9daf603e20b56691300046d1f21604dc5"} Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.259741 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.259973 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.279871 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59d779d8d8-6jl5c" 
podStartSLOduration=2.279853064 podStartE2EDuration="2.279853064s" podCreationTimestamp="2025-12-03 07:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:31.277480928 +0000 UTC m=+1232.538435354" watchObservedRunningTime="2025-12-03 07:09:31.279853064 +0000 UTC m=+1232.540807490" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.294968 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data" (OuterVolumeSpecName: "config-data") pod "e5590bf3-6e25-4dde-836a-bf6fe8aec722" (UID: "e5590bf3-6e25-4dde-836a-bf6fe8aec722"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.319509 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wprnr\" (UniqueName: \"kubernetes.io/projected/e5590bf3-6e25-4dde-836a-bf6fe8aec722-kube-api-access-wprnr\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.319538 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.319547 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.319557 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.319567 4947 reconciler_common.go:293] 
"Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e5590bf3-6e25-4dde-836a-bf6fe8aec722-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.348075 4947 scope.go:117] "RemoveContainer" containerID="4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.368308 4947 scope.go:117] "RemoveContainer" containerID="0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c" Dec 03 07:09:31 crc kubenswrapper[4947]: E1203 07:09:31.368701 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c\": container with ID starting with 0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c not found: ID does not exist" containerID="0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.368729 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c"} err="failed to get container status \"0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c\": rpc error: code = NotFound desc = could not find container \"0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c\": container with ID starting with 0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c not found: ID does not exist" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.368748 4947 scope.go:117] "RemoveContainer" containerID="4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af" Dec 03 07:09:31 crc kubenswrapper[4947]: E1203 07:09:31.369010 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af\": container with ID starting with 4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af not found: ID does not exist" containerID="4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.369029 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af"} err="failed to get container status \"4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af\": rpc error: code = NotFound desc = could not find container \"4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af\": container with ID starting with 4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af not found: ID does not exist" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.369043 4947 scope.go:117] "RemoveContainer" containerID="0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.369543 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c"} err="failed to get container status \"0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c\": rpc error: code = NotFound desc = could not find container \"0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c\": container with ID starting with 0d55b21d65b4d3b37dc04a0d33a64912b2f15c6cc75213fee04366817d11816c not found: ID does not exist" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.369626 4947 scope.go:117] "RemoveContainer" containerID="4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.370016 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af"} err="failed to get container status \"4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af\": rpc error: code = NotFound desc = could not find container \"4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af\": container with ID starting with 4f03fb183232f6e903022c0a3bf495656b02a2976e65090505377a46e35f25af not found: ID does not exist" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.586861 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.595354 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.621985 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:09:31 crc kubenswrapper[4947]: E1203 07:09:31.622937 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerName="cinder-api-log" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.622956 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerName="cinder-api-log" Dec 03 07:09:31 crc kubenswrapper[4947]: E1203 07:09:31.623031 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerName="cinder-api" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.623039 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerName="cinder-api" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.623354 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerName="cinder-api" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.623386 4947 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" containerName="cinder-api-log" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.624374 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.628419 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.628690 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.628811 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.646163 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.729897 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.729945 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.730000 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a879185-9b9a-45a5-a211-c61faf308cbb-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.730025 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54k7t\" (UniqueName: \"kubernetes.io/projected/9a879185-9b9a-45a5-a211-c61faf308cbb-kube-api-access-54k7t\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.730162 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.730258 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.730303 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-scripts\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.730336 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a879185-9b9a-45a5-a211-c61faf308cbb-logs\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.730393 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.832437 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54k7t\" (UniqueName: \"kubernetes.io/projected/9a879185-9b9a-45a5-a211-c61faf308cbb-kube-api-access-54k7t\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.832540 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.832563 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.832594 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-scripts\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.832618 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a879185-9b9a-45a5-a211-c61faf308cbb-logs\") pod \"cinder-api-0\" (UID: 
\"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.832633 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.833192 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.833215 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.833241 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a879185-9b9a-45a5-a211-c61faf308cbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.833468 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a879185-9b9a-45a5-a211-c61faf308cbb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.833544 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a879185-9b9a-45a5-a211-c61faf308cbb-logs\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.849375 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.849683 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.849683 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-scripts\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.850282 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.851779 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54k7t\" (UniqueName: \"kubernetes.io/projected/9a879185-9b9a-45a5-a211-c61faf308cbb-kube-api-access-54k7t\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 
07:09:31.853527 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.859268 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data-custom\") pod \"cinder-api-0\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " pod="openstack/cinder-api-0" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.881390 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:31 crc kubenswrapper[4947]: I1203 07:09:31.947181 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:09:32 crc kubenswrapper[4947]: I1203 07:09:32.429776 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:09:33 crc kubenswrapper[4947]: I1203 07:09:33.098212 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5590bf3-6e25-4dde-836a-bf6fe8aec722" path="/var/lib/kubelet/pods/e5590bf3-6e25-4dde-836a-bf6fe8aec722/volumes" Dec 03 07:09:33 crc kubenswrapper[4947]: I1203 07:09:33.295113 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9a879185-9b9a-45a5-a211-c61faf308cbb","Type":"ContainerStarted","Data":"97036c64cfae8886767f870b26986f84c7ae69a1a7110b3ff8a8e15f7e00dd75"} Dec 03 07:09:33 crc kubenswrapper[4947]: I1203 07:09:33.295369 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9a879185-9b9a-45a5-a211-c61faf308cbb","Type":"ContainerStarted","Data":"4ae36dfda6ddd253abc4dbf1ff3fc0ae9aa5dbafaec8e55c9f72c05188ef29b9"} Dec 
03 07:09:33 crc kubenswrapper[4947]: I1203 07:09:33.340402 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.308902 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9a879185-9b9a-45a5-a211-c61faf308cbb","Type":"ContainerStarted","Data":"737f46e0f99576fb923247d82fc1eef29b5d49123f805ead5797b64495cf9e63"} Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.309342 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.334721 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.3347025009999998 podStartE2EDuration="3.334702501s" podCreationTimestamp="2025-12-03 07:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:34.32454798 +0000 UTC m=+1235.585502406" watchObservedRunningTime="2025-12-03 07:09:34.334702501 +0000 UTC m=+1235.595656917" Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.651673 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.653848 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.721662 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.819272 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-p27ds"] Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.828045 4947 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" podUID="381b76ac-1ce2-4745-8e27-1398b83ee86a" containerName="dnsmasq-dns" containerID="cri-o://40b240e4fbf74a9ff20f6c10c9d36360e68597fec6bdc52297632f6094c8a1e0" gracePeriod=10 Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.844413 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 07:09:34 crc kubenswrapper[4947]: I1203 07:09:34.906384 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:09:35 crc kubenswrapper[4947]: I1203 07:09:35.318866 4947 generic.go:334] "Generic (PLEG): container finished" podID="381b76ac-1ce2-4745-8e27-1398b83ee86a" containerID="40b240e4fbf74a9ff20f6c10c9d36360e68597fec6bdc52297632f6094c8a1e0" exitCode=0 Dec 03 07:09:35 crc kubenswrapper[4947]: I1203 07:09:35.319790 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" event={"ID":"381b76ac-1ce2-4745-8e27-1398b83ee86a","Type":"ContainerDied","Data":"40b240e4fbf74a9ff20f6c10c9d36360e68597fec6bdc52297632f6094c8a1e0"} Dec 03 07:09:35 crc kubenswrapper[4947]: I1203 07:09:35.319943 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerName="cinder-scheduler" containerID="cri-o://709315361d917f44e800602797bbf94a88212d69d3aeae1f71cc1a6209fadd62" gracePeriod=30 Dec 03 07:09:35 crc kubenswrapper[4947]: I1203 07:09:35.321036 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerName="probe" containerID="cri-o://ddb9ef11798c34d77e0d97ef44acce8f8c0246b9349e43220f9c73f9895075aa" gracePeriod=30 Dec 03 07:09:35 crc kubenswrapper[4947]: I1203 07:09:35.615216 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:09:35 crc kubenswrapper[4947]: I1203 07:09:35.684217 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f849bbbf6-tf4mc"] Dec 03 07:09:35 crc kubenswrapper[4947]: I1203 07:09:35.684488 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f849bbbf6-tf4mc" podUID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerName="neutron-api" containerID="cri-o://4e1056f9dc24c16b8391f6730abe97c8c00219da69df68bd522af5ece0d4a512" gracePeriod=30 Dec 03 07:09:35 crc kubenswrapper[4947]: I1203 07:09:35.684603 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5f849bbbf6-tf4mc" podUID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerName="neutron-httpd" containerID="cri-o://b5da53d34c11626bb7af6025e1d48b7788af57875c4c0af890dd758ea0e40aca" gracePeriod=30 Dec 03 07:09:35 crc kubenswrapper[4947]: I1203 07:09:35.939888 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.019110 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-svc\") pod \"381b76ac-1ce2-4745-8e27-1398b83ee86a\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.019199 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-config\") pod \"381b76ac-1ce2-4745-8e27-1398b83ee86a\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.019229 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-nb\") pod \"381b76ac-1ce2-4745-8e27-1398b83ee86a\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.019296 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-sb\") pod \"381b76ac-1ce2-4745-8e27-1398b83ee86a\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.019325 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-swift-storage-0\") pod \"381b76ac-1ce2-4745-8e27-1398b83ee86a\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.019390 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr2hl\" 
(UniqueName: \"kubernetes.io/projected/381b76ac-1ce2-4745-8e27-1398b83ee86a-kube-api-access-rr2hl\") pod \"381b76ac-1ce2-4745-8e27-1398b83ee86a\" (UID: \"381b76ac-1ce2-4745-8e27-1398b83ee86a\") " Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.053664 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/381b76ac-1ce2-4745-8e27-1398b83ee86a-kube-api-access-rr2hl" (OuterVolumeSpecName: "kube-api-access-rr2hl") pod "381b76ac-1ce2-4745-8e27-1398b83ee86a" (UID: "381b76ac-1ce2-4745-8e27-1398b83ee86a"). InnerVolumeSpecName "kube-api-access-rr2hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.124267 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr2hl\" (UniqueName: \"kubernetes.io/projected/381b76ac-1ce2-4745-8e27-1398b83ee86a-kube-api-access-rr2hl\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.183184 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "381b76ac-1ce2-4745-8e27-1398b83ee86a" (UID: "381b76ac-1ce2-4745-8e27-1398b83ee86a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.193053 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "381b76ac-1ce2-4745-8e27-1398b83ee86a" (UID: "381b76ac-1ce2-4745-8e27-1398b83ee86a"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.194000 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "381b76ac-1ce2-4745-8e27-1398b83ee86a" (UID: "381b76ac-1ce2-4745-8e27-1398b83ee86a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.200080 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-config" (OuterVolumeSpecName: "config") pod "381b76ac-1ce2-4745-8e27-1398b83ee86a" (UID: "381b76ac-1ce2-4745-8e27-1398b83ee86a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.227033 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.227059 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.227068 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.227078 4947 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:36 crc 
kubenswrapper[4947]: I1203 07:09:36.236999 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "381b76ac-1ce2-4745-8e27-1398b83ee86a" (UID: "381b76ac-1ce2-4745-8e27-1398b83ee86a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.328142 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/381b76ac-1ce2-4745-8e27-1398b83ee86a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.331520 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" event={"ID":"381b76ac-1ce2-4745-8e27-1398b83ee86a","Type":"ContainerDied","Data":"cb15f5f6a2656eff041a9539c65bab6633cd94f7f5330074b88fe20c1ab9a4f7"} Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.331565 4947 scope.go:117] "RemoveContainer" containerID="40b240e4fbf74a9ff20f6c10c9d36360e68597fec6bdc52297632f6094c8a1e0" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.331682 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-849ff95dc5-p27ds" Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.338847 4947 generic.go:334] "Generic (PLEG): container finished" podID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerID="ddb9ef11798c34d77e0d97ef44acce8f8c0246b9349e43220f9c73f9895075aa" exitCode=0 Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.339199 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"18a8b60d-0ea3-40ce-be39-269c1f62080c","Type":"ContainerDied","Data":"ddb9ef11798c34d77e0d97ef44acce8f8c0246b9349e43220f9c73f9895075aa"} Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.341743 4947 generic.go:334] "Generic (PLEG): container finished" podID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerID="b5da53d34c11626bb7af6025e1d48b7788af57875c4c0af890dd758ea0e40aca" exitCode=0 Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.341769 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f849bbbf6-tf4mc" event={"ID":"d2e58a7d-365d-4296-b18f-5d1bfe90b61f","Type":"ContainerDied","Data":"b5da53d34c11626bb7af6025e1d48b7788af57875c4c0af890dd758ea0e40aca"} Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.384916 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-p27ds"] Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.391937 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-849ff95dc5-p27ds"] Dec 03 07:09:36 crc kubenswrapper[4947]: I1203 07:09:36.396718 4947 scope.go:117] "RemoveContainer" containerID="c64ef254c2f2db085d3154cdb62f028dac0d5035511cd98cbc2fa5c0f19f09f9" Dec 03 07:09:37 crc kubenswrapper[4947]: I1203 07:09:37.094657 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="381b76ac-1ce2-4745-8e27-1398b83ee86a" path="/var/lib/kubelet/pods/381b76ac-1ce2-4745-8e27-1398b83ee86a/volumes" Dec 03 07:09:37 crc kubenswrapper[4947]: I1203 
07:09:37.232969 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.382300 4947 generic.go:334] "Generic (PLEG): container finished" podID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerID="709315361d917f44e800602797bbf94a88212d69d3aeae1f71cc1a6209fadd62" exitCode=0 Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.382444 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"18a8b60d-0ea3-40ce-be39-269c1f62080c","Type":"ContainerDied","Data":"709315361d917f44e800602797bbf94a88212d69d3aeae1f71cc1a6209fadd62"} Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.491696 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 07:09:39 crc kubenswrapper[4947]: E1203 07:09:39.492285 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381b76ac-1ce2-4745-8e27-1398b83ee86a" containerName="init" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.492357 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="381b76ac-1ce2-4745-8e27-1398b83ee86a" containerName="init" Dec 03 07:09:39 crc kubenswrapper[4947]: E1203 07:09:39.492414 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381b76ac-1ce2-4745-8e27-1398b83ee86a" containerName="dnsmasq-dns" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.492465 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="381b76ac-1ce2-4745-8e27-1398b83ee86a" containerName="dnsmasq-dns" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.492701 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="381b76ac-1ce2-4745-8e27-1398b83ee86a" containerName="dnsmasq-dns" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.493306 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.496572 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.496681 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.496578 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-rn58g" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.511636 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.682923 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.683000 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvl8q\" (UniqueName: \"kubernetes.io/projected/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-kube-api-access-mvl8q\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.683252 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.683455 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.758010 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.788206 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 07:09:39 crc kubenswrapper[4947]: E1203 07:09:39.789358 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-mvl8q openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.789829 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data\") pod \"18a8b60d-0ea3-40ce-be39-269c1f62080c\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.789875 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data-custom\") pod \"18a8b60d-0ea3-40ce-be39-269c1f62080c\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.789954 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfrhh\" (UniqueName: \"kubernetes.io/projected/18a8b60d-0ea3-40ce-be39-269c1f62080c-kube-api-access-nfrhh\") pod 
\"18a8b60d-0ea3-40ce-be39-269c1f62080c\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.789981 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18a8b60d-0ea3-40ce-be39-269c1f62080c-etc-machine-id\") pod \"18a8b60d-0ea3-40ce-be39-269c1f62080c\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.790000 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-combined-ca-bundle\") pod \"18a8b60d-0ea3-40ce-be39-269c1f62080c\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.790046 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-scripts\") pod \"18a8b60d-0ea3-40ce-be39-269c1f62080c\" (UID: \"18a8b60d-0ea3-40ce-be39-269c1f62080c\") " Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.790357 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.790431 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.790467 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.790498 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvl8q\" (UniqueName: \"kubernetes.io/projected/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-kube-api-access-mvl8q\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.797276 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18a8b60d-0ea3-40ce-be39-269c1f62080c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "18a8b60d-0ea3-40ce-be39-269c1f62080c" (UID: "18a8b60d-0ea3-40ce-be39-269c1f62080c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.798251 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.810330 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 07:09:39 crc kubenswrapper[4947]: E1203 07:09:39.815950 4947 projected.go:194] Error preparing data for projected volume kube-api-access-mvl8q for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 03 07:09:39 crc kubenswrapper[4947]: E1203 07:09:39.816029 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-kube-api-access-mvl8q podName:c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b nodeName:}" failed. No retries permitted until 2025-12-03 07:09:40.316000118 +0000 UTC m=+1241.576954544 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mvl8q" (UniqueName: "kubernetes.io/projected/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-kube-api-access-mvl8q") pod "openstackclient" (UID: "c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: User "system:node:crc" cannot create resource "serviceaccounts/token" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.816943 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a8b60d-0ea3-40ce-be39-269c1f62080c-kube-api-access-nfrhh" (OuterVolumeSpecName: "kube-api-access-nfrhh") pod "18a8b60d-0ea3-40ce-be39-269c1f62080c" (UID: "18a8b60d-0ea3-40ce-be39-269c1f62080c"). InnerVolumeSpecName "kube-api-access-nfrhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.818688 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "18a8b60d-0ea3-40ce-be39-269c1f62080c" (UID: "18a8b60d-0ea3-40ce-be39-269c1f62080c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.827694 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-scripts" (OuterVolumeSpecName: "scripts") pod "18a8b60d-0ea3-40ce-be39-269c1f62080c" (UID: "18a8b60d-0ea3-40ce-be39-269c1f62080c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.829038 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config-secret\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.857077 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.893396 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfrhh\" (UniqueName: \"kubernetes.io/projected/18a8b60d-0ea3-40ce-be39-269c1f62080c-kube-api-access-nfrhh\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.893431 4947 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/18a8b60d-0ea3-40ce-be39-269c1f62080c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.893441 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.893450 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.933655 4947 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/openstackclient"] Dec 03 07:09:39 crc kubenswrapper[4947]: E1203 07:09:39.934051 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerName="cinder-scheduler" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.934069 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerName="cinder-scheduler" Dec 03 07:09:39 crc kubenswrapper[4947]: E1203 07:09:39.934088 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerName="probe" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.934095 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerName="probe" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.934265 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerName="probe" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.934282 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a8b60d-0ea3-40ce-be39-269c1f62080c" containerName="cinder-scheduler" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.934932 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.958995 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.963654 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18a8b60d-0ea3-40ce-be39-269c1f62080c" (UID: "18a8b60d-0ea3-40ce-be39-269c1f62080c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.994456 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-openstack-config-secret\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.994524 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl5hq\" (UniqueName: \"kubernetes.io/projected/28892119-e165-46ea-a903-08207e491378-kube-api-access-bl5hq\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.994579 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28892119-e165-46ea-a903-08207e491378-openstack-config\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.994618 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-combined-ca-bundle\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:39 crc kubenswrapper[4947]: I1203 07:09:39.994674 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.046611 4947 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data" (OuterVolumeSpecName: "config-data") pod "18a8b60d-0ea3-40ce-be39-269c1f62080c" (UID: "18a8b60d-0ea3-40ce-be39-269c1f62080c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.095825 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-openstack-config-secret\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.095899 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl5hq\" (UniqueName: \"kubernetes.io/projected/28892119-e165-46ea-a903-08207e491378-kube-api-access-bl5hq\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.095955 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28892119-e165-46ea-a903-08207e491378-openstack-config\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.095991 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-combined-ca-bundle\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.096034 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/18a8b60d-0ea3-40ce-be39-269c1f62080c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.098369 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28892119-e165-46ea-a903-08207e491378-openstack-config\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.100145 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-openstack-config-secret\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.100590 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-combined-ca-bundle\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.117871 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl5hq\" (UniqueName: \"kubernetes.io/projected/28892119-e165-46ea-a903-08207e491378-kube-api-access-bl5hq\") pod \"openstackclient\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.276759 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.402393 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvl8q\" (UniqueName: \"kubernetes.io/projected/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-kube-api-access-mvl8q\") pod \"openstackclient\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: E1203 07:09:40.404981 4947 projected.go:194] Error preparing data for projected volume kube-api-access-mvl8q for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b) does not match the UID in record. The object might have been deleted and then recreated Dec 03 07:09:40 crc kubenswrapper[4947]: E1203 07:09:40.406168 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-kube-api-access-mvl8q podName:c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b nodeName:}" failed. No retries permitted until 2025-12-03 07:09:41.40614996 +0000 UTC m=+1242.667104386 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-mvl8q" (UniqueName: "kubernetes.io/projected/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-kube-api-access-mvl8q") pod "openstackclient" (UID: "c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b) does not match the UID in record. The object might have been deleted and then recreated Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.419177 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.419607 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.419579 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"18a8b60d-0ea3-40ce-be39-269c1f62080c","Type":"ContainerDied","Data":"d8fbeeb275c514c957528815059558dbddf329e5e5a6a5b7d9ea7ebb8e5b6057"} Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.421680 4947 scope.go:117] "RemoveContainer" containerID="ddb9ef11798c34d77e0d97ef44acce8f8c0246b9349e43220f9c73f9895075aa" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.422119 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b" podUID="28892119-e165-46ea-a903-08207e491378" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.462274 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.481651 4947 scope.go:117] "RemoveContainer" containerID="709315361d917f44e800602797bbf94a88212d69d3aeae1f71cc1a6209fadd62" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.503545 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.510438 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-combined-ca-bundle\") pod \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.510482 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config-secret\") pod \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.510544 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config\") pod \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\" (UID: \"c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b\") " Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.510906 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvl8q\" (UniqueName: \"kubernetes.io/projected/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-kube-api-access-mvl8q\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.511359 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config" (OuterVolumeSpecName: 
"openstack-config") pod "c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b" (UID: "c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.518808 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b" (UID: "c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.557064 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b" (UID: "c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.560536 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.584137 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.590075 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.590867 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.606598 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.613066 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.613145 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.613179 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvzxl\" (UniqueName: \"kubernetes.io/projected/49caa6da-3c98-4c49-ab22-62121ff908cf-kube-api-access-cvzxl\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.613200 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49caa6da-3c98-4c49-ab22-62121ff908cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 
07:09:40.613246 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.613277 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.613318 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.613331 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.613342 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.715646 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvzxl\" (UniqueName: \"kubernetes.io/projected/49caa6da-3c98-4c49-ab22-62121ff908cf-kube-api-access-cvzxl\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.715692 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49caa6da-3c98-4c49-ab22-62121ff908cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.715755 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.715791 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.715828 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.715879 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.717586 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/49caa6da-3c98-4c49-ab22-62121ff908cf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.719666 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.720067 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.720981 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.727936 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.738081 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvzxl\" (UniqueName: \"kubernetes.io/projected/49caa6da-3c98-4c49-ab22-62121ff908cf-kube-api-access-cvzxl\") pod \"cinder-scheduler-0\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " pod="openstack/cinder-scheduler-0" Dec 
03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.819177 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 07:09:40 crc kubenswrapper[4947]: I1203 07:09:40.946556 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.095980 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a8b60d-0ea3-40ce-be39-269c1f62080c" path="/var/lib/kubelet/pods/18a8b60d-0ea3-40ce-be39-269c1f62080c/volumes" Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.096781 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b" path="/var/lib/kubelet/pods/c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b/volumes" Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.422745 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.431750 4947 generic.go:334] "Generic (PLEG): container finished" podID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerID="4e1056f9dc24c16b8391f6730abe97c8c00219da69df68bd522af5ece0d4a512" exitCode=0 Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.431828 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f849bbbf6-tf4mc" event={"ID":"d2e58a7d-365d-4296-b18f-5d1bfe90b61f","Type":"ContainerDied","Data":"4e1056f9dc24c16b8391f6730abe97c8c00219da69df68bd522af5ece0d4a512"} Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.436655 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.436980 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"28892119-e165-46ea-a903-08207e491378","Type":"ContainerStarted","Data":"53db952ba8195f9ef5b205d48683416dc07320bacbe06d84d0cfd4984a9e69a2"} Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.584938 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c0f8cc6f-4c42-4317-a0f7-e7d24cb8400b" podUID="28892119-e165-46ea-a903-08207e491378" Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.640438 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.707593 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.779266 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f76bbf57b-xqff8"] Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.780202 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f76bbf57b-xqff8" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api-log" containerID="cri-o://c0da3ad03e90ebc0e594700ba50b9c8a222500604c0d1efcb5b7f081efbdace6" gracePeriod=30 Dec 03 07:09:41 crc kubenswrapper[4947]: I1203 07:09:41.781214 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f76bbf57b-xqff8" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api" containerID="cri-o://acf01c9bf5d5aba52071c93afce6a2c5fd26f396ce1a3fab1750d45a6798e5c3" gracePeriod=30 Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.087137 4947 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.248093 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-ovndb-tls-certs\") pod \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.248187 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-httpd-config\") pod \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.248208 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-combined-ca-bundle\") pod \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.248362 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbpf8\" (UniqueName: \"kubernetes.io/projected/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-kube-api-access-qbpf8\") pod \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.248403 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-config\") pod \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\" (UID: \"d2e58a7d-365d-4296-b18f-5d1bfe90b61f\") " Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.257614 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-kube-api-access-qbpf8" (OuterVolumeSpecName: "kube-api-access-qbpf8") pod "d2e58a7d-365d-4296-b18f-5d1bfe90b61f" (UID: "d2e58a7d-365d-4296-b18f-5d1bfe90b61f"). InnerVolumeSpecName "kube-api-access-qbpf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.257918 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d2e58a7d-365d-4296-b18f-5d1bfe90b61f" (UID: "d2e58a7d-365d-4296-b18f-5d1bfe90b61f"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.350167 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.350203 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbpf8\" (UniqueName: \"kubernetes.io/projected/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-kube-api-access-qbpf8\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.413138 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2e58a7d-365d-4296-b18f-5d1bfe90b61f" (UID: "d2e58a7d-365d-4296-b18f-5d1bfe90b61f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.444821 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-config" (OuterVolumeSpecName: "config") pod "d2e58a7d-365d-4296-b18f-5d1bfe90b61f" (UID: "d2e58a7d-365d-4296-b18f-5d1bfe90b61f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.449734 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d2e58a7d-365d-4296-b18f-5d1bfe90b61f" (UID: "d2e58a7d-365d-4296-b18f-5d1bfe90b61f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.452128 4947 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.452160 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.452175 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2e58a7d-365d-4296-b18f-5d1bfe90b61f-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.529146 4947 generic.go:334] "Generic (PLEG): container finished" podID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerID="c0da3ad03e90ebc0e594700ba50b9c8a222500604c0d1efcb5b7f081efbdace6" exitCode=143 Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 
07:09:42.531910 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5f849bbbf6-tf4mc" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.532065 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f76bbf57b-xqff8" event={"ID":"44491e74-cc03-4920-9a13-dd45fdc80a72","Type":"ContainerDied","Data":"c0da3ad03e90ebc0e594700ba50b9c8a222500604c0d1efcb5b7f081efbdace6"} Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.532133 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5f849bbbf6-tf4mc" event={"ID":"d2e58a7d-365d-4296-b18f-5d1bfe90b61f","Type":"ContainerDied","Data":"a4aec8de6845f3239f5c93499c3d652accbe96dd8e72975e08a2818495982e3a"} Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.532162 4947 scope.go:117] "RemoveContainer" containerID="b5da53d34c11626bb7af6025e1d48b7788af57875c4c0af890dd758ea0e40aca" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.549821 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49caa6da-3c98-4c49-ab22-62121ff908cf","Type":"ContainerStarted","Data":"23fd79e84a8a0d20a65d598b706fa955381b97ec7814c70275bf4b3eb633dcce"} Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.549906 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49caa6da-3c98-4c49-ab22-62121ff908cf","Type":"ContainerStarted","Data":"d040ac489ceeebebbd6c8f3183918ed793779fdf3503c2652969f88e0af4c4fa"} Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.582687 4947 scope.go:117] "RemoveContainer" containerID="4e1056f9dc24c16b8391f6730abe97c8c00219da69df68bd522af5ece0d4a512" Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.600358 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5f849bbbf6-tf4mc"] Dec 03 07:09:42 crc kubenswrapper[4947]: I1203 07:09:42.607414 4947 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/neutron-5f849bbbf6-tf4mc"] Dec 03 07:09:43 crc kubenswrapper[4947]: I1203 07:09:43.093759 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" path="/var/lib/kubelet/pods/d2e58a7d-365d-4296-b18f-5d1bfe90b61f/volumes" Dec 03 07:09:43 crc kubenswrapper[4947]: I1203 07:09:43.566811 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49caa6da-3c98-4c49-ab22-62121ff908cf","Type":"ContainerStarted","Data":"208ecb2e4a9580e581c1630c83f89daf84de2499a774647eff59bb9623f3cf65"} Dec 03 07:09:43 crc kubenswrapper[4947]: I1203 07:09:43.592032 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.5920158669999998 podStartE2EDuration="3.592015867s" podCreationTimestamp="2025-12-03 07:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:43.582960678 +0000 UTC m=+1244.843915104" watchObservedRunningTime="2025-12-03 07:09:43.592015867 +0000 UTC m=+1244.852970293" Dec 03 07:09:44 crc kubenswrapper[4947]: I1203 07:09:44.982532 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.417256 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f76bbf57b-xqff8" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:55886->10.217.0.160:9311: read: connection reset by peer" Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.417268 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f76bbf57b-xqff8" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api-log" 
probeResult="failure" output="Get \"http://10.217.0.160:9311/healthcheck\": read tcp 10.217.0.2:55902->10.217.0.160:9311: read: connection reset by peer" Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.595955 4947 generic.go:334] "Generic (PLEG): container finished" podID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerID="acf01c9bf5d5aba52071c93afce6a2c5fd26f396ce1a3fab1750d45a6798e5c3" exitCode=0 Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.596414 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f76bbf57b-xqff8" event={"ID":"44491e74-cc03-4920-9a13-dd45fdc80a72","Type":"ContainerDied","Data":"acf01c9bf5d5aba52071c93afce6a2c5fd26f396ce1a3fab1750d45a6798e5c3"} Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.909285 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.933543 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h62cg\" (UniqueName: \"kubernetes.io/projected/44491e74-cc03-4920-9a13-dd45fdc80a72-kube-api-access-h62cg\") pod \"44491e74-cc03-4920-9a13-dd45fdc80a72\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.933723 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data\") pod \"44491e74-cc03-4920-9a13-dd45fdc80a72\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.933831 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data-custom\") pod \"44491e74-cc03-4920-9a13-dd45fdc80a72\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " Dec 03 
07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.933862 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-combined-ca-bundle\") pod \"44491e74-cc03-4920-9a13-dd45fdc80a72\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.933942 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44491e74-cc03-4920-9a13-dd45fdc80a72-logs\") pod \"44491e74-cc03-4920-9a13-dd45fdc80a72\" (UID: \"44491e74-cc03-4920-9a13-dd45fdc80a72\") " Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.934869 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44491e74-cc03-4920-9a13-dd45fdc80a72-logs" (OuterVolumeSpecName: "logs") pod "44491e74-cc03-4920-9a13-dd45fdc80a72" (UID: "44491e74-cc03-4920-9a13-dd45fdc80a72"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.947450 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.948444 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44491e74-cc03-4920-9a13-dd45fdc80a72-kube-api-access-h62cg" (OuterVolumeSpecName: "kube-api-access-h62cg") pod "44491e74-cc03-4920-9a13-dd45fdc80a72" (UID: "44491e74-cc03-4920-9a13-dd45fdc80a72"). InnerVolumeSpecName "kube-api-access-h62cg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.961964 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "44491e74-cc03-4920-9a13-dd45fdc80a72" (UID: "44491e74-cc03-4920-9a13-dd45fdc80a72"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:45 crc kubenswrapper[4947]: I1203 07:09:45.966965 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44491e74-cc03-4920-9a13-dd45fdc80a72" (UID: "44491e74-cc03-4920-9a13-dd45fdc80a72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.008095 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data" (OuterVolumeSpecName: "config-data") pod "44491e74-cc03-4920-9a13-dd45fdc80a72" (UID: "44491e74-cc03-4920-9a13-dd45fdc80a72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.035825 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.035858 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.035868 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/44491e74-cc03-4920-9a13-dd45fdc80a72-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.035879 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h62cg\" (UniqueName: \"kubernetes.io/projected/44491e74-cc03-4920-9a13-dd45fdc80a72-kube-api-access-h62cg\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.035890 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44491e74-cc03-4920-9a13-dd45fdc80a72-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.608347 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f76bbf57b-xqff8" event={"ID":"44491e74-cc03-4920-9a13-dd45fdc80a72","Type":"ContainerDied","Data":"146a5f784ba6512a4b2d9d2cfe7cde91876292056a4d3f93a47a3ef343d4a360"} Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.608589 4947 scope.go:117] "RemoveContainer" containerID="acf01c9bf5d5aba52071c93afce6a2c5fd26f396ce1a3fab1750d45a6798e5c3" Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.608451 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f76bbf57b-xqff8" Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.660555 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f76bbf57b-xqff8"] Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.668035 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7f76bbf57b-xqff8"] Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.879190 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.879558 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="ceilometer-central-agent" containerID="cri-o://c160cfd5c71b2291c7347c163364a20cda48c438a7c41328117d8d8cb217a442" gracePeriod=30 Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.879614 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="proxy-httpd" containerID="cri-o://df76336b36c90d537c03130eb2ed5f3b7320b066919dea5df4667e3191f5e1db" gracePeriod=30 Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.879692 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="sg-core" containerID="cri-o://f667c287f4884fc39a6408c862e2612ec32c1352d5f89338f61c8b71168745ef" gracePeriod=30 Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 07:09:46.879749 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="ceilometer-notification-agent" containerID="cri-o://f301cef4448a9a3bc7351f13982546d0b0e78a6d82a42cf4eee6ea3729b5da0b" gracePeriod=30 Dec 03 07:09:46 crc kubenswrapper[4947]: I1203 
07:09:46.887929 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": EOF" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.007517 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6ddbf97597-l6hz9"] Dec 03 07:09:47 crc kubenswrapper[4947]: E1203 07:09:47.008256 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerName="neutron-httpd" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.008281 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerName="neutron-httpd" Dec 03 07:09:47 crc kubenswrapper[4947]: E1203 07:09:47.008297 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.008307 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api" Dec 03 07:09:47 crc kubenswrapper[4947]: E1203 07:09:47.008330 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerName="neutron-api" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.008340 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerName="neutron-api" Dec 03 07:09:47 crc kubenswrapper[4947]: E1203 07:09:47.008369 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api-log" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.008377 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api-log" Dec 03 07:09:47 crc 
kubenswrapper[4947]: I1203 07:09:47.008609 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api-log" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.008623 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerName="neutron-httpd" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.008636 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2e58a7d-365d-4296-b18f-5d1bfe90b61f" containerName="neutron-api" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.008654 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" containerName="barbican-api" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.009837 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.011411 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.011802 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.014814 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.033759 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6ddbf97597-l6hz9"] Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.050524 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-kube-api-access-stpzs\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: 
\"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.050576 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-internal-tls-certs\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.050617 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-run-httpd\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.050654 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-public-tls-certs\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.050689 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-log-httpd\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.050785 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-combined-ca-bundle\") pod 
\"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.050816 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.050927 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-config-data\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.100617 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44491e74-cc03-4920-9a13-dd45fdc80a72" path="/var/lib/kubelet/pods/44491e74-cc03-4920-9a13-dd45fdc80a72/volumes" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.153175 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-public-tls-certs\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.153239 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-log-httpd\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 
07:09:47.153303 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-combined-ca-bundle\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.153332 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.153419 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-config-data\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.153509 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-kube-api-access-stpzs\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.153539 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-internal-tls-certs\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.153569 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-run-httpd\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.154135 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-run-httpd\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.154825 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-log-httpd\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.159649 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.160982 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-public-tls-certs\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.161227 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-combined-ca-bundle\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.163153 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-internal-tls-certs\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.173376 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-config-data\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.180760 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-kube-api-access-stpzs\") pod \"swift-proxy-6ddbf97597-l6hz9\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.335187 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.621017 4947 generic.go:334] "Generic (PLEG): container finished" podID="79f30107-717d-402f-b501-8911cede9cc1" containerID="df76336b36c90d537c03130eb2ed5f3b7320b066919dea5df4667e3191f5e1db" exitCode=0 Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.621055 4947 generic.go:334] "Generic (PLEG): container finished" podID="79f30107-717d-402f-b501-8911cede9cc1" containerID="f667c287f4884fc39a6408c862e2612ec32c1352d5f89338f61c8b71168745ef" exitCode=2 Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.621066 4947 generic.go:334] "Generic (PLEG): container finished" podID="79f30107-717d-402f-b501-8911cede9cc1" containerID="c160cfd5c71b2291c7347c163364a20cda48c438a7c41328117d8d8cb217a442" exitCode=0 Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.621089 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerDied","Data":"df76336b36c90d537c03130eb2ed5f3b7320b066919dea5df4667e3191f5e1db"} Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.621116 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerDied","Data":"f667c287f4884fc39a6408c862e2612ec32c1352d5f89338f61c8b71168745ef"} Dec 03 07:09:47 crc kubenswrapper[4947]: I1203 07:09:47.621130 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerDied","Data":"c160cfd5c71b2291c7347c163364a20cda48c438a7c41328117d8d8cb217a442"} Dec 03 07:09:50 crc kubenswrapper[4947]: I1203 07:09:50.835338 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:50 crc kubenswrapper[4947]: I1203 07:09:50.836019 4947 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f251bdf4-6211-48a9-8578-5acd9fb17c15" containerName="glance-log" containerID="cri-o://e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66" gracePeriod=30 Dec 03 07:09:50 crc kubenswrapper[4947]: I1203 07:09:50.836244 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f251bdf4-6211-48a9-8578-5acd9fb17c15" containerName="glance-httpd" containerID="cri-o://ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd" gracePeriod=30 Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.149931 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.385578 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9xgkg"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.386723 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.396647 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9xgkg"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.438653 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.156:3000/\": dial tcp 10.217.0.156:3000: connect: connection refused" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.443883 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b13d-account-create-update-k6qnx"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.445289 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.446922 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.460556 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-wnv6s"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.463131 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.479399 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b13d-account-create-update-k6qnx"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.488692 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wnv6s"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.522187 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/941b0633-5ae9-4203-ad2e-8f4644056d2b-operator-scripts\") pod \"nova-api-db-create-9xgkg\" (UID: \"941b0633-5ae9-4203-ad2e-8f4644056d2b\") " pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.522599 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68pld\" (UniqueName: \"kubernetes.io/projected/941b0633-5ae9-4203-ad2e-8f4644056d2b-kube-api-access-68pld\") pod \"nova-api-db-create-9xgkg\" (UID: \"941b0633-5ae9-4203-ad2e-8f4644056d2b\") " pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.615908 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-kgjrr"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.617635 4947 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.624321 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68pld\" (UniqueName: \"kubernetes.io/projected/941b0633-5ae9-4203-ad2e-8f4644056d2b-kube-api-access-68pld\") pod \"nova-api-db-create-9xgkg\" (UID: \"941b0633-5ae9-4203-ad2e-8f4644056d2b\") " pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.624397 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e07d2dc5-7d24-4294-8623-7b50b59c2135-operator-scripts\") pod \"nova-cell0-db-create-wnv6s\" (UID: \"e07d2dc5-7d24-4294-8623-7b50b59c2135\") " pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.624434 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl5zf\" (UniqueName: \"kubernetes.io/projected/e07d2dc5-7d24-4294-8623-7b50b59c2135-kube-api-access-zl5zf\") pod \"nova-cell0-db-create-wnv6s\" (UID: \"e07d2dc5-7d24-4294-8623-7b50b59c2135\") " pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.624540 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba16a83-33ec-44b8-9494-8a5efa7e59e7-operator-scripts\") pod \"nova-api-b13d-account-create-update-k6qnx\" (UID: \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\") " pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.624574 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tmjs\" (UniqueName: 
\"kubernetes.io/projected/eba16a83-33ec-44b8-9494-8a5efa7e59e7-kube-api-access-2tmjs\") pod \"nova-api-b13d-account-create-update-k6qnx\" (UID: \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\") " pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.625030 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/941b0633-5ae9-4203-ad2e-8f4644056d2b-operator-scripts\") pod \"nova-api-db-create-9xgkg\" (UID: \"941b0633-5ae9-4203-ad2e-8f4644056d2b\") " pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.626014 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/941b0633-5ae9-4203-ad2e-8f4644056d2b-operator-scripts\") pod \"nova-api-db-create-9xgkg\" (UID: \"941b0633-5ae9-4203-ad2e-8f4644056d2b\") " pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.637613 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kgjrr"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.647361 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68pld\" (UniqueName: \"kubernetes.io/projected/941b0633-5ae9-4203-ad2e-8f4644056d2b-kube-api-access-68pld\") pod \"nova-api-db-create-9xgkg\" (UID: \"941b0633-5ae9-4203-ad2e-8f4644056d2b\") " pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.655441 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c18d-account-create-update-ss77z"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.656793 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.663525 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.684871 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c18d-account-create-update-ss77z"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.691646 4947 generic.go:334] "Generic (PLEG): container finished" podID="f251bdf4-6211-48a9-8578-5acd9fb17c15" containerID="e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66" exitCode=143 Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.691722 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f251bdf4-6211-48a9-8578-5acd9fb17c15","Type":"ContainerDied","Data":"e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66"} Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.713154 4947 generic.go:334] "Generic (PLEG): container finished" podID="79f30107-717d-402f-b501-8911cede9cc1" containerID="f301cef4448a9a3bc7351f13982546d0b0e78a6d82a42cf4eee6ea3729b5da0b" exitCode=0 Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.713209 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerDied","Data":"f301cef4448a9a3bc7351f13982546d0b0e78a6d82a42cf4eee6ea3729b5da0b"} Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.725181 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.727812 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsqmg\" (UniqueName: \"kubernetes.io/projected/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-kube-api-access-qsqmg\") pod \"nova-cell1-db-create-kgjrr\" (UID: \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\") " pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.727864 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba16a83-33ec-44b8-9494-8a5efa7e59e7-operator-scripts\") pod \"nova-api-b13d-account-create-update-k6qnx\" (UID: \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\") " pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.727884 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tmjs\" (UniqueName: \"kubernetes.io/projected/eba16a83-33ec-44b8-9494-8a5efa7e59e7-kube-api-access-2tmjs\") pod \"nova-api-b13d-account-create-update-k6qnx\" (UID: \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\") " pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.727931 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c45c77c-1306-4434-9b1f-9f6d45522d6a-operator-scripts\") pod \"nova-cell0-c18d-account-create-update-ss77z\" (UID: \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\") " pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.727954 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-operator-scripts\") pod \"nova-cell1-db-create-kgjrr\" (UID: \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\") " pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.728032 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e07d2dc5-7d24-4294-8623-7b50b59c2135-operator-scripts\") pod \"nova-cell0-db-create-wnv6s\" (UID: \"e07d2dc5-7d24-4294-8623-7b50b59c2135\") " pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.728071 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl5zf\" (UniqueName: \"kubernetes.io/projected/e07d2dc5-7d24-4294-8623-7b50b59c2135-kube-api-access-zl5zf\") pod \"nova-cell0-db-create-wnv6s\" (UID: \"e07d2dc5-7d24-4294-8623-7b50b59c2135\") " pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.728117 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b4wk\" (UniqueName: \"kubernetes.io/projected/2c45c77c-1306-4434-9b1f-9f6d45522d6a-kube-api-access-5b4wk\") pod \"nova-cell0-c18d-account-create-update-ss77z\" (UID: \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\") " pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.728985 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba16a83-33ec-44b8-9494-8a5efa7e59e7-operator-scripts\") pod \"nova-api-b13d-account-create-update-k6qnx\" (UID: \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\") " pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.729741 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e07d2dc5-7d24-4294-8623-7b50b59c2135-operator-scripts\") pod \"nova-cell0-db-create-wnv6s\" (UID: \"e07d2dc5-7d24-4294-8623-7b50b59c2135\") " pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.754240 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tmjs\" (UniqueName: \"kubernetes.io/projected/eba16a83-33ec-44b8-9494-8a5efa7e59e7-kube-api-access-2tmjs\") pod \"nova-api-b13d-account-create-update-k6qnx\" (UID: \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\") " pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.757753 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl5zf\" (UniqueName: \"kubernetes.io/projected/e07d2dc5-7d24-4294-8623-7b50b59c2135-kube-api-access-zl5zf\") pod \"nova-cell0-db-create-wnv6s\" (UID: \"e07d2dc5-7d24-4294-8623-7b50b59c2135\") " pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.769444 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.785094 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.827241 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-56ad-account-create-update-d94hj"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.828716 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.829514 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c45c77c-1306-4434-9b1f-9f6d45522d6a-operator-scripts\") pod \"nova-cell0-c18d-account-create-update-ss77z\" (UID: \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\") " pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.829561 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-operator-scripts\") pod \"nova-cell1-db-create-kgjrr\" (UID: \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\") " pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.829677 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b4wk\" (UniqueName: \"kubernetes.io/projected/2c45c77c-1306-4434-9b1f-9f6d45522d6a-kube-api-access-5b4wk\") pod \"nova-cell0-c18d-account-create-update-ss77z\" (UID: \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\") " pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.829724 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsqmg\" (UniqueName: \"kubernetes.io/projected/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-kube-api-access-qsqmg\") pod \"nova-cell1-db-create-kgjrr\" (UID: \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\") " pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.830350 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-operator-scripts\") pod 
\"nova-cell1-db-create-kgjrr\" (UID: \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\") " pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.830552 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c45c77c-1306-4434-9b1f-9f6d45522d6a-operator-scripts\") pod \"nova-cell0-c18d-account-create-update-ss77z\" (UID: \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\") " pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.830878 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.845978 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-56ad-account-create-update-d94hj"] Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.848137 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsqmg\" (UniqueName: \"kubernetes.io/projected/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-kube-api-access-qsqmg\") pod \"nova-cell1-db-create-kgjrr\" (UID: \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\") " pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.865351 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b4wk\" (UniqueName: \"kubernetes.io/projected/2c45c77c-1306-4434-9b1f-9f6d45522d6a-kube-api-access-5b4wk\") pod \"nova-cell0-c18d-account-create-update-ss77z\" (UID: \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\") " pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.930937 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d77a00-5a8b-4ac6-a764-c558f079bd04-operator-scripts\") pod 
\"nova-cell1-56ad-account-create-update-d94hj\" (UID: \"72d77a00-5a8b-4ac6-a764-c558f079bd04\") " pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.931585 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwg2r\" (UniqueName: \"kubernetes.io/projected/72d77a00-5a8b-4ac6-a764-c558f079bd04-kube-api-access-hwg2r\") pod \"nova-cell1-56ad-account-create-update-d94hj\" (UID: \"72d77a00-5a8b-4ac6-a764-c558f079bd04\") " pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:51 crc kubenswrapper[4947]: I1203 07:09:51.952893 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.001179 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.032818 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d77a00-5a8b-4ac6-a764-c558f079bd04-operator-scripts\") pod \"nova-cell1-56ad-account-create-update-d94hj\" (UID: \"72d77a00-5a8b-4ac6-a764-c558f079bd04\") " pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.032925 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwg2r\" (UniqueName: \"kubernetes.io/projected/72d77a00-5a8b-4ac6-a764-c558f079bd04-kube-api-access-hwg2r\") pod \"nova-cell1-56ad-account-create-update-d94hj\" (UID: \"72d77a00-5a8b-4ac6-a764-c558f079bd04\") " pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.033933 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d77a00-5a8b-4ac6-a764-c558f079bd04-operator-scripts\") pod \"nova-cell1-56ad-account-create-update-d94hj\" (UID: \"72d77a00-5a8b-4ac6-a764-c558f079bd04\") " pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.047825 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwg2r\" (UniqueName: \"kubernetes.io/projected/72d77a00-5a8b-4ac6-a764-c558f079bd04-kube-api-access-hwg2r\") pod \"nova-cell1-56ad-account-create-update-d94hj\" (UID: \"72d77a00-5a8b-4ac6-a764-c558f079bd04\") " pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.146425 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.369838 4947 scope.go:117] "RemoveContainer" containerID="c0da3ad03e90ebc0e594700ba50b9c8a222500604c0d1efcb5b7f081efbdace6" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.858997 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.946294 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddgcd\" (UniqueName: \"kubernetes.io/projected/79f30107-717d-402f-b501-8911cede9cc1-kube-api-access-ddgcd\") pod \"79f30107-717d-402f-b501-8911cede9cc1\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.946613 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-sg-core-conf-yaml\") pod \"79f30107-717d-402f-b501-8911cede9cc1\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.946664 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-combined-ca-bundle\") pod \"79f30107-717d-402f-b501-8911cede9cc1\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.946692 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-log-httpd\") pod \"79f30107-717d-402f-b501-8911cede9cc1\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.946720 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-config-data\") pod \"79f30107-717d-402f-b501-8911cede9cc1\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.946766 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-run-httpd\") pod \"79f30107-717d-402f-b501-8911cede9cc1\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.946825 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-scripts\") pod \"79f30107-717d-402f-b501-8911cede9cc1\" (UID: \"79f30107-717d-402f-b501-8911cede9cc1\") " Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.947556 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "79f30107-717d-402f-b501-8911cede9cc1" (UID: "79f30107-717d-402f-b501-8911cede9cc1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.947910 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "79f30107-717d-402f-b501-8911cede9cc1" (UID: "79f30107-717d-402f-b501-8911cede9cc1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.957622 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-scripts" (OuterVolumeSpecName: "scripts") pod "79f30107-717d-402f-b501-8911cede9cc1" (UID: "79f30107-717d-402f-b501-8911cede9cc1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.972702 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f30107-717d-402f-b501-8911cede9cc1-kube-api-access-ddgcd" (OuterVolumeSpecName: "kube-api-access-ddgcd") pod "79f30107-717d-402f-b501-8911cede9cc1" (UID: "79f30107-717d-402f-b501-8911cede9cc1"). InnerVolumeSpecName "kube-api-access-ddgcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:52 crc kubenswrapper[4947]: I1203 07:09:52.992875 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "79f30107-717d-402f-b501-8911cede9cc1" (UID: "79f30107-717d-402f-b501-8911cede9cc1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.049566 4947 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.049617 4947 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.049626 4947 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79f30107-717d-402f-b501-8911cede9cc1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.049634 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-scripts\") on node \"crc\" DevicePath \"\"" 
Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.049644 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddgcd\" (UniqueName: \"kubernetes.io/projected/79f30107-717d-402f-b501-8911cede9cc1-kube-api-access-ddgcd\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.061841 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79f30107-717d-402f-b501-8911cede9cc1" (UID: "79f30107-717d-402f-b501-8911cede9cc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.076848 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-config-data" (OuterVolumeSpecName: "config-data") pod "79f30107-717d-402f-b501-8911cede9cc1" (UID: "79f30107-717d-402f-b501-8911cede9cc1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.151777 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.151820 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f30107-717d-402f-b501-8911cede9cc1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.153200 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-56ad-account-create-update-d94hj"] Dec 03 07:09:53 crc kubenswrapper[4947]: W1203 07:09:53.158068 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72d77a00_5a8b_4ac6_a764_c558f079bd04.slice/crio-803475bbef4802a6d3f327a83e481ca23b1e1ca4198cee2091705fdf9393a2bd WatchSource:0}: Error finding container 803475bbef4802a6d3f327a83e481ca23b1e1ca4198cee2091705fdf9393a2bd: Status 404 returned error can't find the container with id 803475bbef4802a6d3f327a83e481ca23b1e1ca4198cee2091705fdf9393a2bd Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.245679 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6ddbf97597-l6hz9"] Dec 03 07:09:53 crc kubenswrapper[4947]: W1203 07:09:53.250388 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f87802c_3846_486e_a131_39a7fe336c96.slice/crio-4a3594817c6134092e02431f08ce019e2382e93c314324d8cb6c1e3448a88cdd WatchSource:0}: Error finding container 4a3594817c6134092e02431f08ce019e2382e93c314324d8cb6c1e3448a88cdd: Status 404 returned error can't find the container with id 
4a3594817c6134092e02431f08ce019e2382e93c314324d8cb6c1e3448a88cdd Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.384556 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-kgjrr"] Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.407750 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b13d-account-create-update-k6qnx"] Dec 03 07:09:53 crc kubenswrapper[4947]: W1203 07:09:53.415569 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeba16a83_33ec_44b8_9494_8a5efa7e59e7.slice/crio-c633d07c5473b6b160d0c6d55f9e538d7ae18d76ecf8d3222bae7fb8d861652e WatchSource:0}: Error finding container c633d07c5473b6b160d0c6d55f9e538d7ae18d76ecf8d3222bae7fb8d861652e: Status 404 returned error can't find the container with id c633d07c5473b6b160d0c6d55f9e538d7ae18d76ecf8d3222bae7fb8d861652e Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.425060 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9xgkg"] Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.618418 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c18d-account-create-update-ss77z"] Dec 03 07:09:53 crc kubenswrapper[4947]: W1203 07:09:53.621668 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode07d2dc5_7d24_4294_8623_7b50b59c2135.slice/crio-655fbeaede3f4d9dcd47f1ea06953b3cd067f3e1f8b773dc3b21490b2f07134d WatchSource:0}: Error finding container 655fbeaede3f4d9dcd47f1ea06953b3cd067f3e1f8b773dc3b21490b2f07134d: Status 404 returned error can't find the container with id 655fbeaede3f4d9dcd47f1ea06953b3cd067f3e1f8b773dc3b21490b2f07134d Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.630523 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-wnv6s"] Dec 03 
07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.759855 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"28892119-e165-46ea-a903-08207e491378","Type":"ContainerStarted","Data":"e942659fea577abdb40bc1687013b7988fc7bb6ef1815a0d789c1541253c9e6a"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.779753 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b13d-account-create-update-k6qnx" event={"ID":"eba16a83-33ec-44b8-9494-8a5efa7e59e7","Type":"ContainerStarted","Data":"15964c1303d9d7cbf96b97388f04d9d2a0066502b8675cf6bd31dd81efcad9d4"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.779797 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b13d-account-create-update-k6qnx" event={"ID":"eba16a83-33ec-44b8-9494-8a5efa7e59e7","Type":"ContainerStarted","Data":"c633d07c5473b6b160d0c6d55f9e538d7ae18d76ecf8d3222bae7fb8d861652e"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.805282 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-56ad-account-create-update-d94hj" event={"ID":"72d77a00-5a8b-4ac6-a764-c558f079bd04","Type":"ContainerStarted","Data":"11184cabbbb6be466af52f75261a132e0dea9e3c2152fd1f2181fdacb612abd2"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.805345 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-56ad-account-create-update-d94hj" event={"ID":"72d77a00-5a8b-4ac6-a764-c558f079bd04","Type":"ContainerStarted","Data":"803475bbef4802a6d3f327a83e481ca23b1e1ca4198cee2091705fdf9393a2bd"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.814035 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c18d-account-create-update-ss77z" event={"ID":"2c45c77c-1306-4434-9b1f-9f6d45522d6a","Type":"ContainerStarted","Data":"c259b51ffbcf3a80f7bbdb68581da336b6326187ac61c736c68aa3969c2e4285"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 
07:09:53.818417 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-b13d-account-create-update-k6qnx" podStartSLOduration=2.8184048170000002 podStartE2EDuration="2.818404817s" podCreationTimestamp="2025-12-03 07:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:53.816195587 +0000 UTC m=+1255.077150013" watchObservedRunningTime="2025-12-03 07:09:53.818404817 +0000 UTC m=+1255.079359243" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.827739 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.172103883 podStartE2EDuration="14.827724475s" podCreationTimestamp="2025-12-03 07:09:39 +0000 UTC" firstStartedPulling="2025-12-03 07:09:40.821207997 +0000 UTC m=+1242.082162423" lastFinishedPulling="2025-12-03 07:09:52.476828589 +0000 UTC m=+1253.737783015" observedRunningTime="2025-12-03 07:09:53.79600481 +0000 UTC m=+1255.056959236" watchObservedRunningTime="2025-12-03 07:09:53.827724475 +0000 UTC m=+1255.088678901" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.829265 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kgjrr" event={"ID":"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf","Type":"ContainerStarted","Data":"00d072c696fe0a03b60855d152210d0d79d395112c447d7edb863c26fd2e08db"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.829318 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kgjrr" event={"ID":"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf","Type":"ContainerStarted","Data":"5ae4badf977d1d7b1509aeac95d69cc83970647cd9a305278d2c09e477a3e442"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.836819 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-56ad-account-create-update-d94hj" 
podStartSLOduration=2.836802454 podStartE2EDuration="2.836802454s" podCreationTimestamp="2025-12-03 07:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:53.835143549 +0000 UTC m=+1255.096097975" watchObservedRunningTime="2025-12-03 07:09:53.836802454 +0000 UTC m=+1255.097756880" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.861929 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-kgjrr" podStartSLOduration=2.861907166 podStartE2EDuration="2.861907166s" podCreationTimestamp="2025-12-03 07:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:53.852755454 +0000 UTC m=+1255.113709880" watchObservedRunningTime="2025-12-03 07:09:53.861907166 +0000 UTC m=+1255.122861592" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.886687 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.886814 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79f30107-717d-402f-b501-8911cede9cc1","Type":"ContainerDied","Data":"f49ec8273910e28ffc99f6296b8020eedfe647ec2d713fb72d56e9960fcf312d"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.886873 4947 scope.go:117] "RemoveContainer" containerID="df76336b36c90d537c03130eb2ed5f3b7320b066919dea5df4667e3191f5e1db" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.908713 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xgkg" event={"ID":"941b0633-5ae9-4203-ad2e-8f4644056d2b","Type":"ContainerStarted","Data":"6100514238c35219823e96f507d19e92cdaf0f5f76bc0b651132a8cdfecdd83f"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.908767 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xgkg" event={"ID":"941b0633-5ae9-4203-ad2e-8f4644056d2b","Type":"ContainerStarted","Data":"ac5c6dcbdd3a064f0d4afc512c08c9f6ba5f50e96b15146499ce9de1ae763956"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.933428 4947 scope.go:117] "RemoveContainer" containerID="f667c287f4884fc39a6408c862e2612ec32c1352d5f89338f61c8b71168745ef" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.942584 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.944067 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6ddbf97597-l6hz9" event={"ID":"1f87802c-3846-486e-a131-39a7fe336c96","Type":"ContainerStarted","Data":"1803fa6075e7e974f61cf6cd7ebcdef0211618286da42aad5c1687fe396a9e93"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.944118 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6ddbf97597-l6hz9" 
event={"ID":"1f87802c-3846-486e-a131-39a7fe336c96","Type":"ContainerStarted","Data":"4a3594817c6134092e02431f08ce019e2382e93c314324d8cb6c1e3448a88cdd"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.944948 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.945029 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.954580 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.959328 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wnv6s" event={"ID":"e07d2dc5-7d24-4294-8623-7b50b59c2135","Type":"ContainerStarted","Data":"655fbeaede3f4d9dcd47f1ea06953b3cd067f3e1f8b773dc3b21490b2f07134d"} Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.975254 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:09:53 crc kubenswrapper[4947]: E1203 07:09:53.979378 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="proxy-httpd" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.979418 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="proxy-httpd" Dec 03 07:09:53 crc kubenswrapper[4947]: E1203 07:09:53.979453 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="ceilometer-central-agent" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.979461 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="ceilometer-central-agent" Dec 03 07:09:53 crc kubenswrapper[4947]: E1203 07:09:53.979471 4947 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="ceilometer-notification-agent" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.979476 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="ceilometer-notification-agent" Dec 03 07:09:53 crc kubenswrapper[4947]: E1203 07:09:53.979502 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="sg-core" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.979508 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="sg-core" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.979714 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="sg-core" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.979727 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="ceilometer-notification-agent" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.979747 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="proxy-httpd" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.979757 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f30107-717d-402f-b501-8911cede9cc1" containerName="ceilometer-central-agent" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.982265 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.983119 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-9xgkg" podStartSLOduration=2.983103536 podStartE2EDuration="2.983103536s" podCreationTimestamp="2025-12-03 07:09:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:53.966042006 +0000 UTC m=+1255.226996452" watchObservedRunningTime="2025-12-03 07:09:53.983103536 +0000 UTC m=+1255.244057962" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.986574 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:09:53 crc kubenswrapper[4947]: I1203 07:09:53.986871 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.005598 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.006452 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6ddbf97597-l6hz9" podStartSLOduration=8.006434088 podStartE2EDuration="8.006434088s" podCreationTimestamp="2025-12-03 07:09:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:54.002794928 +0000 UTC m=+1255.263749354" watchObservedRunningTime="2025-12-03 07:09:54.006434088 +0000 UTC m=+1255.267388514" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.013380 4947 scope.go:117] "RemoveContainer" containerID="f301cef4448a9a3bc7351f13982546d0b0e78a6d82a42cf4eee6ea3729b5da0b" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.055678 4947 scope.go:117] "RemoveContainer" 
containerID="c160cfd5c71b2291c7347c163364a20cda48c438a7c41328117d8d8cb217a442" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.072026 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-run-httpd\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.072378 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-config-data\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.072469 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-log-httpd\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.072604 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.072713 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 
07:09:54.073213 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfqwb\" (UniqueName: \"kubernetes.io/projected/e45e5af1-6468-419e-8400-58df1eb1ec64-kube-api-access-sfqwb\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.073644 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-scripts\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.174948 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfqwb\" (UniqueName: \"kubernetes.io/projected/e45e5af1-6468-419e-8400-58df1eb1ec64-kube-api-access-sfqwb\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.175206 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-scripts\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.175608 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-run-httpd\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.175757 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-config-data\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.176203 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-log-httpd\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.176409 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-run-httpd\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.176659 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-log-httpd\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.177047 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.177209 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.179456 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-scripts\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.183165 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.184592 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-config-data\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.194871 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.201395 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfqwb\" (UniqueName: \"kubernetes.io/projected/e45e5af1-6468-419e-8400-58df1eb1ec64-kube-api-access-sfqwb\") pod \"ceilometer-0\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.308574 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.638456 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.799088 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-public-tls-certs\") pod \"f251bdf4-6211-48a9-8578-5acd9fb17c15\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.799398 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-logs\") pod \"f251bdf4-6211-48a9-8578-5acd9fb17c15\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.799469 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-combined-ca-bundle\") pod \"f251bdf4-6211-48a9-8578-5acd9fb17c15\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.799503 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-httpd-run\") pod \"f251bdf4-6211-48a9-8578-5acd9fb17c15\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.799542 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gc8h\" (UniqueName: \"kubernetes.io/projected/f251bdf4-6211-48a9-8578-5acd9fb17c15-kube-api-access-9gc8h\") pod \"f251bdf4-6211-48a9-8578-5acd9fb17c15\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.799591 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-config-data\") pod \"f251bdf4-6211-48a9-8578-5acd9fb17c15\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.799612 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-scripts\") pod \"f251bdf4-6211-48a9-8578-5acd9fb17c15\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.799666 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"f251bdf4-6211-48a9-8578-5acd9fb17c15\" (UID: \"f251bdf4-6211-48a9-8578-5acd9fb17c15\") " Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.800086 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f251bdf4-6211-48a9-8578-5acd9fb17c15" (UID: "f251bdf4-6211-48a9-8578-5acd9fb17c15"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.800174 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-logs" (OuterVolumeSpecName: "logs") pod "f251bdf4-6211-48a9-8578-5acd9fb17c15" (UID: "f251bdf4-6211-48a9-8578-5acd9fb17c15"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.810980 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "f251bdf4-6211-48a9-8578-5acd9fb17c15" (UID: "f251bdf4-6211-48a9-8578-5acd9fb17c15"). 
InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.811145 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f251bdf4-6211-48a9-8578-5acd9fb17c15-kube-api-access-9gc8h" (OuterVolumeSpecName: "kube-api-access-9gc8h") pod "f251bdf4-6211-48a9-8578-5acd9fb17c15" (UID: "f251bdf4-6211-48a9-8578-5acd9fb17c15"). InnerVolumeSpecName "kube-api-access-9gc8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.813640 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-scripts" (OuterVolumeSpecName: "scripts") pod "f251bdf4-6211-48a9-8578-5acd9fb17c15" (UID: "f251bdf4-6211-48a9-8578-5acd9fb17c15"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.857187 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f251bdf4-6211-48a9-8578-5acd9fb17c15" (UID: "f251bdf4-6211-48a9-8578-5acd9fb17c15"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.857568 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-config-data" (OuterVolumeSpecName: "config-data") pod "f251bdf4-6211-48a9-8578-5acd9fb17c15" (UID: "f251bdf4-6211-48a9-8578-5acd9fb17c15"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.872188 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f251bdf4-6211-48a9-8578-5acd9fb17c15" (UID: "f251bdf4-6211-48a9-8578-5acd9fb17c15"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.896098 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:09:54 crc kubenswrapper[4947]: W1203 07:09:54.898101 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode45e5af1_6468_419e_8400_58df1eb1ec64.slice/crio-4e05fbe2fdaf5e4d65c8805a90fa321a8e90eca68912782dd893fbce3606c76d WatchSource:0}: Error finding container 4e05fbe2fdaf5e4d65c8805a90fa321a8e90eca68912782dd893fbce3606c76d: Status 404 returned error can't find the container with id 4e05fbe2fdaf5e4d65c8805a90fa321a8e90eca68912782dd893fbce3606c76d Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.901132 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.901162 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.901175 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:54 crc 
kubenswrapper[4947]: I1203 07:09:54.901186 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f251bdf4-6211-48a9-8578-5acd9fb17c15-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.901197 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gc8h\" (UniqueName: \"kubernetes.io/projected/f251bdf4-6211-48a9-8578-5acd9fb17c15-kube-api-access-9gc8h\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.901210 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.901220 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f251bdf4-6211-48a9-8578-5acd9fb17c15-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.901293 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.958769 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.969476 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6ddbf97597-l6hz9" event={"ID":"1f87802c-3846-486e-a131-39a7fe336c96","Type":"ContainerStarted","Data":"a6e1ead0fefc1c6dc26fdc188da65352aea353653dca0f86b588f1fdd21857ed"} Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.971079 4947 generic.go:334] "Generic (PLEG): container finished" podID="72d77a00-5a8b-4ac6-a764-c558f079bd04" 
containerID="11184cabbbb6be466af52f75261a132e0dea9e3c2152fd1f2181fdacb612abd2" exitCode=0 Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.971519 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-56ad-account-create-update-d94hj" event={"ID":"72d77a00-5a8b-4ac6-a764-c558f079bd04","Type":"ContainerDied","Data":"11184cabbbb6be466af52f75261a132e0dea9e3c2152fd1f2181fdacb612abd2"} Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.975361 4947 generic.go:334] "Generic (PLEG): container finished" podID="e07d2dc5-7d24-4294-8623-7b50b59c2135" containerID="d961d7a3cdd727c0e7fbdb61831a58352766133832f234d26b0ca308916654cf" exitCode=0 Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.975409 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wnv6s" event={"ID":"e07d2dc5-7d24-4294-8623-7b50b59c2135","Type":"ContainerDied","Data":"d961d7a3cdd727c0e7fbdb61831a58352766133832f234d26b0ca308916654cf"} Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.977415 4947 generic.go:334] "Generic (PLEG): container finished" podID="c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf" containerID="00d072c696fe0a03b60855d152210d0d79d395112c447d7edb863c26fd2e08db" exitCode=0 Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.977525 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kgjrr" event={"ID":"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf","Type":"ContainerDied","Data":"00d072c696fe0a03b60855d152210d0d79d395112c447d7edb863c26fd2e08db"} Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.978827 4947 generic.go:334] "Generic (PLEG): container finished" podID="2c45c77c-1306-4434-9b1f-9f6d45522d6a" containerID="76a4bdae9bb828027148c5b11ebef7f1e1e3ae1474d196cf4f90b8703f56d94a" exitCode=0 Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.978865 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c18d-account-create-update-ss77z" 
event={"ID":"2c45c77c-1306-4434-9b1f-9f6d45522d6a","Type":"ContainerDied","Data":"76a4bdae9bb828027148c5b11ebef7f1e1e3ae1474d196cf4f90b8703f56d94a"} Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.980834 4947 generic.go:334] "Generic (PLEG): container finished" podID="eba16a83-33ec-44b8-9494-8a5efa7e59e7" containerID="15964c1303d9d7cbf96b97388f04d9d2a0066502b8675cf6bd31dd81efcad9d4" exitCode=0 Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.980908 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b13d-account-create-update-k6qnx" event={"ID":"eba16a83-33ec-44b8-9494-8a5efa7e59e7","Type":"ContainerDied","Data":"15964c1303d9d7cbf96b97388f04d9d2a0066502b8675cf6bd31dd81efcad9d4"} Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.982148 4947 generic.go:334] "Generic (PLEG): container finished" podID="f251bdf4-6211-48a9-8578-5acd9fb17c15" containerID="ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd" exitCode=0 Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.982242 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.984056 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f251bdf4-6211-48a9-8578-5acd9fb17c15","Type":"ContainerDied","Data":"ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd"} Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.984104 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f251bdf4-6211-48a9-8578-5acd9fb17c15","Type":"ContainerDied","Data":"350fe646218587bd19d9b7b59b7e9133ac8857e3b1713daf1e8d6e8c61de1d40"} Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.984126 4947 scope.go:117] "RemoveContainer" containerID="ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd" Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.988028 4947 generic.go:334] "Generic (PLEG): container finished" podID="941b0633-5ae9-4203-ad2e-8f4644056d2b" containerID="6100514238c35219823e96f507d19e92cdaf0f5f76bc0b651132a8cdfecdd83f" exitCode=0 Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.988096 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xgkg" event={"ID":"941b0633-5ae9-4203-ad2e-8f4644056d2b","Type":"ContainerDied","Data":"6100514238c35219823e96f507d19e92cdaf0f5f76bc0b651132a8cdfecdd83f"} Dec 03 07:09:54 crc kubenswrapper[4947]: I1203 07:09:54.991958 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerStarted","Data":"4e05fbe2fdaf5e4d65c8805a90fa321a8e90eca68912782dd893fbce3606c76d"} Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.006455 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:55 
crc kubenswrapper[4947]: I1203 07:09:55.015454 4947 scope.go:117] "RemoveContainer" containerID="e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.047283 4947 scope.go:117] "RemoveContainer" containerID="ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd" Dec 03 07:09:55 crc kubenswrapper[4947]: E1203 07:09:55.047740 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd\": container with ID starting with ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd not found: ID does not exist" containerID="ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.047778 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd"} err="failed to get container status \"ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd\": rpc error: code = NotFound desc = could not find container \"ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd\": container with ID starting with ff908012798704d1cea033ccfe8f4688f75d889f71dae78525c4981c116cc1dd not found: ID does not exist" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.047811 4947 scope.go:117] "RemoveContainer" containerID="e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66" Dec 03 07:09:55 crc kubenswrapper[4947]: E1203 07:09:55.048198 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66\": container with ID starting with e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66 not found: ID does not exist" 
containerID="e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.048226 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66"} err="failed to get container status \"e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66\": rpc error: code = NotFound desc = could not find container \"e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66\": container with ID starting with e806c545c620a41f44b266fdb943dd7692498f3256e3eebbb60b7528e0a8dc66 not found: ID does not exist" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.092995 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f30107-717d-402f-b501-8911cede9cc1" path="/var/lib/kubelet/pods/79f30107-717d-402f-b501-8911cede9cc1/volumes" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.093890 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.100311 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.109145 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:55 crc kubenswrapper[4947]: E1203 07:09:55.110882 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f251bdf4-6211-48a9-8578-5acd9fb17c15" containerName="glance-log" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.110908 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f251bdf4-6211-48a9-8578-5acd9fb17c15" containerName="glance-log" Dec 03 07:09:55 crc kubenswrapper[4947]: E1203 07:09:55.110938 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f251bdf4-6211-48a9-8578-5acd9fb17c15" 
containerName="glance-httpd" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.110944 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f251bdf4-6211-48a9-8578-5acd9fb17c15" containerName="glance-httpd" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.111117 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f251bdf4-6211-48a9-8578-5acd9fb17c15" containerName="glance-log" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.111140 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f251bdf4-6211-48a9-8578-5acd9fb17c15" containerName="glance-httpd" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.112056 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.120869 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.121517 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.130173 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.209516 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdgcj\" (UniqueName: \"kubernetes.io/projected/bdb45354-43cd-41e7-a511-95357eb656e5-kube-api-access-zdgcj\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.209592 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.209824 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-config-data\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.209911 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-logs\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.209989 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.210009 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-scripts\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.210233 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.210321 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.312141 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.312573 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.312623 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdgcj\" (UniqueName: \"kubernetes.io/projected/bdb45354-43cd-41e7-a511-95357eb656e5-kube-api-access-zdgcj\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.312652 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.312768 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-config-data\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.312840 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-logs\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.312894 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.312925 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-scripts\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.313709 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.312445 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.315835 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-logs\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.321877 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.323100 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-scripts\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.333759 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-config-data\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " 
pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.380387 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdgcj\" (UniqueName: \"kubernetes.io/projected/bdb45354-43cd-41e7-a511-95357eb656e5-kube-api-access-zdgcj\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.381310 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.432967 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.433369 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerName="glance-log" containerID="cri-o://d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649" gracePeriod=30 Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.433915 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerName="glance-httpd" containerID="cri-o://7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5" gracePeriod=30 Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.459820 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: 
\"bdb45354-43cd-41e7-a511-95357eb656e5\") " pod="openstack/glance-default-external-api-0" Dec 03 07:09:55 crc kubenswrapper[4947]: I1203 07:09:55.750182 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.021972 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerStarted","Data":"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6"} Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.028477 4947 generic.go:334] "Generic (PLEG): container finished" podID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerID="d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649" exitCode=143 Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.028569 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ebe993b-68e0-4d68-8aa7-7895d7b6629a","Type":"ContainerDied","Data":"d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649"} Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.412044 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.412074 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.454803 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68pld\" (UniqueName: \"kubernetes.io/projected/941b0633-5ae9-4203-ad2e-8f4644056d2b-kube-api-access-68pld\") pod \"941b0633-5ae9-4203-ad2e-8f4644056d2b\" (UID: \"941b0633-5ae9-4203-ad2e-8f4644056d2b\") " Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.455019 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/941b0633-5ae9-4203-ad2e-8f4644056d2b-operator-scripts\") pod \"941b0633-5ae9-4203-ad2e-8f4644056d2b\" (UID: \"941b0633-5ae9-4203-ad2e-8f4644056d2b\") " Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.456114 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/941b0633-5ae9-4203-ad2e-8f4644056d2b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "941b0633-5ae9-4203-ad2e-8f4644056d2b" (UID: "941b0633-5ae9-4203-ad2e-8f4644056d2b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.460356 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/941b0633-5ae9-4203-ad2e-8f4644056d2b-kube-api-access-68pld" (OuterVolumeSpecName: "kube-api-access-68pld") pod "941b0633-5ae9-4203-ad2e-8f4644056d2b" (UID: "941b0633-5ae9-4203-ad2e-8f4644056d2b"). InnerVolumeSpecName "kube-api-access-68pld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.570768 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68pld\" (UniqueName: \"kubernetes.io/projected/941b0633-5ae9-4203-ad2e-8f4644056d2b-kube-api-access-68pld\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.570807 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/941b0633-5ae9-4203-ad2e-8f4644056d2b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.607504 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.638221 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.781627 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d77a00-5a8b-4ac6-a764-c558f079bd04-operator-scripts\") pod \"72d77a00-5a8b-4ac6-a764-c558f079bd04\" (UID: \"72d77a00-5a8b-4ac6-a764-c558f079bd04\") " Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.781924 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwg2r\" (UniqueName: \"kubernetes.io/projected/72d77a00-5a8b-4ac6-a764-c558f079bd04-kube-api-access-hwg2r\") pod \"72d77a00-5a8b-4ac6-a764-c558f079bd04\" (UID: \"72d77a00-5a8b-4ac6-a764-c558f079bd04\") " Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.786611 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72d77a00-5a8b-4ac6-a764-c558f079bd04-kube-api-access-hwg2r" (OuterVolumeSpecName: "kube-api-access-hwg2r") pod 
"72d77a00-5a8b-4ac6-a764-c558f079bd04" (UID: "72d77a00-5a8b-4ac6-a764-c558f079bd04"). InnerVolumeSpecName "kube-api-access-hwg2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.788108 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72d77a00-5a8b-4ac6-a764-c558f079bd04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "72d77a00-5a8b-4ac6-a764-c558f079bd04" (UID: "72d77a00-5a8b-4ac6-a764-c558f079bd04"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.884133 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72d77a00-5a8b-4ac6-a764-c558f079bd04-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:56 crc kubenswrapper[4947]: I1203 07:09:56.884173 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwg2r\" (UniqueName: \"kubernetes.io/projected/72d77a00-5a8b-4ac6-a764-c558f079bd04-kube-api-access-hwg2r\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.054251 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerStarted","Data":"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59"} Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.055832 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdb45354-43cd-41e7-a511-95357eb656e5","Type":"ContainerStarted","Data":"6e1b04e5cf8ca5937521ad7d5877ba5cd9a40faaa2d794fbaada6378c65d27c3"} Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.058137 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c18d-account-create-update-ss77z" 
event={"ID":"2c45c77c-1306-4434-9b1f-9f6d45522d6a","Type":"ContainerDied","Data":"c259b51ffbcf3a80f7bbdb68581da336b6326187ac61c736c68aa3969c2e4285"} Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.058175 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c259b51ffbcf3a80f7bbdb68581da336b6326187ac61c736c68aa3969c2e4285" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.069376 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b13d-account-create-update-k6qnx" event={"ID":"eba16a83-33ec-44b8-9494-8a5efa7e59e7","Type":"ContainerDied","Data":"c633d07c5473b6b160d0c6d55f9e538d7ae18d76ecf8d3222bae7fb8d861652e"} Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.069431 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c633d07c5473b6b160d0c6d55f9e538d7ae18d76ecf8d3222bae7fb8d861652e" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.078537 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9xgkg" event={"ID":"941b0633-5ae9-4203-ad2e-8f4644056d2b","Type":"ContainerDied","Data":"ac5c6dcbdd3a064f0d4afc512c08c9f6ba5f50e96b15146499ce9de1ae763956"} Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.078583 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac5c6dcbdd3a064f0d4afc512c08c9f6ba5f50e96b15146499ce9de1ae763956" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.078653 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9xgkg" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.086340 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-56ad-account-create-update-d94hj" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.107769 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f251bdf4-6211-48a9-8578-5acd9fb17c15" path="/var/lib/kubelet/pods/f251bdf4-6211-48a9-8578-5acd9fb17c15/volumes" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.108276 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.111848 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-56ad-account-create-update-d94hj" event={"ID":"72d77a00-5a8b-4ac6-a764-c558f079bd04","Type":"ContainerDied","Data":"803475bbef4802a6d3f327a83e481ca23b1e1ca4198cee2091705fdf9393a2bd"} Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.111889 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="803475bbef4802a6d3f327a83e481ca23b1e1ca4198cee2091705fdf9393a2bd" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.111992 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-wnv6s" event={"ID":"e07d2dc5-7d24-4294-8623-7b50b59c2135","Type":"ContainerDied","Data":"655fbeaede3f4d9dcd47f1ea06953b3cd067f3e1f8b773dc3b21490b2f07134d"} Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.112014 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="655fbeaede3f4d9dcd47f1ea06953b3cd067f3e1f8b773dc3b21490b2f07134d" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.114820 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-kgjrr" event={"ID":"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf","Type":"ContainerDied","Data":"5ae4badf977d1d7b1509aeac95d69cc83970647cd9a305278d2c09e477a3e442"} Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.114854 4947 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ae4badf977d1d7b1509aeac95d69cc83970647cd9a305278d2c09e477a3e442" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.124772 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.168562 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.169763 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.204668 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl5zf\" (UniqueName: \"kubernetes.io/projected/e07d2dc5-7d24-4294-8623-7b50b59c2135-kube-api-access-zl5zf\") pod \"e07d2dc5-7d24-4294-8623-7b50b59c2135\" (UID: \"e07d2dc5-7d24-4294-8623-7b50b59c2135\") " Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.204805 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e07d2dc5-7d24-4294-8623-7b50b59c2135-operator-scripts\") pod \"e07d2dc5-7d24-4294-8623-7b50b59c2135\" (UID: \"e07d2dc5-7d24-4294-8623-7b50b59c2135\") " Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.204865 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5b4wk\" (UniqueName: \"kubernetes.io/projected/2c45c77c-1306-4434-9b1f-9f6d45522d6a-kube-api-access-5b4wk\") pod \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\" (UID: \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\") " Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.204889 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/2c45c77c-1306-4434-9b1f-9f6d45522d6a-operator-scripts\") pod \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\" (UID: \"2c45c77c-1306-4434-9b1f-9f6d45522d6a\") " Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.205368 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e07d2dc5-7d24-4294-8623-7b50b59c2135-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e07d2dc5-7d24-4294-8623-7b50b59c2135" (UID: "e07d2dc5-7d24-4294-8623-7b50b59c2135"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.205642 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c45c77c-1306-4434-9b1f-9f6d45522d6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c45c77c-1306-4434-9b1f-9f6d45522d6a" (UID: "2c45c77c-1306-4434-9b1f-9f6d45522d6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.209451 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c45c77c-1306-4434-9b1f-9f6d45522d6a-kube-api-access-5b4wk" (OuterVolumeSpecName: "kube-api-access-5b4wk") pod "2c45c77c-1306-4434-9b1f-9f6d45522d6a" (UID: "2c45c77c-1306-4434-9b1f-9f6d45522d6a"). InnerVolumeSpecName "kube-api-access-5b4wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.211772 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07d2dc5-7d24-4294-8623-7b50b59c2135-kube-api-access-zl5zf" (OuterVolumeSpecName: "kube-api-access-zl5zf") pod "e07d2dc5-7d24-4294-8623-7b50b59c2135" (UID: "e07d2dc5-7d24-4294-8623-7b50b59c2135"). InnerVolumeSpecName "kube-api-access-zl5zf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.306239 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba16a83-33ec-44b8-9494-8a5efa7e59e7-operator-scripts\") pod \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\" (UID: \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\") " Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.306470 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-operator-scripts\") pod \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\" (UID: \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\") " Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.306504 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tmjs\" (UniqueName: \"kubernetes.io/projected/eba16a83-33ec-44b8-9494-8a5efa7e59e7-kube-api-access-2tmjs\") pod \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\" (UID: \"eba16a83-33ec-44b8-9494-8a5efa7e59e7\") " Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.306590 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsqmg\" (UniqueName: \"kubernetes.io/projected/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-kube-api-access-qsqmg\") pod \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\" (UID: \"c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf\") " Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.306908 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf" (UID: "c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.306930 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl5zf\" (UniqueName: \"kubernetes.io/projected/e07d2dc5-7d24-4294-8623-7b50b59c2135-kube-api-access-zl5zf\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.306968 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e07d2dc5-7d24-4294-8623-7b50b59c2135-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.306980 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5b4wk\" (UniqueName: \"kubernetes.io/projected/2c45c77c-1306-4434-9b1f-9f6d45522d6a-kube-api-access-5b4wk\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.306990 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c45c77c-1306-4434-9b1f-9f6d45522d6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.307111 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba16a83-33ec-44b8-9494-8a5efa7e59e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eba16a83-33ec-44b8-9494-8a5efa7e59e7" (UID: "eba16a83-33ec-44b8-9494-8a5efa7e59e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.309659 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba16a83-33ec-44b8-9494-8a5efa7e59e7-kube-api-access-2tmjs" (OuterVolumeSpecName: "kube-api-access-2tmjs") pod "eba16a83-33ec-44b8-9494-8a5efa7e59e7" (UID: "eba16a83-33ec-44b8-9494-8a5efa7e59e7"). 
InnerVolumeSpecName "kube-api-access-2tmjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.311638 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-kube-api-access-qsqmg" (OuterVolumeSpecName: "kube-api-access-qsqmg") pod "c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf" (UID: "c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf"). InnerVolumeSpecName "kube-api-access-qsqmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.408323 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eba16a83-33ec-44b8-9494-8a5efa7e59e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.408356 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.408368 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tmjs\" (UniqueName: \"kubernetes.io/projected/eba16a83-33ec-44b8-9494-8a5efa7e59e7-kube-api-access-2tmjs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:57 crc kubenswrapper[4947]: I1203 07:09:57.408380 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsqmg\" (UniqueName: \"kubernetes.io/projected/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf-kube-api-access-qsqmg\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:58 crc kubenswrapper[4947]: I1203 07:09:58.124684 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdb45354-43cd-41e7-a511-95357eb656e5","Type":"ContainerStarted","Data":"7a42de092f2c4832987606718c90461323cdaca36bc998fcfa88de29b59c873d"} Dec 03 
07:09:58 crc kubenswrapper[4947]: I1203 07:09:58.124728 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdb45354-43cd-41e7-a511-95357eb656e5","Type":"ContainerStarted","Data":"1661b453aa4675cf59e5985d9eef19957b032c863d92cb6f01e34413e202e848"} Dec 03 07:09:58 crc kubenswrapper[4947]: I1203 07:09:58.127256 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerStarted","Data":"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3"} Dec 03 07:09:58 crc kubenswrapper[4947]: I1203 07:09:58.127287 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b13d-account-create-update-k6qnx" Dec 03 07:09:58 crc kubenswrapper[4947]: I1203 07:09:58.127319 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c18d-account-create-update-ss77z" Dec 03 07:09:58 crc kubenswrapper[4947]: I1203 07:09:58.127329 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-wnv6s" Dec 03 07:09:58 crc kubenswrapper[4947]: I1203 07:09:58.127345 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-kgjrr" Dec 03 07:09:58 crc kubenswrapper[4947]: I1203 07:09:58.175116 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.175098557 podStartE2EDuration="3.175098557s" podCreationTimestamp="2025-12-03 07:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:09:58.155811105 +0000 UTC m=+1259.416765531" watchObservedRunningTime="2025-12-03 07:09:58.175098557 +0000 UTC m=+1259.436052983" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.068245 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.140207 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.140301 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-internal-tls-certs\") pod \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.140334 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-httpd-run\") pod \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.140428 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xw9mr\" (UniqueName: \"kubernetes.io/projected/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-kube-api-access-xw9mr\") pod \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.140550 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-logs\") pod \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.140620 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-combined-ca-bundle\") pod \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.140661 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-scripts\") pod \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.140679 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-config-data\") pod \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\" (UID: \"2ebe993b-68e0-4d68-8aa7-7895d7b6629a\") " Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.146285 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-logs" (OuterVolumeSpecName: "logs") pod "2ebe993b-68e0-4d68-8aa7-7895d7b6629a" (UID: "2ebe993b-68e0-4d68-8aa7-7895d7b6629a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.151982 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2ebe993b-68e0-4d68-8aa7-7895d7b6629a" (UID: "2ebe993b-68e0-4d68-8aa7-7895d7b6629a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.152473 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-scripts" (OuterVolumeSpecName: "scripts") pod "2ebe993b-68e0-4d68-8aa7-7895d7b6629a" (UID: "2ebe993b-68e0-4d68-8aa7-7895d7b6629a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.154880 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-kube-api-access-xw9mr" (OuterVolumeSpecName: "kube-api-access-xw9mr") pod "2ebe993b-68e0-4d68-8aa7-7895d7b6629a" (UID: "2ebe993b-68e0-4d68-8aa7-7895d7b6629a"). InnerVolumeSpecName "kube-api-access-xw9mr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.162288 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerStarted","Data":"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02"} Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.162445 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="ceilometer-central-agent" containerID="cri-o://55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6" gracePeriod=30 Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.162646 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.162700 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="proxy-httpd" containerID="cri-o://604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02" gracePeriod=30 Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.162750 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="sg-core" containerID="cri-o://d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3" gracePeriod=30 Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.162782 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="ceilometer-notification-agent" containerID="cri-o://8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59" gracePeriod=30 Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.169687 4947 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "2ebe993b-68e0-4d68-8aa7-7895d7b6629a" (UID: "2ebe993b-68e0-4d68-8aa7-7895d7b6629a"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.182635 4947 generic.go:334] "Generic (PLEG): container finished" podID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerID="7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5" exitCode=0 Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.183133 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.183167 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ebe993b-68e0-4d68-8aa7-7895d7b6629a","Type":"ContainerDied","Data":"7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5"} Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.183205 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2ebe993b-68e0-4d68-8aa7-7895d7b6629a","Type":"ContainerDied","Data":"ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0"} Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.183220 4947 scope.go:117] "RemoveContainer" containerID="7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.203628 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.7875100059999998 podStartE2EDuration="6.203612557s" podCreationTimestamp="2025-12-03 07:09:53 +0000 UTC" firstStartedPulling="2025-12-03 07:09:54.901925614 +0000 UTC m=+1256.162880040" 
lastFinishedPulling="2025-12-03 07:09:58.318028165 +0000 UTC m=+1259.578982591" observedRunningTime="2025-12-03 07:09:59.195629728 +0000 UTC m=+1260.456584154" watchObservedRunningTime="2025-12-03 07:09:59.203612557 +0000 UTC m=+1260.464566983" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.217430 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ebe993b-68e0-4d68-8aa7-7895d7b6629a" (UID: "2ebe993b-68e0-4d68-8aa7-7895d7b6629a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.246241 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.246276 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw9mr\" (UniqueName: \"kubernetes.io/projected/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-kube-api-access-xw9mr\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.246290 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.246300 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.246312 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-scripts\") on node \"crc\" DevicePath 
\"\"" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.246332 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.269705 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-config-data" (OuterVolumeSpecName: "config-data") pod "2ebe993b-68e0-4d68-8aa7-7895d7b6629a" (UID: "2ebe993b-68e0-4d68-8aa7-7895d7b6629a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.281639 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.325096 4947 scope.go:117] "RemoveContainer" containerID="d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.325279 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2ebe993b-68e0-4d68-8aa7-7895d7b6629a" (UID: "2ebe993b-68e0-4d68-8aa7-7895d7b6629a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.354624 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.354663 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ebe993b-68e0-4d68-8aa7-7895d7b6629a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.354675 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.365271 4947 scope.go:117] "RemoveContainer" containerID="7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5" Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.367028 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5\": container with ID starting with 7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5 not found: ID does not exist" containerID="7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.367063 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5"} err="failed to get container status \"7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5\": rpc error: code = NotFound desc = could not find container \"7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5\": container with ID starting with 
7fb52b68381c4325d19d29c3513a46fc49f21c55090c27471729406c824d03a5 not found: ID does not exist" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.367090 4947 scope.go:117] "RemoveContainer" containerID="d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649" Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.367975 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649\": container with ID starting with d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649 not found: ID does not exist" containerID="d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.368012 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649"} err="failed to get container status \"d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649\": rpc error: code = NotFound desc = could not find container \"d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649\": container with ID starting with d807e47790e8fd7c199cf4bffef39b9faa580a74d25e33e730550da6bc0d3649 not found: ID does not exist" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.513397 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.534750 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546049 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.546400 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07d2dc5-7d24-4294-8623-7b50b59c2135" 
containerName="mariadb-database-create" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546417 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07d2dc5-7d24-4294-8623-7b50b59c2135" containerName="mariadb-database-create" Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.546430 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba16a83-33ec-44b8-9494-8a5efa7e59e7" containerName="mariadb-account-create-update" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546436 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba16a83-33ec-44b8-9494-8a5efa7e59e7" containerName="mariadb-account-create-update" Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.546443 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c45c77c-1306-4434-9b1f-9f6d45522d6a" containerName="mariadb-account-create-update" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546449 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c45c77c-1306-4434-9b1f-9f6d45522d6a" containerName="mariadb-account-create-update" Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.546459 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf" containerName="mariadb-database-create" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546465 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf" containerName="mariadb-database-create" Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.546478 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72d77a00-5a8b-4ac6-a764-c558f079bd04" containerName="mariadb-account-create-update" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546483 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="72d77a00-5a8b-4ac6-a764-c558f079bd04" containerName="mariadb-account-create-update" Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.546509 4947 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941b0633-5ae9-4203-ad2e-8f4644056d2b" containerName="mariadb-database-create" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546515 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="941b0633-5ae9-4203-ad2e-8f4644056d2b" containerName="mariadb-database-create" Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.546525 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerName="glance-log" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546531 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerName="glance-log" Dec 03 07:09:59 crc kubenswrapper[4947]: E1203 07:09:59.546546 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerName="glance-httpd" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546551 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerName="glance-httpd" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546724 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c45c77c-1306-4434-9b1f-9f6d45522d6a" containerName="mariadb-account-create-update" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546738 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="941b0633-5ae9-4203-ad2e-8f4644056d2b" containerName="mariadb-database-create" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546748 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerName="glance-httpd" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546756 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" containerName="glance-log" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 
07:09:59.546763 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="eba16a83-33ec-44b8-9494-8a5efa7e59e7" containerName="mariadb-account-create-update" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546772 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="72d77a00-5a8b-4ac6-a764-c558f079bd04" containerName="mariadb-account-create-update" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546784 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e07d2dc5-7d24-4294-8623-7b50b59c2135" containerName="mariadb-database-create" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.546797 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf" containerName="mariadb-database-create" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.547704 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.554292 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.554617 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.562453 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.660171 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.660365 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.660413 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.660457 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.660623 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.660734 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 
07:09:59.660861 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9xrb\" (UniqueName: \"kubernetes.io/projected/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-kube-api-access-j9xrb\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.660972 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.762120 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9xrb\" (UniqueName: \"kubernetes.io/projected/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-kube-api-access-j9xrb\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.762191 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.762236 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 
07:09:59.762289 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.762308 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.762330 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.762356 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.762865 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.762909 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-logs\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.762921 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.763159 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.768770 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.769477 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.769633 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.771432 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.778868 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9xrb\" (UniqueName: \"kubernetes.io/projected/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-kube-api-access-j9xrb\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.820047 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " pod="openstack/glance-default-internal-api-0" Dec 03 07:09:59 crc kubenswrapper[4947]: I1203 07:09:59.886393 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.086602 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.086898 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.094968 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.170213 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-config-data\") pod \"e45e5af1-6468-419e-8400-58df1eb1ec64\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.170260 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfqwb\" (UniqueName: \"kubernetes.io/projected/e45e5af1-6468-419e-8400-58df1eb1ec64-kube-api-access-sfqwb\") pod \"e45e5af1-6468-419e-8400-58df1eb1ec64\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.170999 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-combined-ca-bundle\") pod 
\"e45e5af1-6468-419e-8400-58df1eb1ec64\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.171337 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-scripts\") pod \"e45e5af1-6468-419e-8400-58df1eb1ec64\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.171414 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-sg-core-conf-yaml\") pod \"e45e5af1-6468-419e-8400-58df1eb1ec64\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.171459 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-run-httpd\") pod \"e45e5af1-6468-419e-8400-58df1eb1ec64\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.172637 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e45e5af1-6468-419e-8400-58df1eb1ec64" (UID: "e45e5af1-6468-419e-8400-58df1eb1ec64"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.177603 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-scripts" (OuterVolumeSpecName: "scripts") pod "e45e5af1-6468-419e-8400-58df1eb1ec64" (UID: "e45e5af1-6468-419e-8400-58df1eb1ec64"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.181080 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45e5af1-6468-419e-8400-58df1eb1ec64-kube-api-access-sfqwb" (OuterVolumeSpecName: "kube-api-access-sfqwb") pod "e45e5af1-6468-419e-8400-58df1eb1ec64" (UID: "e45e5af1-6468-419e-8400-58df1eb1ec64"). InnerVolumeSpecName "kube-api-access-sfqwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.181095 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-log-httpd\") pod \"e45e5af1-6468-419e-8400-58df1eb1ec64\" (UID: \"e45e5af1-6468-419e-8400-58df1eb1ec64\") " Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.181534 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e45e5af1-6468-419e-8400-58df1eb1ec64" (UID: "e45e5af1-6468-419e-8400-58df1eb1ec64"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.181919 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfqwb\" (UniqueName: \"kubernetes.io/projected/e45e5af1-6468-419e-8400-58df1eb1ec64-kube-api-access-sfqwb\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.181933 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.181943 4947 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.181952 4947 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e45e5af1-6468-419e-8400-58df1eb1ec64-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198310 4947 generic.go:334] "Generic (PLEG): container finished" podID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerID="604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02" exitCode=0 Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198346 4947 generic.go:334] "Generic (PLEG): container finished" podID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerID="d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3" exitCode=2 Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198359 4947 generic.go:334] "Generic (PLEG): container finished" podID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerID="8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59" exitCode=0 Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198369 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerID="55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6" exitCode=0 Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198409 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerDied","Data":"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02"} Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198438 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerDied","Data":"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3"} Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198450 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerDied","Data":"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59"} Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198460 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerDied","Data":"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6"} Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198471 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e45e5af1-6468-419e-8400-58df1eb1ec64","Type":"ContainerDied","Data":"4e05fbe2fdaf5e4d65c8805a90fa321a8e90eca68912782dd893fbce3606c76d"} Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198507 4947 scope.go:117] "RemoveContainer" containerID="604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.198665 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.220579 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e45e5af1-6468-419e-8400-58df1eb1ec64" (UID: "e45e5af1-6468-419e-8400-58df1eb1ec64"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.233310 4947 scope.go:117] "RemoveContainer" containerID="d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.244949 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e45e5af1-6468-419e-8400-58df1eb1ec64" (UID: "e45e5af1-6468-419e-8400-58df1eb1ec64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.265089 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-config-data" (OuterVolumeSpecName: "config-data") pod "e45e5af1-6468-419e-8400-58df1eb1ec64" (UID: "e45e5af1-6468-419e-8400-58df1eb1ec64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.277853 4947 scope.go:117] "RemoveContainer" containerID="8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.284555 4947 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.284583 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.284594 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45e5af1-6468-419e-8400-58df1eb1ec64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.338078 4947 scope.go:117] "RemoveContainer" containerID="55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.378589 4947 scope.go:117] "RemoveContainer" containerID="604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02" Dec 03 07:10:00 crc kubenswrapper[4947]: E1203 07:10:00.380922 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02\": container with ID starting with 604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02 not found: ID does not exist" containerID="604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.380976 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02"} err="failed to get container status \"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02\": rpc error: code = NotFound desc = could not find container \"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02\": container with ID starting with 604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.381047 4947 scope.go:117] "RemoveContainer" containerID="d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3" Dec 03 07:10:00 crc kubenswrapper[4947]: E1203 07:10:00.381371 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3\": container with ID starting with d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3 not found: ID does not exist" containerID="d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.381403 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3"} err="failed to get container status \"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3\": rpc error: code = NotFound desc = could not find container \"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3\": container with ID starting with d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.381422 4947 scope.go:117] "RemoveContainer" containerID="8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59" Dec 03 07:10:00 crc kubenswrapper[4947]: E1203 07:10:00.381679 4947 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59\": container with ID starting with 8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59 not found: ID does not exist" containerID="8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.381703 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59"} err="failed to get container status \"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59\": rpc error: code = NotFound desc = could not find container \"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59\": container with ID starting with 8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.381721 4947 scope.go:117] "RemoveContainer" containerID="55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6" Dec 03 07:10:00 crc kubenswrapper[4947]: E1203 07:10:00.381963 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6\": container with ID starting with 55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6 not found: ID does not exist" containerID="55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.381987 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6"} err="failed to get container status \"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6\": rpc error: code = NotFound desc = could not find container 
\"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6\": container with ID starting with 55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.382004 4947 scope.go:117] "RemoveContainer" containerID="604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.382218 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02"} err="failed to get container status \"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02\": rpc error: code = NotFound desc = could not find container \"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02\": container with ID starting with 604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.382241 4947 scope.go:117] "RemoveContainer" containerID="d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.382460 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3"} err="failed to get container status \"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3\": rpc error: code = NotFound desc = could not find container \"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3\": container with ID starting with d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.382484 4947 scope.go:117] "RemoveContainer" containerID="8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.382904 4947 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59"} err="failed to get container status \"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59\": rpc error: code = NotFound desc = could not find container \"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59\": container with ID starting with 8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.382925 4947 scope.go:117] "RemoveContainer" containerID="55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.383157 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6"} err="failed to get container status \"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6\": rpc error: code = NotFound desc = could not find container \"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6\": container with ID starting with 55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.383184 4947 scope.go:117] "RemoveContainer" containerID="604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.383453 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02"} err="failed to get container status \"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02\": rpc error: code = NotFound desc = could not find container \"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02\": container with ID starting with 
604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.383472 4947 scope.go:117] "RemoveContainer" containerID="d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.383924 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3"} err="failed to get container status \"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3\": rpc error: code = NotFound desc = could not find container \"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3\": container with ID starting with d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.383954 4947 scope.go:117] "RemoveContainer" containerID="8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.384164 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59"} err="failed to get container status \"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59\": rpc error: code = NotFound desc = could not find container \"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59\": container with ID starting with 8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.384184 4947 scope.go:117] "RemoveContainer" containerID="55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.384395 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6"} err="failed to get container status \"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6\": rpc error: code = NotFound desc = could not find container \"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6\": container with ID starting with 55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.384421 4947 scope.go:117] "RemoveContainer" containerID="604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.384678 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02"} err="failed to get container status \"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02\": rpc error: code = NotFound desc = could not find container \"604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02\": container with ID starting with 604b66b39b9e5c279f0e9e8db7acb2383e35041ab9c541a2539c3ff731c2ac02 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.384700 4947 scope.go:117] "RemoveContainer" containerID="d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.384934 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3"} err="failed to get container status \"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3\": rpc error: code = NotFound desc = could not find container \"d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3\": container with ID starting with d72785673805b9ad433c95d777369cccd38991a79203ef9564e0a86e0a763be3 not found: ID does not 
exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.384958 4947 scope.go:117] "RemoveContainer" containerID="8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.385213 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59"} err="failed to get container status \"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59\": rpc error: code = NotFound desc = could not find container \"8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59\": container with ID starting with 8c2ec6f79300bc61fdba8eb9cacd373d5a1eaa987c9e76ac7bad728a6cdb4c59 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.385237 4947 scope.go:117] "RemoveContainer" containerID="55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.385815 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6"} err="failed to get container status \"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6\": rpc error: code = NotFound desc = could not find container \"55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6\": container with ID starting with 55c365e0e5e175ca98afe37a2742127ab5058987685ae3a59b17aa0f066cacc6 not found: ID does not exist" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.526107 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.547477 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.553899 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.576793 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:00 crc kubenswrapper[4947]: E1203 07:10:00.577158 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="sg-core" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.577175 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="sg-core" Dec 03 07:10:00 crc kubenswrapper[4947]: E1203 07:10:00.577199 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="ceilometer-notification-agent" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.577208 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="ceilometer-notification-agent" Dec 03 07:10:00 crc kubenswrapper[4947]: E1203 07:10:00.577233 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="proxy-httpd" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.577240 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="proxy-httpd" Dec 03 07:10:00 crc kubenswrapper[4947]: E1203 07:10:00.577252 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="ceilometer-central-agent" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.577258 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="ceilometer-central-agent" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.577406 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="sg-core" Dec 03 07:10:00 
crc kubenswrapper[4947]: I1203 07:10:00.577423 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="ceilometer-notification-agent" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.577439 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="proxy-httpd" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.577452 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" containerName="ceilometer-central-agent" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.579162 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.590102 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.591445 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.601678 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.692007 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-config-data\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.692054 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-scripts\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 
crc kubenswrapper[4947]: I1203 07:10:00.692108 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-run-httpd\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.692162 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-log-httpd\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.692187 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.692220 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.692253 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxh9n\" (UniqueName: \"kubernetes.io/projected/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-kube-api-access-pxh9n\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.794036 4947 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-pxh9n\" (UniqueName: \"kubernetes.io/projected/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-kube-api-access-pxh9n\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.794105 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-config-data\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.794132 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-scripts\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.794184 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-run-httpd\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.794229 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-log-httpd\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.794254 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 
crc kubenswrapper[4947]: I1203 07:10:00.794288 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.795607 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-run-httpd\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.796034 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-log-httpd\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.799312 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-scripts\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.799484 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-config-data\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.802571 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.813834 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.814083 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxh9n\" (UniqueName: \"kubernetes.io/projected/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-kube-api-access-pxh9n\") pod \"ceilometer-0\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " pod="openstack/ceilometer-0" Dec 03 07:10:00 crc kubenswrapper[4947]: I1203 07:10:00.913377 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.103052 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ebe993b-68e0-4d68-8aa7-7895d7b6629a" path="/var/lib/kubelet/pods/2ebe993b-68e0-4d68-8aa7-7895d7b6629a/volumes" Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.110166 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45e5af1-6468-419e-8400-58df1eb1ec64" path="/var/lib/kubelet/pods/e45e5af1-6468-419e-8400-58df1eb1ec64/volumes" Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.211887 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4","Type":"ContainerStarted","Data":"a1759e709aa0048498574524e67b384d6d64c42bed6642269b86d9285c67cdf0"} Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.211933 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4","Type":"ContainerStarted","Data":"3eebaeb6a06911e5c12aced0d174869aeed0c91e99a4115f55a2036d926368db"} Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.257439 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.399833 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:01 crc kubenswrapper[4947]: W1203 07:10:01.415624 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bc208a9_8f54_4ccb_b2a5_bca6d026e8b6.slice/crio-999efc5ea9cec925ef11f841804af040abab0bfeecd2d27bfd807249836052f1 WatchSource:0}: Error finding container 999efc5ea9cec925ef11f841804af040abab0bfeecd2d27bfd807249836052f1: Status 404 returned error can't find the container with id 999efc5ea9cec925ef11f841804af040abab0bfeecd2d27bfd807249836052f1 Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.973516 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tjs6k"] Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.975208 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.978952 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.979943 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5nj62" Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.981016 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 07:10:01 crc kubenswrapper[4947]: I1203 07:10:01.981837 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tjs6k"] Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.023701 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-config-data\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.023741 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.023773 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-scripts\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " 
pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.023922 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pdnx\" (UniqueName: \"kubernetes.io/projected/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-kube-api-access-7pdnx\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.125638 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-config-data\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.125699 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.125751 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-scripts\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.125817 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pdnx\" (UniqueName: \"kubernetes.io/projected/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-kube-api-access-7pdnx\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: 
\"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.131793 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-config-data\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.137233 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-scripts\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.137815 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.153088 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pdnx\" (UniqueName: \"kubernetes.io/projected/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-kube-api-access-7pdnx\") pod \"nova-cell0-conductor-db-sync-tjs6k\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.223891 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerStarted","Data":"b262485eacc650b1fb6bdc1401c55f8f8c9d2ec668252ad4f261c834e453186f"} Dec 03 07:10:02 crc kubenswrapper[4947]: 
I1203 07:10:02.224801 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerStarted","Data":"999efc5ea9cec925ef11f841804af040abab0bfeecd2d27bfd807249836052f1"} Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.226698 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4","Type":"ContainerStarted","Data":"64b3099ce4ae64f657b3ba6a9d2a21ac15dc33c2c3dddccc593d4d83045820bf"} Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.262718 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.262693961 podStartE2EDuration="3.262693961s" podCreationTimestamp="2025-12-03 07:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:10:02.25433222 +0000 UTC m=+1263.515286666" watchObservedRunningTime="2025-12-03 07:10:02.262693961 +0000 UTC m=+1263.523648397" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.349027 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.416666 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.422113 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:10:02 crc kubenswrapper[4947]: W1203 07:10:02.857353 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd069b8e8_8592_4fa1_a5e7_74d91e239c0a.slice/crio-a631ff823a7f6b005158afd8d58f9a3dbde6ac7604a071e95d4122e15fed28e0 WatchSource:0}: Error finding container a631ff823a7f6b005158afd8d58f9a3dbde6ac7604a071e95d4122e15fed28e0: Status 404 returned error can't find the container with id a631ff823a7f6b005158afd8d58f9a3dbde6ac7604a071e95d4122e15fed28e0 Dec 03 07:10:02 crc kubenswrapper[4947]: I1203 07:10:02.860450 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tjs6k"] Dec 03 07:10:03 crc kubenswrapper[4947]: I1203 07:10:03.273673 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerStarted","Data":"58a8a2b73a7116f632d661d6064751d991800f47ad2b5f297d0240e4c907a011"} Dec 03 07:10:03 crc kubenswrapper[4947]: I1203 07:10:03.275040 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" event={"ID":"d069b8e8-8592-4fa1-a5e7-74d91e239c0a","Type":"ContainerStarted","Data":"a631ff823a7f6b005158afd8d58f9a3dbde6ac7604a071e95d4122e15fed28e0"} Dec 03 07:10:04 crc kubenswrapper[4947]: I1203 07:10:04.289369 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerStarted","Data":"5beca4066aa0a5a4f4f7a61bd052c232b10e31c02e134b01598a90bd63ddf8b5"} Dec 03 07:10:05 crc kubenswrapper[4947]: I1203 07:10:05.750323 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 07:10:05 crc kubenswrapper[4947]: I1203 07:10:05.750901 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 07:10:05 crc kubenswrapper[4947]: I1203 07:10:05.801873 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 07:10:05 crc kubenswrapper[4947]: I1203 07:10:05.809997 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 07:10:06 crc kubenswrapper[4947]: I1203 07:10:06.308302 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerStarted","Data":"2463befbb378d2b6a668effba45dcc6093c5831c1c7767d2ef21938a986b0287"} Dec 03 07:10:06 crc kubenswrapper[4947]: I1203 07:10:06.308620 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="ceilometer-central-agent" containerID="cri-o://b262485eacc650b1fb6bdc1401c55f8f8c9d2ec668252ad4f261c834e453186f" gracePeriod=30 Dec 03 07:10:06 crc kubenswrapper[4947]: I1203 07:10:06.308712 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="ceilometer-notification-agent" containerID="cri-o://58a8a2b73a7116f632d661d6064751d991800f47ad2b5f297d0240e4c907a011" gracePeriod=30 Dec 03 07:10:06 crc kubenswrapper[4947]: I1203 07:10:06.308697 4947 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="sg-core" containerID="cri-o://5beca4066aa0a5a4f4f7a61bd052c232b10e31c02e134b01598a90bd63ddf8b5" gracePeriod=30 Dec 03 07:10:06 crc kubenswrapper[4947]: I1203 07:10:06.308863 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 07:10:06 crc kubenswrapper[4947]: I1203 07:10:06.308892 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 07:10:06 crc kubenswrapper[4947]: I1203 07:10:06.309027 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="proxy-httpd" containerID="cri-o://2463befbb378d2b6a668effba45dcc6093c5831c1c7767d2ef21938a986b0287" gracePeriod=30 Dec 03 07:10:06 crc kubenswrapper[4947]: I1203 07:10:06.344226 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.447459861 podStartE2EDuration="6.344207357s" podCreationTimestamp="2025-12-03 07:10:00 +0000 UTC" firstStartedPulling="2025-12-03 07:10:01.418408596 +0000 UTC m=+1262.679363022" lastFinishedPulling="2025-12-03 07:10:05.315156102 +0000 UTC m=+1266.576110518" observedRunningTime="2025-12-03 07:10:06.335682903 +0000 UTC m=+1267.596637389" watchObservedRunningTime="2025-12-03 07:10:06.344207357 +0000 UTC m=+1267.605161783" Dec 03 07:10:07 crc kubenswrapper[4947]: I1203 07:10:07.325193 4947 generic.go:334] "Generic (PLEG): container finished" podID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerID="2463befbb378d2b6a668effba45dcc6093c5831c1c7767d2ef21938a986b0287" exitCode=0 Dec 03 07:10:07 crc kubenswrapper[4947]: I1203 07:10:07.326579 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerID="5beca4066aa0a5a4f4f7a61bd052c232b10e31c02e134b01598a90bd63ddf8b5" exitCode=2 Dec 03 07:10:07 crc kubenswrapper[4947]: I1203 07:10:07.326843 4947 generic.go:334] "Generic (PLEG): container finished" podID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerID="58a8a2b73a7116f632d661d6064751d991800f47ad2b5f297d0240e4c907a011" exitCode=0 Dec 03 07:10:07 crc kubenswrapper[4947]: I1203 07:10:07.325368 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerDied","Data":"2463befbb378d2b6a668effba45dcc6093c5831c1c7767d2ef21938a986b0287"} Dec 03 07:10:07 crc kubenswrapper[4947]: I1203 07:10:07.327000 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerDied","Data":"5beca4066aa0a5a4f4f7a61bd052c232b10e31c02e134b01598a90bd63ddf8b5"} Dec 03 07:10:07 crc kubenswrapper[4947]: I1203 07:10:07.327033 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerDied","Data":"58a8a2b73a7116f632d661d6064751d991800f47ad2b5f297d0240e4c907a011"} Dec 03 07:10:08 crc kubenswrapper[4947]: I1203 07:10:08.178667 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 07:10:08 crc kubenswrapper[4947]: I1203 07:10:08.187107 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 07:10:09 crc kubenswrapper[4947]: E1203 07:10:09.323326 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ebe993b_68e0_4d68_8aa7_7895d7b6629a.slice/crio-ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0\": RecentStats: unable to find data in memory cache]" Dec 03 07:10:09 crc kubenswrapper[4947]: I1203 07:10:09.887470 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 07:10:09 crc kubenswrapper[4947]: I1203 07:10:09.887830 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 07:10:09 crc kubenswrapper[4947]: I1203 07:10:09.926316 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 07:10:09 crc kubenswrapper[4947]: I1203 07:10:09.931274 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 07:10:10 crc kubenswrapper[4947]: I1203 07:10:10.360470 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 07:10:10 crc kubenswrapper[4947]: I1203 07:10:10.360586 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 07:10:12 crc kubenswrapper[4947]: I1203 07:10:12.615070 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 07:10:12 crc kubenswrapper[4947]: I1203 07:10:12.615166 4947 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 07:10:12 crc kubenswrapper[4947]: I1203 07:10:12.653262 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 07:10:14 crc kubenswrapper[4947]: I1203 07:10:14.410068 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerID="b262485eacc650b1fb6bdc1401c55f8f8c9d2ec668252ad4f261c834e453186f" exitCode=0 Dec 03 07:10:14 crc kubenswrapper[4947]: I1203 07:10:14.410149 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerDied","Data":"b262485eacc650b1fb6bdc1401c55f8f8c9d2ec668252ad4f261c834e453186f"} Dec 03 07:10:16 crc kubenswrapper[4947]: E1203 07:10:16.008558 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:a96d336d231eee461559cfe82b025874ce2b8652520297bc5143559694ebac58" Dec 03 07:10:16 crc kubenswrapper[4947]: E1203 07:10:16.008759 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:a96d336d231eee461559cfe82b025874ce2b8652520297bc5143559694ebac58,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7pdnx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-tjs6k_openstack(d069b8e8-8592-4fa1-a5e7-74d91e239c0a): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 07:10:16 crc kubenswrapper[4947]: E1203 07:10:16.010122 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" podUID="d069b8e8-8592-4fa1-a5e7-74d91e239c0a" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.365009 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.435468 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.435457 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6","Type":"ContainerDied","Data":"999efc5ea9cec925ef11f841804af040abab0bfeecd2d27bfd807249836052f1"} Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.435657 4947 scope.go:117] "RemoveContainer" containerID="2463befbb378d2b6a668effba45dcc6093c5831c1c7767d2ef21938a986b0287" Dec 03 07:10:16 crc kubenswrapper[4947]: E1203 07:10:16.437088 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor@sha256:a96d336d231eee461559cfe82b025874ce2b8652520297bc5143559694ebac58\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" podUID="d069b8e8-8592-4fa1-a5e7-74d91e239c0a" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.465969 4947 scope.go:117] "RemoveContainer" containerID="5beca4066aa0a5a4f4f7a61bd052c232b10e31c02e134b01598a90bd63ddf8b5" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.474462 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxh9n\" (UniqueName: \"kubernetes.io/projected/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-kube-api-access-pxh9n\") pod \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.474578 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-combined-ca-bundle\") pod \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.474616 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-log-httpd\") pod \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.474700 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-config-data\") pod \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.474812 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-sg-core-conf-yaml\") pod \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.474890 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-run-httpd\") pod \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\" 
(UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.474934 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-scripts\") pod \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\" (UID: \"5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6\") " Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.476174 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" (UID: "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.476690 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" (UID: "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.482369 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-kube-api-access-pxh9n" (OuterVolumeSpecName: "kube-api-access-pxh9n") pod "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" (UID: "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6"). InnerVolumeSpecName "kube-api-access-pxh9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.489470 4947 scope.go:117] "RemoveContainer" containerID="58a8a2b73a7116f632d661d6064751d991800f47ad2b5f297d0240e4c907a011" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.499859 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-scripts" (OuterVolumeSpecName: "scripts") pod "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" (UID: "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.506747 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" (UID: "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.556866 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" (UID: "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.577813 4947 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.577848 4947 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.577857 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.577876 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxh9n\" (UniqueName: \"kubernetes.io/projected/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-kube-api-access-pxh9n\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.577887 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.577895 4947 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.580194 4947 scope.go:117] "RemoveContainer" containerID="b262485eacc650b1fb6bdc1401c55f8f8c9d2ec668252ad4f261c834e453186f" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.587142 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-config-data" (OuterVolumeSpecName: "config-data") pod "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" (UID: "5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.681406 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.782828 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.799700 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.830874 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:16 crc kubenswrapper[4947]: E1203 07:10:16.831311 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="ceilometer-central-agent" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.831335 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="ceilometer-central-agent" Dec 03 07:10:16 crc kubenswrapper[4947]: E1203 07:10:16.831356 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="ceilometer-notification-agent" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.831365 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="ceilometer-notification-agent" Dec 03 07:10:16 crc kubenswrapper[4947]: E1203 07:10:16.831381 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" 
containerName="sg-core" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.831390 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="sg-core" Dec 03 07:10:16 crc kubenswrapper[4947]: E1203 07:10:16.831430 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="proxy-httpd" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.831437 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="proxy-httpd" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.831650 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="ceilometer-central-agent" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.831676 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="proxy-httpd" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.831700 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="sg-core" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.831713 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" containerName="ceilometer-notification-agent" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.833608 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.836059 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.837330 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.842606 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.985519 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.985894 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-log-httpd\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.986110 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-config-data\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.986179 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txg6d\" (UniqueName: \"kubernetes.io/projected/86094256-7d4a-4efa-a139-6307919b49df-kube-api-access-txg6d\") pod \"ceilometer-0\" (UID: 
\"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.986270 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-run-httpd\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.986302 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-scripts\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:16 crc kubenswrapper[4947]: I1203 07:10:16.986437 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.087707 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-config-data\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.087761 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txg6d\" (UniqueName: \"kubernetes.io/projected/86094256-7d4a-4efa-a139-6307919b49df-kube-api-access-txg6d\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.087795 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-run-httpd\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.087811 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-scripts\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.087843 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.087924 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.087947 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-log-httpd\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.088314 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-run-httpd\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " 
pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.088388 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-log-httpd\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.092907 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.093166 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.094077 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6" path="/var/lib/kubelet/pods/5bc208a9-8f54-4ccb-b2a5-bca6d026e8b6/volumes" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.097619 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-scripts\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.097860 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-config-data\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 
03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.113799 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txg6d\" (UniqueName: \"kubernetes.io/projected/86094256-7d4a-4efa-a139-6307919b49df-kube-api-access-txg6d\") pod \"ceilometer-0\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.155306 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:10:17 crc kubenswrapper[4947]: W1203 07:10:17.641327 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86094256_7d4a_4efa_a139_6307919b49df.slice/crio-746f459723ac30640b2873c5dce961b30b6397f1e9a9875ec09333b37ca951ca WatchSource:0}: Error finding container 746f459723ac30640b2873c5dce961b30b6397f1e9a9875ec09333b37ca951ca: Status 404 returned error can't find the container with id 746f459723ac30640b2873c5dce961b30b6397f1e9a9875ec09333b37ca951ca Dec 03 07:10:17 crc kubenswrapper[4947]: I1203 07:10:17.646117 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:18 crc kubenswrapper[4947]: I1203 07:10:18.458705 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerStarted","Data":"4dd21e18ea9b766d173cac4a7cca8ab19e1f895a693d2aeb8930b570ca37b679"} Dec 03 07:10:18 crc kubenswrapper[4947]: I1203 07:10:18.459020 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerStarted","Data":"746f459723ac30640b2873c5dce961b30b6397f1e9a9875ec09333b37ca951ca"} Dec 03 07:10:19 crc kubenswrapper[4947]: I1203 07:10:19.470437 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerStarted","Data":"1a5ff4a6cb09f9a251e9389d617500670f0e52741fc8c1ae750ba2e5427dcd2c"} Dec 03 07:10:19 crc kubenswrapper[4947]: E1203 07:10:19.621683 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ebe993b_68e0_4d68_8aa7_7895d7b6629a.slice/crio-ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0\": RecentStats: unable to find data in memory cache]" Dec 03 07:10:20 crc kubenswrapper[4947]: I1203 07:10:20.485907 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerStarted","Data":"fd3a7ddd942b0bf7c993d8ec4b13420f86e53176d46ec11df8e6fb68de4c06e1"} Dec 03 07:10:22 crc kubenswrapper[4947]: I1203 07:10:22.512770 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerStarted","Data":"f7bda5b91e9919c7430bc8360e444d25515cab851b10eefd33d0b2c98ed8f764"} Dec 03 07:10:22 crc kubenswrapper[4947]: I1203 07:10:22.513363 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:10:22 crc kubenswrapper[4947]: I1203 07:10:22.552539 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.821963474 podStartE2EDuration="6.552517499s" podCreationTimestamp="2025-12-03 07:10:16 +0000 UTC" firstStartedPulling="2025-12-03 07:10:17.645392744 +0000 UTC m=+1278.906347170" lastFinishedPulling="2025-12-03 07:10:21.375946759 +0000 UTC m=+1282.636901195" observedRunningTime="2025-12-03 07:10:22.545877907 +0000 UTC m=+1283.806832343" watchObservedRunningTime="2025-12-03 07:10:22.552517499 +0000 UTC m=+1283.813471945" Dec 03 07:10:29 crc kubenswrapper[4947]: I1203 07:10:29.574000 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" event={"ID":"d069b8e8-8592-4fa1-a5e7-74d91e239c0a","Type":"ContainerStarted","Data":"755fd0d95a886192addb01b57efa724676305c9c137608f6fd6ccd18807ef9cb"} Dec 03 07:10:29 crc kubenswrapper[4947]: I1203 07:10:29.589150 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" podStartSLOduration=2.971722463 podStartE2EDuration="28.589127684s" podCreationTimestamp="2025-12-03 07:10:01 +0000 UTC" firstStartedPulling="2025-12-03 07:10:02.859878017 +0000 UTC m=+1264.120832443" lastFinishedPulling="2025-12-03 07:10:28.477283238 +0000 UTC m=+1289.738237664" observedRunningTime="2025-12-03 07:10:29.588722694 +0000 UTC m=+1290.849677120" watchObservedRunningTime="2025-12-03 07:10:29.589127684 +0000 UTC m=+1290.850082131" Dec 03 07:10:29 crc kubenswrapper[4947]: E1203 07:10:29.890586 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ebe993b_68e0_4d68_8aa7_7895d7b6629a.slice/crio-ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0\": RecentStats: unable to find data in memory cache]" Dec 03 07:10:30 crc kubenswrapper[4947]: I1203 07:10:30.086862 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:10:30 crc kubenswrapper[4947]: I1203 07:10:30.086935 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 03 07:10:38 crc kubenswrapper[4947]: I1203 07:10:38.668704 4947 generic.go:334] "Generic (PLEG): container finished" podID="d069b8e8-8592-4fa1-a5e7-74d91e239c0a" containerID="755fd0d95a886192addb01b57efa724676305c9c137608f6fd6ccd18807ef9cb" exitCode=0 Dec 03 07:10:38 crc kubenswrapper[4947]: I1203 07:10:38.668788 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" event={"ID":"d069b8e8-8592-4fa1-a5e7-74d91e239c0a","Type":"ContainerDied","Data":"755fd0d95a886192addb01b57efa724676305c9c137608f6fd6ccd18807ef9cb"} Dec 03 07:10:39 crc kubenswrapper[4947]: I1203 07:10:39.981572 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.071467 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-config-data\") pod \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.071962 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-scripts\") pod \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.072007 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-combined-ca-bundle\") pod \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.072152 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-7pdnx\" (UniqueName: \"kubernetes.io/projected/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-kube-api-access-7pdnx\") pod \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\" (UID: \"d069b8e8-8592-4fa1-a5e7-74d91e239c0a\") " Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.077326 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-scripts" (OuterVolumeSpecName: "scripts") pod "d069b8e8-8592-4fa1-a5e7-74d91e239c0a" (UID: "d069b8e8-8592-4fa1-a5e7-74d91e239c0a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.094169 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-kube-api-access-7pdnx" (OuterVolumeSpecName: "kube-api-access-7pdnx") pod "d069b8e8-8592-4fa1-a5e7-74d91e239c0a" (UID: "d069b8e8-8592-4fa1-a5e7-74d91e239c0a"). InnerVolumeSpecName "kube-api-access-7pdnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.104957 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d069b8e8-8592-4fa1-a5e7-74d91e239c0a" (UID: "d069b8e8-8592-4fa1-a5e7-74d91e239c0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.105358 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-config-data" (OuterVolumeSpecName: "config-data") pod "d069b8e8-8592-4fa1-a5e7-74d91e239c0a" (UID: "d069b8e8-8592-4fa1-a5e7-74d91e239c0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:40 crc kubenswrapper[4947]: E1203 07:10:40.119616 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ebe993b_68e0_4d68_8aa7_7895d7b6629a.slice/crio-ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0\": RecentStats: unable to find data in memory cache]" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.175227 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.175272 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.175283 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.175297 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pdnx\" (UniqueName: \"kubernetes.io/projected/d069b8e8-8592-4fa1-a5e7-74d91e239c0a-kube-api-access-7pdnx\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.694586 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" event={"ID":"d069b8e8-8592-4fa1-a5e7-74d91e239c0a","Type":"ContainerDied","Data":"a631ff823a7f6b005158afd8d58f9a3dbde6ac7604a071e95d4122e15fed28e0"} Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.694657 4947 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a631ff823a7f6b005158afd8d58f9a3dbde6ac7604a071e95d4122e15fed28e0" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.694670 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tjs6k" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.803104 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 07:10:40 crc kubenswrapper[4947]: E1203 07:10:40.803647 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d069b8e8-8592-4fa1-a5e7-74d91e239c0a" containerName="nova-cell0-conductor-db-sync" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.803676 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d069b8e8-8592-4fa1-a5e7-74d91e239c0a" containerName="nova-cell0-conductor-db-sync" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.803993 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d069b8e8-8592-4fa1-a5e7-74d91e239c0a" containerName="nova-cell0-conductor-db-sync" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.804971 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.807932 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5nj62" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.808187 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.816212 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.887118 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.887477 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxjvw\" (UniqueName: \"kubernetes.io/projected/9f0a57ca-d063-4d27-ac54-f5431cca2971-kube-api-access-dxjvw\") pod \"nova-cell0-conductor-0\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.887603 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.989367 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.989587 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.989621 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxjvw\" (UniqueName: \"kubernetes.io/projected/9f0a57ca-d063-4d27-ac54-f5431cca2971-kube-api-access-dxjvw\") pod \"nova-cell0-conductor-0\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:40 crc kubenswrapper[4947]: I1203 07:10:40.995692 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:41 crc kubenswrapper[4947]: I1203 07:10:41.007064 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:41 crc kubenswrapper[4947]: I1203 07:10:41.012297 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxjvw\" (UniqueName: \"kubernetes.io/projected/9f0a57ca-d063-4d27-ac54-f5431cca2971-kube-api-access-dxjvw\") pod \"nova-cell0-conductor-0\" 
(UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:41 crc kubenswrapper[4947]: I1203 07:10:41.146301 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:41 crc kubenswrapper[4947]: I1203 07:10:41.610486 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 07:10:41 crc kubenswrapper[4947]: W1203 07:10:41.611130 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f0a57ca_d063_4d27_ac54_f5431cca2971.slice/crio-4d37839e18ca3ca0588ed76fb7e8068a744a42e03b230a1c507cefdfdc97801a WatchSource:0}: Error finding container 4d37839e18ca3ca0588ed76fb7e8068a744a42e03b230a1c507cefdfdc97801a: Status 404 returned error can't find the container with id 4d37839e18ca3ca0588ed76fb7e8068a744a42e03b230a1c507cefdfdc97801a Dec 03 07:10:41 crc kubenswrapper[4947]: I1203 07:10:41.708563 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9f0a57ca-d063-4d27-ac54-f5431cca2971","Type":"ContainerStarted","Data":"4d37839e18ca3ca0588ed76fb7e8068a744a42e03b230a1c507cefdfdc97801a"} Dec 03 07:10:42 crc kubenswrapper[4947]: I1203 07:10:42.720483 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9f0a57ca-d063-4d27-ac54-f5431cca2971","Type":"ContainerStarted","Data":"334c2441513d95f243b52e76544d548f66b710f9a5c7ddb3358d9df97ecd5f57"} Dec 03 07:10:42 crc kubenswrapper[4947]: I1203 07:10:42.720747 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:42 crc kubenswrapper[4947]: I1203 07:10:42.752974 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.752952076 podStartE2EDuration="2.752952076s" 
podCreationTimestamp="2025-12-03 07:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:10:42.748633607 +0000 UTC m=+1304.009588043" watchObservedRunningTime="2025-12-03 07:10:42.752952076 +0000 UTC m=+1304.013906512" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.177640 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.666163 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-b6497"] Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.667433 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b6497" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.673509 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.681558 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-b6497"] Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.720953 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.794677 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.798436 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.800296 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.824309 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8trd\" (UniqueName: \"kubernetes.io/projected/73c410b2-0cdc-45f3-b06d-c67fd543e76c-kube-api-access-b8trd\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.824819 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-scripts\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.826551 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-config-data\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.826620 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497" Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.853454 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 
03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.898358 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.899832 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.902203 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.915168 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.929042 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-scripts\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.929089 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-config-data\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.929132 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.929163 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.929186 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-config-data\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.929290 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6079f692-0780-41f6-a551-399974561bd3-logs\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.929320 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsvxf\" (UniqueName: \"kubernetes.io/projected/6079f692-0780-41f6-a551-399974561bd3-kube-api-access-gsvxf\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.929348 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8trd\" (UniqueName: \"kubernetes.io/projected/73c410b2-0cdc-45f3-b06d-c67fd543e76c-kube-api-access-b8trd\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.951332 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-scripts\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.951393 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.952513 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.954847 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.955140 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-config-data\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.965156 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8trd\" (UniqueName: \"kubernetes.io/projected/73c410b2-0cdc-45f3-b06d-c67fd543e76c-kube-api-access-b8trd\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.976190 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-b6497\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " pod="openstack/nova-cell0-cell-mapping-b6497"
Dec 03 07:10:46 crc kubenswrapper[4947]: I1203 07:10:46.994955 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.023710 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.024976 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.028233 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.030621 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.030723 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6079f692-0780-41f6-a551-399974561bd3-logs\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.030771 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsvxf\" (UniqueName: \"kubernetes.io/projected/6079f692-0780-41f6-a551-399974561bd3-kube-api-access-gsvxf\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.030809 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr7dc\" (UniqueName: \"kubernetes.io/projected/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-kube-api-access-rr7dc\") pod \"nova-cell1-novncproxy-0\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.030848 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.030884 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-config-data\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.030941 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.030969 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-config-data\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.031001 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af7a4b9d-480a-421d-b453-a05c3c88b9b7-logs\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.031040 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv6r5\" (UniqueName: \"kubernetes.io/projected/af7a4b9d-480a-421d-b453-a05c3c88b9b7-kube-api-access-fv6r5\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.031075 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.031720 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6079f692-0780-41f6-a551-399974561bd3-logs\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.038321 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-config-data\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.038959 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.039331 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b6497"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.050286 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.057660 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsvxf\" (UniqueName: \"kubernetes.io/projected/6079f692-0780-41f6-a551-399974561bd3-kube-api-access-gsvxf\") pod \"nova-api-0\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " pod="openstack/nova-api-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.098960 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-mc4tg"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.104142 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.105529 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-mc4tg"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.129273 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133203 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7dc\" (UniqueName: \"kubernetes.io/projected/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-kube-api-access-rr7dc\") pod \"nova-cell1-novncproxy-0\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133265 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133298 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-config-data\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133361 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtdb8\" (UniqueName: \"kubernetes.io/projected/a05c53bc-13be-4dac-b112-11f05a9e5e64-kube-api-access-wtdb8\") pod \"nova-scheduler-0\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133429 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af7a4b9d-480a-421d-b453-a05c3c88b9b7-logs\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133479 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv6r5\" (UniqueName: \"kubernetes.io/projected/af7a4b9d-480a-421d-b453-a05c3c88b9b7-kube-api-access-fv6r5\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133546 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133592 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133668 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-config-data\") pod \"nova-scheduler-0\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.133717 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.134214 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af7a4b9d-480a-421d-b453-a05c3c88b9b7-logs\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.155273 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-config-data\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.155853 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.155866 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.162603 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv6r5\" (UniqueName: \"kubernetes.io/projected/af7a4b9d-480a-421d-b453-a05c3c88b9b7-kube-api-access-fv6r5\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.159556 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.164802 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.176009 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7dc\" (UniqueName: \"kubernetes.io/projected/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-kube-api-access-rr7dc\") pod \"nova-cell1-novncproxy-0\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.198286 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.247925 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.249061 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-svc\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.249211 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-config-data\") pod \"nova-scheduler-0\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.249381 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.249418 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8zlp\" (UniqueName: \"kubernetes.io/projected/fb3dcc0e-36f7-40d7-97a4-81120af23608-kube-api-access-x8zlp\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.249611 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-swift-storage-0\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.250673 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtdb8\" (UniqueName: \"kubernetes.io/projected/a05c53bc-13be-4dac-b112-11f05a9e5e64-kube-api-access-wtdb8\") pod \"nova-scheduler-0\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.251774 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.251894 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-config\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.251987 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.255591 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-config-data\") pod \"nova-scheduler-0\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.274401 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtdb8\" (UniqueName: \"kubernetes.io/projected/a05c53bc-13be-4dac-b112-11f05a9e5e64-kube-api-access-wtdb8\") pod \"nova-scheduler-0\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.282065 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.358431 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 crc kubenswrapper[4947]: I1203 07:10:47.358593 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-svc\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.358682 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8zlp\" (UniqueName: \"kubernetes.io/projected/fb3dcc0e-36f7-40d7-97a4-81120af23608-kube-api-access-x8zlp\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.358718 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-swift-storage-0\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.358863 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.358908 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-config\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.364560 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-config\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.367372 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-svc\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.367372 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-nb\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.367662 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-sb\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.367898 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-swift-storage-0\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.389164 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8zlp\" (UniqueName: \"kubernetes.io/projected/fb3dcc0e-36f7-40d7-97a4-81120af23608-kube-api-access-x8zlp\") pod \"dnsmasq-dns-5c4475fdfc-mc4tg\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.512939 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.537604 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.701146 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-b6497"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.792539 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b6497" event={"ID":"73c410b2-0cdc-45f3-b06d-c67fd543e76c","Type":"ContainerStarted","Data":"167c120126a420e395cb05d9862522ab206b1bd4380a1dada720a6de7498162e"}
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.833768 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 07:10:47 crc kubenswrapper[4947]: W1203 07:10:47.848340 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6079f692_0780_41f6_a551_399974561bd3.slice/crio-72c0c7750517c120614d966ae2db9c5e3eaf595a55140dc4d4180bcf2492edcc WatchSource:0}: Error finding container 72c0c7750517c120614d966ae2db9c5e3eaf595a55140dc4d4180bcf2492edcc: Status 404 returned error can't find the container with id 72c0c7750517c120614d966ae2db9c5e3eaf595a55140dc4d4180bcf2492edcc
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.858905 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h5z8j"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.860397 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.863152 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.863305 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.878038 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h5z8j"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.919724 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.939774 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.976770 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl7sp\" (UniqueName: \"kubernetes.io/projected/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-kube-api-access-dl7sp\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.976830 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-scripts\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.976945 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-config-data\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:47 crc kubenswrapper[4947]: I1203 07:10:47.977231 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.078899 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.078959 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl7sp\" (UniqueName: \"kubernetes.io/projected/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-kube-api-access-dl7sp\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.078997 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-scripts\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.079030 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-config-data\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.085309 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-scripts\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.086777 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.089539 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-config-data\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.109403 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl7sp\" (UniqueName: \"kubernetes.io/projected/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-kube-api-access-dl7sp\") pod \"nova-cell1-conductor-db-sync-h5z8j\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.113583 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.124245 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-mc4tg"]
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.180188 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h5z8j"
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.675977 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h5z8j"]
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.801696 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af7a4b9d-480a-421d-b453-a05c3c88b9b7","Type":"ContainerStarted","Data":"e4270663d6f6e4b039bd8dceafb44fc8b3b787534f86cf50542c6d70175eb091"}
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.803736 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b6497" event={"ID":"73c410b2-0cdc-45f3-b06d-c67fd543e76c","Type":"ContainerStarted","Data":"79d76a072f599124aea7aa155893adf39a85cca4bb9f8e9f6280be548ec04f73"}
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.805274 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6079f692-0780-41f6-a551-399974561bd3","Type":"ContainerStarted","Data":"72c0c7750517c120614d966ae2db9c5e3eaf595a55140dc4d4180bcf2492edcc"}
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.807823 4947 generic.go:334] "Generic (PLEG): container finished" podID="fb3dcc0e-36f7-40d7-97a4-81120af23608" containerID="a6ce595ade9a9475a433b8be016f0ffd0956854fd235a02fee7a7b549bfe6fea" exitCode=0
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.807852 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" event={"ID":"fb3dcc0e-36f7-40d7-97a4-81120af23608","Type":"ContainerDied","Data":"a6ce595ade9a9475a433b8be016f0ffd0956854fd235a02fee7a7b549bfe6fea"}
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.807877 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" event={"ID":"fb3dcc0e-36f7-40d7-97a4-81120af23608","Type":"ContainerStarted","Data":"af00954a0f0e7ac18b91114b334be8feeb6fd15b01ace78c528ac044195bf51d"}
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.809765 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a05c53bc-13be-4dac-b112-11f05a9e5e64","Type":"ContainerStarted","Data":"b5033e057d24c1ec8ca5579c19beceab1a3ddd4dea9075f56d4983bf302231d2"}
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.812326 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"39f17cf5-7ef1-4773-818d-02f42a6ae7fe","Type":"ContainerStarted","Data":"d091e0f001c42b13a3b8150bc4b33ecb88ddb74f0efb1878d59c811f4cc52597"}
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.813857 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h5z8j" event={"ID":"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa","Type":"ContainerStarted","Data":"66f7a619409e1c59af3028c3c3b4257b3a1ff5688ec2dc833767398c524339cc"}
Dec 03 07:10:48 crc kubenswrapper[4947]: I1203 07:10:48.824420 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-b6497" podStartSLOduration=2.824402115 podStartE2EDuration="2.824402115s" podCreationTimestamp="2025-12-03 07:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:10:48.816573219 +0000 UTC m=+1310.077527645" watchObservedRunningTime="2025-12-03 07:10:48.824402115 +0000 UTC m=+1310.085356531"
Dec 03 07:10:49 crc kubenswrapper[4947]: I1203 07:10:49.831500 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg"
event={"ID":"fb3dcc0e-36f7-40d7-97a4-81120af23608","Type":"ContainerStarted","Data":"47543c62ad6a1a8db1e3df0f70a0cf3357416e4167bbed925cb86aecffd099af"} Dec 03 07:10:49 crc kubenswrapper[4947]: I1203 07:10:49.831912 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" Dec 03 07:10:49 crc kubenswrapper[4947]: I1203 07:10:49.835090 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h5z8j" event={"ID":"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa","Type":"ContainerStarted","Data":"e87240b302c77a397917136cebd59e0e317e8b17fdf4bfb67df917c0ec2d7cbd"} Dec 03 07:10:49 crc kubenswrapper[4947]: I1203 07:10:49.855379 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" podStartSLOduration=3.855360453 podStartE2EDuration="3.855360453s" podCreationTimestamp="2025-12-03 07:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:10:49.846686864 +0000 UTC m=+1311.107641300" watchObservedRunningTime="2025-12-03 07:10:49.855360453 +0000 UTC m=+1311.116314879" Dec 03 07:10:49 crc kubenswrapper[4947]: I1203 07:10:49.868093 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-h5z8j" podStartSLOduration=2.868047733 podStartE2EDuration="2.868047733s" podCreationTimestamp="2025-12-03 07:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:10:49.86612038 +0000 UTC m=+1311.127074806" watchObservedRunningTime="2025-12-03 07:10:49.868047733 +0000 UTC m=+1311.129002159" Dec 03 07:10:50 crc kubenswrapper[4947]: E1203 07:10:50.398450 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ebe993b_68e0_4d68_8aa7_7895d7b6629a.slice/crio-ed499040dda7940e2ed46d09277c8ec4937c63504fc0cb2e22212c1b2bb32da0\": RecentStats: unable to find data in memory cache]" Dec 03 07:10:50 crc kubenswrapper[4947]: I1203 07:10:50.504619 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:10:50 crc kubenswrapper[4947]: I1203 07:10:50.517988 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.875382 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af7a4b9d-480a-421d-b453-a05c3c88b9b7","Type":"ContainerStarted","Data":"740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292"} Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.875840 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af7a4b9d-480a-421d-b453-a05c3c88b9b7","Type":"ContainerStarted","Data":"42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275"} Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.875766 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerName="nova-metadata-log" containerID="cri-o://42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275" gracePeriod=30 Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.875817 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerName="nova-metadata-metadata" containerID="cri-o://740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292" gracePeriod=30 Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.879594 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"6079f692-0780-41f6-a551-399974561bd3","Type":"ContainerStarted","Data":"4dd4ebee4948d69a2730257fd4f989c2a863bba9bdfac93199d0f7632b8b35c5"} Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.879640 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6079f692-0780-41f6-a551-399974561bd3","Type":"ContainerStarted","Data":"7744fa6687c156278f7c710a5cc4ece059686b1a297f3eb72a825c051b3e8c23"} Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.885194 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a05c53bc-13be-4dac-b112-11f05a9e5e64","Type":"ContainerStarted","Data":"c1361038662985e5b390ba5ddb3eb2331e23a0aef8af1d8f6b49870d5a02c952"} Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.887621 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"39f17cf5-7ef1-4773-818d-02f42a6ae7fe","Type":"ContainerStarted","Data":"00aa7a80e06314abdb440924c5edafb9eaef6ba88ad895856328a1ead1571418"} Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.887856 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="39f17cf5-7ef1-4773-818d-02f42a6ae7fe" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://00aa7a80e06314abdb440924c5edafb9eaef6ba88ad895856328a1ead1571418" gracePeriod=30 Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.901277 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.08765071 podStartE2EDuration="5.901262679s" podCreationTimestamp="2025-12-03 07:10:46 +0000 UTC" firstStartedPulling="2025-12-03 07:10:47.92527043 +0000 UTC m=+1309.186224856" lastFinishedPulling="2025-12-03 07:10:50.738882399 +0000 UTC m=+1311.999836825" observedRunningTime="2025-12-03 07:10:51.896461296 +0000 UTC m=+1313.157415722" 
watchObservedRunningTime="2025-12-03 07:10:51.901262679 +0000 UTC m=+1313.162217105" Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.936162 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.127375983 podStartE2EDuration="5.936140349s" podCreationTimestamp="2025-12-03 07:10:46 +0000 UTC" firstStartedPulling="2025-12-03 07:10:47.927991265 +0000 UTC m=+1309.188945691" lastFinishedPulling="2025-12-03 07:10:50.736755631 +0000 UTC m=+1311.997710057" observedRunningTime="2025-12-03 07:10:51.914642277 +0000 UTC m=+1313.175596703" watchObservedRunningTime="2025-12-03 07:10:51.936140349 +0000 UTC m=+1313.197094775" Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.944274 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.321666757 podStartE2EDuration="5.944253983s" podCreationTimestamp="2025-12-03 07:10:46 +0000 UTC" firstStartedPulling="2025-12-03 07:10:48.116315893 +0000 UTC m=+1309.377270319" lastFinishedPulling="2025-12-03 07:10:50.738903129 +0000 UTC m=+1311.999857545" observedRunningTime="2025-12-03 07:10:51.931671596 +0000 UTC m=+1313.192626022" watchObservedRunningTime="2025-12-03 07:10:51.944253983 +0000 UTC m=+1313.205208419" Dec 03 07:10:51 crc kubenswrapper[4947]: I1203 07:10:51.958689 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.070047534 podStartE2EDuration="5.958668951s" podCreationTimestamp="2025-12-03 07:10:46 +0000 UTC" firstStartedPulling="2025-12-03 07:10:47.850265642 +0000 UTC m=+1309.111220068" lastFinishedPulling="2025-12-03 07:10:50.738887059 +0000 UTC m=+1311.999841485" observedRunningTime="2025-12-03 07:10:51.949678602 +0000 UTC m=+1313.210633038" watchObservedRunningTime="2025-12-03 07:10:51.958668951 +0000 UTC m=+1313.219623377" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.199730 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.248743 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.248799 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.454487 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.514651 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.594458 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-combined-ca-bundle\") pod \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.594734 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-config-data\") pod \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.594785 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af7a4b9d-480a-421d-b453-a05c3c88b9b7-logs\") pod \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.594878 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fv6r5\" (UniqueName: 
\"kubernetes.io/projected/af7a4b9d-480a-421d-b453-a05c3c88b9b7-kube-api-access-fv6r5\") pod \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\" (UID: \"af7a4b9d-480a-421d-b453-a05c3c88b9b7\") " Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.595126 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af7a4b9d-480a-421d-b453-a05c3c88b9b7-logs" (OuterVolumeSpecName: "logs") pod "af7a4b9d-480a-421d-b453-a05c3c88b9b7" (UID: "af7a4b9d-480a-421d-b453-a05c3c88b9b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.595344 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af7a4b9d-480a-421d-b453-a05c3c88b9b7-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.601062 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af7a4b9d-480a-421d-b453-a05c3c88b9b7-kube-api-access-fv6r5" (OuterVolumeSpecName: "kube-api-access-fv6r5") pod "af7a4b9d-480a-421d-b453-a05c3c88b9b7" (UID: "af7a4b9d-480a-421d-b453-a05c3c88b9b7"). InnerVolumeSpecName "kube-api-access-fv6r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.622378 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af7a4b9d-480a-421d-b453-a05c3c88b9b7" (UID: "af7a4b9d-480a-421d-b453-a05c3c88b9b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.628122 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-config-data" (OuterVolumeSpecName: "config-data") pod "af7a4b9d-480a-421d-b453-a05c3c88b9b7" (UID: "af7a4b9d-480a-421d-b453-a05c3c88b9b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.696782 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.696826 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fv6r5\" (UniqueName: \"kubernetes.io/projected/af7a4b9d-480a-421d-b453-a05c3c88b9b7-kube-api-access-fv6r5\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.696840 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af7a4b9d-480a-421d-b453-a05c3c88b9b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.897781 4947 generic.go:334] "Generic (PLEG): container finished" podID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerID="740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292" exitCode=0 Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.897832 4947 generic.go:334] "Generic (PLEG): container finished" podID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerID="42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275" exitCode=143 Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.897867 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"af7a4b9d-480a-421d-b453-a05c3c88b9b7","Type":"ContainerDied","Data":"740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292"} Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.897937 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af7a4b9d-480a-421d-b453-a05c3c88b9b7","Type":"ContainerDied","Data":"42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275"} Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.897951 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"af7a4b9d-480a-421d-b453-a05c3c88b9b7","Type":"ContainerDied","Data":"e4270663d6f6e4b039bd8dceafb44fc8b3b787534f86cf50542c6d70175eb091"} Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.897951 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.897971 4947 scope.go:117] "RemoveContainer" containerID="740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.924710 4947 scope.go:117] "RemoveContainer" containerID="42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.951136 4947 scope.go:117] "RemoveContainer" containerID="740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292" Dec 03 07:10:52 crc kubenswrapper[4947]: E1203 07:10:52.953935 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292\": container with ID starting with 740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292 not found: ID does not exist" containerID="740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.953990 4947 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292"} err="failed to get container status \"740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292\": rpc error: code = NotFound desc = could not find container \"740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292\": container with ID starting with 740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292 not found: ID does not exist" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.954022 4947 scope.go:117] "RemoveContainer" containerID="42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275" Dec 03 07:10:52 crc kubenswrapper[4947]: E1203 07:10:52.957913 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275\": container with ID starting with 42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275 not found: ID does not exist" containerID="42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.957970 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275"} err="failed to get container status \"42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275\": rpc error: code = NotFound desc = could not find container \"42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275\": container with ID starting with 42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275 not found: ID does not exist" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.958006 4947 scope.go:117] "RemoveContainer" containerID="740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 
07:10:52.958461 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292"} err="failed to get container status \"740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292\": rpc error: code = NotFound desc = could not find container \"740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292\": container with ID starting with 740c81b246eee1044b1ec083cbd9cae57ecc0373bda8bf20601ed85685958292 not found: ID does not exist" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.958507 4947 scope.go:117] "RemoveContainer" containerID="42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.958756 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275"} err="failed to get container status \"42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275\": rpc error: code = NotFound desc = could not find container \"42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275\": container with ID starting with 42a61fa85c2675a8998d9fc903fe182c3a014ca37aec22f33bb4af405157e275 not found: ID does not exist" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.959135 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.973205 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.985645 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:10:52 crc kubenswrapper[4947]: E1203 07:10:52.987692 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerName="nova-metadata-metadata" Dec 03 07:10:52 crc 
kubenswrapper[4947]: I1203 07:10:52.987722 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerName="nova-metadata-metadata" Dec 03 07:10:52 crc kubenswrapper[4947]: E1203 07:10:52.987748 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerName="nova-metadata-log" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.987757 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerName="nova-metadata-log" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.988023 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerName="nova-metadata-metadata" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.988048 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" containerName="nova-metadata-log" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.989402 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.995946 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.996052 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 07:10:52 crc kubenswrapper[4947]: I1203 07:10:52.996601 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.097636 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af7a4b9d-480a-421d-b453-a05c3c88b9b7" path="/var/lib/kubelet/pods/af7a4b9d-480a-421d-b453-a05c3c88b9b7/volumes" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.103879 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.103944 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-config-data\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.103996 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc 
kubenswrapper[4947]: I1203 07:10:53.104126 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-logs\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.104319 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwpgk\" (UniqueName: \"kubernetes.io/projected/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-kube-api-access-cwpgk\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.206577 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwpgk\" (UniqueName: \"kubernetes.io/projected/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-kube-api-access-cwpgk\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.206662 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.206692 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-config-data\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.206750 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.206818 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-logs\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.207638 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-logs\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.211598 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-config-data\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.211969 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.215884 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc 
kubenswrapper[4947]: I1203 07:10:53.228893 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwpgk\" (UniqueName: \"kubernetes.io/projected/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-kube-api-access-cwpgk\") pod \"nova-metadata-0\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.310462 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.426508 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.427004 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="76881366-670a-494f-ba95-7c5187ba80e8" containerName="kube-state-metrics" containerID="cri-o://93a8e7d2461c418769befcef894e43ae7999b371ae989342fcebf8d6081bc7bb" gracePeriod=30 Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.780580 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:10:53 crc kubenswrapper[4947]: W1203 07:10:53.795380 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a891a4d_ebae_46f8_b68f_ff6404f1d27e.slice/crio-6367923194f0cc13ac5851931115713879bff63e3a5b8da986b9508cddaf3226 WatchSource:0}: Error finding container 6367923194f0cc13ac5851931115713879bff63e3a5b8da986b9508cddaf3226: Status 404 returned error can't find the container with id 6367923194f0cc13ac5851931115713879bff63e3a5b8da986b9508cddaf3226 Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.911758 4947 generic.go:334] "Generic (PLEG): container finished" podID="76881366-670a-494f-ba95-7c5187ba80e8" containerID="93a8e7d2461c418769befcef894e43ae7999b371ae989342fcebf8d6081bc7bb" exitCode=2 Dec 03 07:10:53 
crc kubenswrapper[4947]: I1203 07:10:53.911826 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76881366-670a-494f-ba95-7c5187ba80e8","Type":"ContainerDied","Data":"93a8e7d2461c418769befcef894e43ae7999b371ae989342fcebf8d6081bc7bb"} Dec 03 07:10:53 crc kubenswrapper[4947]: I1203 07:10:53.913157 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a891a4d-ebae-46f8-b68f-ff6404f1d27e","Type":"ContainerStarted","Data":"6367923194f0cc13ac5851931115713879bff63e3a5b8da986b9508cddaf3226"} Dec 03 07:10:54 crc kubenswrapper[4947]: I1203 07:10:54.290835 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:10:54 crc kubenswrapper[4947]: I1203 07:10:54.432230 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4znd\" (UniqueName: \"kubernetes.io/projected/76881366-670a-494f-ba95-7c5187ba80e8-kube-api-access-d4znd\") pod \"76881366-670a-494f-ba95-7c5187ba80e8\" (UID: \"76881366-670a-494f-ba95-7c5187ba80e8\") " Dec 03 07:10:54 crc kubenswrapper[4947]: I1203 07:10:54.444305 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76881366-670a-494f-ba95-7c5187ba80e8-kube-api-access-d4znd" (OuterVolumeSpecName: "kube-api-access-d4znd") pod "76881366-670a-494f-ba95-7c5187ba80e8" (UID: "76881366-670a-494f-ba95-7c5187ba80e8"). InnerVolumeSpecName "kube-api-access-d4znd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:54 crc kubenswrapper[4947]: I1203 07:10:54.534637 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4znd\" (UniqueName: \"kubernetes.io/projected/76881366-670a-494f-ba95-7c5187ba80e8-kube-api-access-d4znd\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:54 crc kubenswrapper[4947]: I1203 07:10:54.931206 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"76881366-670a-494f-ba95-7c5187ba80e8","Type":"ContainerDied","Data":"19319df9642e415de8d25c9c6f6fb4e4f404292a76344615b35465139baacb3a"} Dec 03 07:10:54 crc kubenswrapper[4947]: I1203 07:10:54.931791 4947 scope.go:117] "RemoveContainer" containerID="93a8e7d2461c418769befcef894e43ae7999b371ae989342fcebf8d6081bc7bb" Dec 03 07:10:54 crc kubenswrapper[4947]: I1203 07:10:54.931700 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:10:54 crc kubenswrapper[4947]: I1203 07:10:54.939826 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a891a4d-ebae-46f8-b68f-ff6404f1d27e","Type":"ContainerStarted","Data":"08cce81fc33827ee0e084ab4ea157213336d668fcf2cf97064310593dc6b89bd"} Dec 03 07:10:54 crc kubenswrapper[4947]: I1203 07:10:54.939894 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a891a4d-ebae-46f8-b68f-ff6404f1d27e","Type":"ContainerStarted","Data":"9bcd71ef2a21098bc6f9bdc9f0fb8f302f3a6cc0c6e8971885458c2c85b20a28"} Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.000277 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.000255172 podStartE2EDuration="3.000255172s" podCreationTimestamp="2025-12-03 07:10:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 07:10:54.967750306 +0000 UTC m=+1316.228704842" watchObservedRunningTime="2025-12-03 07:10:55.000255172 +0000 UTC m=+1316.261209598" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.002705 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.013839 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.026126 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:10:55 crc kubenswrapper[4947]: E1203 07:10:55.026735 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76881366-670a-494f-ba95-7c5187ba80e8" containerName="kube-state-metrics" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.026754 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="76881366-670a-494f-ba95-7c5187ba80e8" containerName="kube-state-metrics" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.026959 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="76881366-670a-494f-ba95-7c5187ba80e8" containerName="kube-state-metrics" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.028694 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.032824 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.032915 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.033551 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.044164 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.044237 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.044309 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.044327 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9j5\" (UniqueName: 
\"kubernetes.io/projected/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-api-access-5j9j5\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.094098 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76881366-670a-494f-ba95-7c5187ba80e8" path="/var/lib/kubelet/pods/76881366-670a-494f-ba95-7c5187ba80e8/volumes" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.146094 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.146184 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.146275 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.146293 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9j5\" (UniqueName: \"kubernetes.io/projected/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-api-access-5j9j5\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " 
pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.150902 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.151264 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.152028 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.168843 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9j5\" (UniqueName: \"kubernetes.io/projected/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-api-access-5j9j5\") pod \"kube-state-metrics-0\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.284456 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.284869 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="sg-core" 
containerID="cri-o://fd3a7ddd942b0bf7c993d8ec4b13420f86e53176d46ec11df8e6fb68de4c06e1" gracePeriod=30 Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.284926 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="ceilometer-notification-agent" containerID="cri-o://1a5ff4a6cb09f9a251e9389d617500670f0e52741fc8c1ae750ba2e5427dcd2c" gracePeriod=30 Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.284869 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="proxy-httpd" containerID="cri-o://f7bda5b91e9919c7430bc8360e444d25515cab851b10eefd33d0b2c98ed8f764" gracePeriod=30 Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.284810 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="ceilometer-central-agent" containerID="cri-o://4dd21e18ea9b766d173cac4a7cca8ab19e1f895a693d2aeb8930b570ca37b679" gracePeriod=30 Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.406976 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.948716 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.950663 4947 generic.go:334] "Generic (PLEG): container finished" podID="86094256-7d4a-4efa-a139-6307919b49df" containerID="f7bda5b91e9919c7430bc8360e444d25515cab851b10eefd33d0b2c98ed8f764" exitCode=0 Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.950700 4947 generic.go:334] "Generic (PLEG): container finished" podID="86094256-7d4a-4efa-a139-6307919b49df" containerID="fd3a7ddd942b0bf7c993d8ec4b13420f86e53176d46ec11df8e6fb68de4c06e1" exitCode=2 Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.950711 4947 generic.go:334] "Generic (PLEG): container finished" podID="86094256-7d4a-4efa-a139-6307919b49df" containerID="4dd21e18ea9b766d173cac4a7cca8ab19e1f895a693d2aeb8930b570ca37b679" exitCode=0 Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.950741 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerDied","Data":"f7bda5b91e9919c7430bc8360e444d25515cab851b10eefd33d0b2c98ed8f764"} Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.950777 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerDied","Data":"fd3a7ddd942b0bf7c993d8ec4b13420f86e53176d46ec11df8e6fb68de4c06e1"} Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.950788 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerDied","Data":"4dd21e18ea9b766d173cac4a7cca8ab19e1f895a693d2aeb8930b570ca37b679"} Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.953262 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="73c410b2-0cdc-45f3-b06d-c67fd543e76c" containerID="79d76a072f599124aea7aa155893adf39a85cca4bb9f8e9f6280be548ec04f73" exitCode=0 Dec 03 07:10:55 crc kubenswrapper[4947]: I1203 07:10:55.953419 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b6497" event={"ID":"73c410b2-0cdc-45f3-b06d-c67fd543e76c","Type":"ContainerDied","Data":"79d76a072f599124aea7aa155893adf39a85cca4bb9f8e9f6280be548ec04f73"} Dec 03 07:10:55 crc kubenswrapper[4947]: W1203 07:10:55.954421 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90736cf6_0db1_44a8_b285_4d319f0951f8.slice/crio-316b25cd33a810b4ce65bf015ec2ce9702d7d22b0a407e7f649a18f9f2d17e0e WatchSource:0}: Error finding container 316b25cd33a810b4ce65bf015ec2ce9702d7d22b0a407e7f649a18f9f2d17e0e: Status 404 returned error can't find the container with id 316b25cd33a810b4ce65bf015ec2ce9702d7d22b0a407e7f649a18f9f2d17e0e Dec 03 07:10:56 crc kubenswrapper[4947]: I1203 07:10:56.962913 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"90736cf6-0db1-44a8-b285-4d319f0951f8","Type":"ContainerStarted","Data":"3fe0141e6c7dfad5f3686eb0461017dbced765ef02bc95a048da886e97ae70e3"} Dec 03 07:10:56 crc kubenswrapper[4947]: I1203 07:10:56.963845 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 07:10:56 crc kubenswrapper[4947]: I1203 07:10:56.963867 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"90736cf6-0db1-44a8-b285-4d319f0951f8","Type":"ContainerStarted","Data":"316b25cd33a810b4ce65bf015ec2ce9702d7d22b0a407e7f649a18f9f2d17e0e"} Dec 03 07:10:56 crc kubenswrapper[4947]: I1203 07:10:56.965465 4947 generic.go:334] "Generic (PLEG): container finished" podID="b6e0b77d-5e1e-4a81-abad-d35da9b42aaa" 
containerID="e87240b302c77a397917136cebd59e0e317e8b17fdf4bfb67df917c0ec2d7cbd" exitCode=0 Dec 03 07:10:56 crc kubenswrapper[4947]: I1203 07:10:56.965651 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h5z8j" event={"ID":"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa","Type":"ContainerDied","Data":"e87240b302c77a397917136cebd59e0e317e8b17fdf4bfb67df917c0ec2d7cbd"} Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.006298 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.635896172 podStartE2EDuration="3.006277498s" podCreationTimestamp="2025-12-03 07:10:54 +0000 UTC" firstStartedPulling="2025-12-03 07:10:55.957043746 +0000 UTC m=+1317.217998172" lastFinishedPulling="2025-12-03 07:10:56.327425062 +0000 UTC m=+1317.588379498" observedRunningTime="2025-12-03 07:10:56.979768468 +0000 UTC m=+1318.240722894" watchObservedRunningTime="2025-12-03 07:10:57.006277498 +0000 UTC m=+1318.267231924" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.129988 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.130017 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.318024 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b6497" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.395384 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-config-data\") pod \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.395585 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-scripts\") pod \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.395656 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8trd\" (UniqueName: \"kubernetes.io/projected/73c410b2-0cdc-45f3-b06d-c67fd543e76c-kube-api-access-b8trd\") pod \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.395758 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-combined-ca-bundle\") pod \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\" (UID: \"73c410b2-0cdc-45f3-b06d-c67fd543e76c\") " Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.402100 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c410b2-0cdc-45f3-b06d-c67fd543e76c-kube-api-access-b8trd" (OuterVolumeSpecName: "kube-api-access-b8trd") pod "73c410b2-0cdc-45f3-b06d-c67fd543e76c" (UID: "73c410b2-0cdc-45f3-b06d-c67fd543e76c"). InnerVolumeSpecName "kube-api-access-b8trd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.404867 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-scripts" (OuterVolumeSpecName: "scripts") pod "73c410b2-0cdc-45f3-b06d-c67fd543e76c" (UID: "73c410b2-0cdc-45f3-b06d-c67fd543e76c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.424579 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73c410b2-0cdc-45f3-b06d-c67fd543e76c" (UID: "73c410b2-0cdc-45f3-b06d-c67fd543e76c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.430471 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-config-data" (OuterVolumeSpecName: "config-data") pod "73c410b2-0cdc-45f3-b06d-c67fd543e76c" (UID: "73c410b2-0cdc-45f3-b06d-c67fd543e76c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.497863 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.497904 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.497919 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8trd\" (UniqueName: \"kubernetes.io/projected/73c410b2-0cdc-45f3-b06d-c67fd543e76c-kube-api-access-b8trd\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.497935 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73c410b2-0cdc-45f3-b06d-c67fd543e76c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.513927 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.539563 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.541728 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.615694 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-s2lt2"] Dec 03 07:10:57 crc kubenswrapper[4947]: I1203 07:10:57.616008 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" 
podUID="64a14fe8-90be-4d6e-890f-a42fd3ea6894" containerName="dnsmasq-dns" containerID="cri-o://812f24422eebf2b9f2ac39a2da773fc81fcbeacd0d393438e6eb4d93eb191a7e" gracePeriod=10 Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.007064 4947 generic.go:334] "Generic (PLEG): container finished" podID="64a14fe8-90be-4d6e-890f-a42fd3ea6894" containerID="812f24422eebf2b9f2ac39a2da773fc81fcbeacd0d393438e6eb4d93eb191a7e" exitCode=0 Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.007186 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" event={"ID":"64a14fe8-90be-4d6e-890f-a42fd3ea6894","Type":"ContainerDied","Data":"812f24422eebf2b9f2ac39a2da773fc81fcbeacd0d393438e6eb4d93eb191a7e"} Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.012980 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-b6497" event={"ID":"73c410b2-0cdc-45f3-b06d-c67fd543e76c","Type":"ContainerDied","Data":"167c120126a420e395cb05d9862522ab206b1bd4380a1dada720a6de7498162e"} Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.013033 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="167c120126a420e395cb05d9862522ab206b1bd4380a1dada720a6de7498162e" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.013119 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-b6497" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.066763 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.075753 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.110042 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncthr\" (UniqueName: \"kubernetes.io/projected/64a14fe8-90be-4d6e-890f-a42fd3ea6894-kube-api-access-ncthr\") pod \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.110320 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-nb\") pod \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.110437 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-svc\") pod \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.110655 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-swift-storage-0\") pod \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.110820 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-config\") pod \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.110997 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-sb\") pod \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\" (UID: \"64a14fe8-90be-4d6e-890f-a42fd3ea6894\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.123816 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a14fe8-90be-4d6e-890f-a42fd3ea6894-kube-api-access-ncthr" (OuterVolumeSpecName: "kube-api-access-ncthr") pod "64a14fe8-90be-4d6e-890f-a42fd3ea6894" (UID: "64a14fe8-90be-4d6e-890f-a42fd3ea6894"). InnerVolumeSpecName "kube-api-access-ncthr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.159292 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.159519 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-log" containerID="cri-o://7744fa6687c156278f7c710a5cc4ece059686b1a297f3eb72a825c051b3e8c23" gracePeriod=30 Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.160159 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-api" containerID="cri-o://4dd4ebee4948d69a2730257fd4f989c2a863bba9bdfac93199d0f7632b8b35c5" gracePeriod=30 Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.171917 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.172000 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.184:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.197384 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.197871 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" containerName="nova-metadata-log" containerID="cri-o://9bcd71ef2a21098bc6f9bdc9f0fb8f302f3a6cc0c6e8971885458c2c85b20a28" gracePeriod=30 Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.198551 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" containerName="nova-metadata-metadata" containerID="cri-o://08cce81fc33827ee0e084ab4ea157213336d668fcf2cf97064310593dc6b89bd" gracePeriod=30 Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.213972 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncthr\" (UniqueName: \"kubernetes.io/projected/64a14fe8-90be-4d6e-890f-a42fd3ea6894-kube-api-access-ncthr\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.236473 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "64a14fe8-90be-4d6e-890f-a42fd3ea6894" (UID: "64a14fe8-90be-4d6e-890f-a42fd3ea6894"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.241319 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "64a14fe8-90be-4d6e-890f-a42fd3ea6894" (UID: "64a14fe8-90be-4d6e-890f-a42fd3ea6894"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.269259 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-config" (OuterVolumeSpecName: "config") pod "64a14fe8-90be-4d6e-890f-a42fd3ea6894" (UID: "64a14fe8-90be-4d6e-890f-a42fd3ea6894"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.311402 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.311456 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.315529 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.315549 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.315558 4947 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.318773 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "64a14fe8-90be-4d6e-890f-a42fd3ea6894" (UID: "64a14fe8-90be-4d6e-890f-a42fd3ea6894"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.410648 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64a14fe8-90be-4d6e-890f-a42fd3ea6894" (UID: "64a14fe8-90be-4d6e-890f-a42fd3ea6894"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.418746 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.418783 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64a14fe8-90be-4d6e-890f-a42fd3ea6894-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.560461 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h5z8j" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.621996 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-config-data\") pod \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.622056 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-scripts\") pod \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.622182 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl7sp\" (UniqueName: \"kubernetes.io/projected/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-kube-api-access-dl7sp\") pod \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.622334 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-combined-ca-bundle\") pod \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\" (UID: \"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa\") " Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.626717 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-kube-api-access-dl7sp" (OuterVolumeSpecName: "kube-api-access-dl7sp") pod "b6e0b77d-5e1e-4a81-abad-d35da9b42aaa" (UID: "b6e0b77d-5e1e-4a81-abad-d35da9b42aaa"). InnerVolumeSpecName "kube-api-access-dl7sp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.627596 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-scripts" (OuterVolumeSpecName: "scripts") pod "b6e0b77d-5e1e-4a81-abad-d35da9b42aaa" (UID: "b6e0b77d-5e1e-4a81-abad-d35da9b42aaa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.649667 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-config-data" (OuterVolumeSpecName: "config-data") pod "b6e0b77d-5e1e-4a81-abad-d35da9b42aaa" (UID: "b6e0b77d-5e1e-4a81-abad-d35da9b42aaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.650000 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6e0b77d-5e1e-4a81-abad-d35da9b42aaa" (UID: "b6e0b77d-5e1e-4a81-abad-d35da9b42aaa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.698191 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.724944 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl7sp\" (UniqueName: \"kubernetes.io/projected/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-kube-api-access-dl7sp\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.724975 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.724984 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:58 crc kubenswrapper[4947]: I1203 07:10:58.724992 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.042589 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-h5z8j" event={"ID":"b6e0b77d-5e1e-4a81-abad-d35da9b42aaa","Type":"ContainerDied","Data":"66f7a619409e1c59af3028c3c3b4257b3a1ff5688ec2dc833767398c524339cc"} Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.042624 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66f7a619409e1c59af3028c3c3b4257b3a1ff5688ec2dc833767398c524339cc" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.042744 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-h5z8j" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.048878 4947 generic.go:334] "Generic (PLEG): container finished" podID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" containerID="08cce81fc33827ee0e084ab4ea157213336d668fcf2cf97064310593dc6b89bd" exitCode=0 Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.049139 4947 generic.go:334] "Generic (PLEG): container finished" podID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" containerID="9bcd71ef2a21098bc6f9bdc9f0fb8f302f3a6cc0c6e8971885458c2c85b20a28" exitCode=143 Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.049211 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a891a4d-ebae-46f8-b68f-ff6404f1d27e","Type":"ContainerDied","Data":"08cce81fc33827ee0e084ab4ea157213336d668fcf2cf97064310593dc6b89bd"} Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.049238 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a891a4d-ebae-46f8-b68f-ff6404f1d27e","Type":"ContainerDied","Data":"9bcd71ef2a21098bc6f9bdc9f0fb8f302f3a6cc0c6e8971885458c2c85b20a28"} Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.052127 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" event={"ID":"64a14fe8-90be-4d6e-890f-a42fd3ea6894","Type":"ContainerDied","Data":"0cd8242708aaa2f64241bde15baff96c45c872ddae2768e869246a42ff93030a"} Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.052187 4947 scope.go:117] "RemoveContainer" containerID="812f24422eebf2b9f2ac39a2da773fc81fcbeacd0d393438e6eb4d93eb191a7e" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.054519 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c77d8b67c-s2lt2" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.065831 4947 generic.go:334] "Generic (PLEG): container finished" podID="6079f692-0780-41f6-a551-399974561bd3" containerID="7744fa6687c156278f7c710a5cc4ece059686b1a297f3eb72a825c051b3e8c23" exitCode=143 Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.066482 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6079f692-0780-41f6-a551-399974561bd3","Type":"ContainerDied","Data":"7744fa6687c156278f7c710a5cc4ece059686b1a297f3eb72a825c051b3e8c23"} Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.100303 4947 scope.go:117] "RemoveContainer" containerID="85c3379202dda47b0fbc87de0172fe9a6295c490b27d81791913aeafe5f28ddb" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.116950 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 07:10:59 crc kubenswrapper[4947]: E1203 07:10:59.117760 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a14fe8-90be-4d6e-890f-a42fd3ea6894" containerName="init" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.117791 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a14fe8-90be-4d6e-890f-a42fd3ea6894" containerName="init" Dec 03 07:10:59 crc kubenswrapper[4947]: E1203 07:10:59.117806 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c410b2-0cdc-45f3-b06d-c67fd543e76c" containerName="nova-manage" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.117812 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c410b2-0cdc-45f3-b06d-c67fd543e76c" containerName="nova-manage" Dec 03 07:10:59 crc kubenswrapper[4947]: E1203 07:10:59.117826 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e0b77d-5e1e-4a81-abad-d35da9b42aaa" containerName="nova-cell1-conductor-db-sync" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 
07:10:59.117832 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e0b77d-5e1e-4a81-abad-d35da9b42aaa" containerName="nova-cell1-conductor-db-sync" Dec 03 07:10:59 crc kubenswrapper[4947]: E1203 07:10:59.117858 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a14fe8-90be-4d6e-890f-a42fd3ea6894" containerName="dnsmasq-dns" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.117864 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a14fe8-90be-4d6e-890f-a42fd3ea6894" containerName="dnsmasq-dns" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.118095 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e0b77d-5e1e-4a81-abad-d35da9b42aaa" containerName="nova-cell1-conductor-db-sync" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.118121 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a14fe8-90be-4d6e-890f-a42fd3ea6894" containerName="dnsmasq-dns" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.118134 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c410b2-0cdc-45f3-b06d-c67fd543e76c" containerName="nova-manage" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.118718 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.118800 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.120746 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.135268 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.135443 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.135570 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh77l\" (UniqueName: \"kubernetes.io/projected/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-kube-api-access-fh77l\") pod \"nova-cell1-conductor-0\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.229830 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.237615 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh77l\" (UniqueName: \"kubernetes.io/projected/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-kube-api-access-fh77l\") pod \"nova-cell1-conductor-0\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.237744 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.237901 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.243053 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.247731 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.275599 4947 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-s2lt2"] Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.282373 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh77l\" (UniqueName: \"kubernetes.io/projected/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-kube-api-access-fh77l\") pod \"nova-cell1-conductor-0\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.288211 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c77d8b67c-s2lt2"] Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.340214 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-combined-ca-bundle\") pod \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.340294 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-logs\") pod \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.340437 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-config-data\") pod \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.340477 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwpgk\" (UniqueName: \"kubernetes.io/projected/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-kube-api-access-cwpgk\") pod \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\" 
(UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.340605 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-nova-metadata-tls-certs\") pod \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\" (UID: \"4a891a4d-ebae-46f8-b68f-ff6404f1d27e\") " Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.341594 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-logs" (OuterVolumeSpecName: "logs") pod "4a891a4d-ebae-46f8-b68f-ff6404f1d27e" (UID: "4a891a4d-ebae-46f8-b68f-ff6404f1d27e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.345481 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-kube-api-access-cwpgk" (OuterVolumeSpecName: "kube-api-access-cwpgk") pod "4a891a4d-ebae-46f8-b68f-ff6404f1d27e" (UID: "4a891a4d-ebae-46f8-b68f-ff6404f1d27e"). InnerVolumeSpecName "kube-api-access-cwpgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.369383 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-config-data" (OuterVolumeSpecName: "config-data") pod "4a891a4d-ebae-46f8-b68f-ff6404f1d27e" (UID: "4a891a4d-ebae-46f8-b68f-ff6404f1d27e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.379571 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a891a4d-ebae-46f8-b68f-ff6404f1d27e" (UID: "4a891a4d-ebae-46f8-b68f-ff6404f1d27e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.401348 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4a891a4d-ebae-46f8-b68f-ff6404f1d27e" (UID: "4a891a4d-ebae-46f8-b68f-ff6404f1d27e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.443429 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.443876 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwpgk\" (UniqueName: \"kubernetes.io/projected/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-kube-api-access-cwpgk\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.443895 4947 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.443908 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.443921 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a891a4d-ebae-46f8-b68f-ff6404f1d27e-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.462749 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 07:10:59 crc kubenswrapper[4947]: E1203 07:10:59.773739 4947 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/e75031055e7027491fc6a493029e0a2cfedb8ca06ecf69e1888672109d1734bc/diff" to get inode usage: stat /var/lib/containers/storage/overlay/e75031055e7027491fc6a493029e0a2cfedb8ca06ecf69e1888672109d1734bc/diff: no such file or directory, extraDiskErr: Dec 03 07:10:59 crc kubenswrapper[4947]: I1203 07:10:59.877400 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.077747 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4","Type":"ContainerStarted","Data":"b2676120afc69da3f98f532e81b7734ae53d0aa77b47c4fe30d93bbbb6ab2a0a"} Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.078025 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.080196 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4a891a4d-ebae-46f8-b68f-ff6404f1d27e","Type":"ContainerDied","Data":"6367923194f0cc13ac5851931115713879bff63e3a5b8da986b9508cddaf3226"} Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.080247 4947 scope.go:117] 
"RemoveContainer" containerID="08cce81fc33827ee0e084ab4ea157213336d668fcf2cf97064310593dc6b89bd" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.080344 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.083806 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a05c53bc-13be-4dac-b112-11f05a9e5e64" containerName="nova-scheduler-scheduler" containerID="cri-o://c1361038662985e5b390ba5ddb3eb2331e23a0aef8af1d8f6b49870d5a02c952" gracePeriod=30 Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.086432 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.086469 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.086516 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.087095 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c728d116f3c53bcc6037be4215b8c5da0c570fa9f0fddd2b5bb621a8286fe726"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.087135 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://c728d116f3c53bcc6037be4215b8c5da0c570fa9f0fddd2b5bb621a8286fe726" gracePeriod=600 Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.100794 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.100778987 podStartE2EDuration="1.100778987s" podCreationTimestamp="2025-12-03 07:10:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:00.091237174 +0000 UTC m=+1321.352191590" watchObservedRunningTime="2025-12-03 07:11:00.100778987 +0000 UTC m=+1321.361733413" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.121668 4947 scope.go:117] "RemoveContainer" containerID="9bcd71ef2a21098bc6f9bdc9f0fb8f302f3a6cc0c6e8971885458c2c85b20a28" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.144337 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.154450 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.180776 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:00 crc kubenswrapper[4947]: E1203 07:11:00.181129 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" containerName="nova-metadata-metadata" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.181143 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" 
containerName="nova-metadata-metadata" Dec 03 07:11:00 crc kubenswrapper[4947]: E1203 07:11:00.181167 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" containerName="nova-metadata-log" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.181173 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" containerName="nova-metadata-log" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.181342 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" containerName="nova-metadata-metadata" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.181371 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" containerName="nova-metadata-log" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.185939 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.192678 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.192735 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.195464 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.258619 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-logs\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.258729 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgr5w\" (UniqueName: \"kubernetes.io/projected/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-kube-api-access-rgr5w\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.258768 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-config-data\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.258903 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.258923 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.360183 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgr5w\" (UniqueName: \"kubernetes.io/projected/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-kube-api-access-rgr5w\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.360249 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-config-data\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.360438 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.361235 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.361718 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-logs\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.362374 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-logs\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.366158 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 
07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.372813 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.373569 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-config-data\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.380966 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgr5w\" (UniqueName: \"kubernetes.io/projected/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-kube-api-access-rgr5w\") pod \"nova-metadata-0\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " pod="openstack/nova-metadata-0" Dec 03 07:11:00 crc kubenswrapper[4947]: I1203 07:11:00.516738 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:11:01 crc kubenswrapper[4947]: I1203 07:11:01.009548 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:01 crc kubenswrapper[4947]: I1203 07:11:01.094148 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a891a4d-ebae-46f8-b68f-ff6404f1d27e" path="/var/lib/kubelet/pods/4a891a4d-ebae-46f8-b68f-ff6404f1d27e/volumes" Dec 03 07:11:01 crc kubenswrapper[4947]: I1203 07:11:01.095384 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a14fe8-90be-4d6e-890f-a42fd3ea6894" path="/var/lib/kubelet/pods/64a14fe8-90be-4d6e-890f-a42fd3ea6894/volumes" Dec 03 07:11:01 crc kubenswrapper[4947]: I1203 07:11:01.097072 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb","Type":"ContainerStarted","Data":"674be2f90e601d970ee9042a032f26841eb718e2becd084ed2f6b2dcfeb35cc4"} Dec 03 07:11:01 crc kubenswrapper[4947]: I1203 07:11:01.101034 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="c728d116f3c53bcc6037be4215b8c5da0c570fa9f0fddd2b5bb621a8286fe726" exitCode=0 Dec 03 07:11:01 crc kubenswrapper[4947]: I1203 07:11:01.101107 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"c728d116f3c53bcc6037be4215b8c5da0c570fa9f0fddd2b5bb621a8286fe726"} Dec 03 07:11:01 crc kubenswrapper[4947]: I1203 07:11:01.101134 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946"} Dec 03 07:11:01 crc kubenswrapper[4947]: I1203 
07:11:01.101151 4947 scope.go:117] "RemoveContainer" containerID="766a670181f7035ab6014723203ba9e8cefa098d94e396af08b21929329a9713" Dec 03 07:11:01 crc kubenswrapper[4947]: I1203 07:11:01.103401 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4","Type":"ContainerStarted","Data":"05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4"} Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.116604 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb","Type":"ContainerStarted","Data":"dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787"} Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.117103 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb","Type":"ContainerStarted","Data":"cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab"} Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.121108 4947 generic.go:334] "Generic (PLEG): container finished" podID="86094256-7d4a-4efa-a139-6307919b49df" containerID="1a5ff4a6cb09f9a251e9389d617500670f0e52741fc8c1ae750ba2e5427dcd2c" exitCode=0 Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.121195 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerDied","Data":"1a5ff4a6cb09f9a251e9389d617500670f0e52741fc8c1ae750ba2e5427dcd2c"} Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.146450 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.146427136 podStartE2EDuration="2.146427136s" podCreationTimestamp="2025-12-03 07:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 07:11:02.142911178 +0000 UTC m=+1323.403865624" watchObservedRunningTime="2025-12-03 07:11:02.146427136 +0000 UTC m=+1323.407381562" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.444140 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.505562 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txg6d\" (UniqueName: \"kubernetes.io/projected/86094256-7d4a-4efa-a139-6307919b49df-kube-api-access-txg6d\") pod \"86094256-7d4a-4efa-a139-6307919b49df\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.505719 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-sg-core-conf-yaml\") pod \"86094256-7d4a-4efa-a139-6307919b49df\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.505790 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-log-httpd\") pod \"86094256-7d4a-4efa-a139-6307919b49df\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.505881 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-run-httpd\") pod \"86094256-7d4a-4efa-a139-6307919b49df\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.505937 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-config-data\") pod \"86094256-7d4a-4efa-a139-6307919b49df\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.505959 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-scripts\") pod \"86094256-7d4a-4efa-a139-6307919b49df\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.505996 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-combined-ca-bundle\") pod \"86094256-7d4a-4efa-a139-6307919b49df\" (UID: \"86094256-7d4a-4efa-a139-6307919b49df\") " Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.506183 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "86094256-7d4a-4efa-a139-6307919b49df" (UID: "86094256-7d4a-4efa-a139-6307919b49df"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.506342 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "86094256-7d4a-4efa-a139-6307919b49df" (UID: "86094256-7d4a-4efa-a139-6307919b49df"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.506520 4947 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.506538 4947 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86094256-7d4a-4efa-a139-6307919b49df-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:02 crc kubenswrapper[4947]: E1203 07:11:02.518070 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1361038662985e5b390ba5ddb3eb2331e23a0aef8af1d8f6b49870d5a02c952" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:11:02 crc kubenswrapper[4947]: E1203 07:11:02.523355 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1361038662985e5b390ba5ddb3eb2331e23a0aef8af1d8f6b49870d5a02c952" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:11:02 crc kubenswrapper[4947]: E1203 07:11:02.525073 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c1361038662985e5b390ba5ddb3eb2331e23a0aef8af1d8f6b49870d5a02c952" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:11:02 crc kubenswrapper[4947]: E1203 07:11:02.525144 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a05c53bc-13be-4dac-b112-11f05a9e5e64" containerName="nova-scheduler-scheduler" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.542785 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-scripts" (OuterVolumeSpecName: "scripts") pod "86094256-7d4a-4efa-a139-6307919b49df" (UID: "86094256-7d4a-4efa-a139-6307919b49df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.542959 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86094256-7d4a-4efa-a139-6307919b49df-kube-api-access-txg6d" (OuterVolumeSpecName: "kube-api-access-txg6d") pod "86094256-7d4a-4efa-a139-6307919b49df" (UID: "86094256-7d4a-4efa-a139-6307919b49df"). InnerVolumeSpecName "kube-api-access-txg6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.545770 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "86094256-7d4a-4efa-a139-6307919b49df" (UID: "86094256-7d4a-4efa-a139-6307919b49df"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.597538 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86094256-7d4a-4efa-a139-6307919b49df" (UID: "86094256-7d4a-4efa-a139-6307919b49df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.607577 4947 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.607607 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.607616 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.607625 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txg6d\" (UniqueName: \"kubernetes.io/projected/86094256-7d4a-4efa-a139-6307919b49df-kube-api-access-txg6d\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.632592 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-config-data" (OuterVolumeSpecName: "config-data") pod "86094256-7d4a-4efa-a139-6307919b49df" (UID: "86094256-7d4a-4efa-a139-6307919b49df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:02 crc kubenswrapper[4947]: I1203 07:11:02.709081 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86094256-7d4a-4efa-a139-6307919b49df-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.138523 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86094256-7d4a-4efa-a139-6307919b49df","Type":"ContainerDied","Data":"746f459723ac30640b2873c5dce961b30b6397f1e9a9875ec09333b37ca951ca"} Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.138588 4947 scope.go:117] "RemoveContainer" containerID="f7bda5b91e9919c7430bc8360e444d25515cab851b10eefd33d0b2c98ed8f764" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.138761 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.145206 4947 generic.go:334] "Generic (PLEG): container finished" podID="a05c53bc-13be-4dac-b112-11f05a9e5e64" containerID="c1361038662985e5b390ba5ddb3eb2331e23a0aef8af1d8f6b49870d5a02c952" exitCode=0 Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.145412 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a05c53bc-13be-4dac-b112-11f05a9e5e64","Type":"ContainerDied","Data":"c1361038662985e5b390ba5ddb3eb2331e23a0aef8af1d8f6b49870d5a02c952"} Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.167804 4947 scope.go:117] "RemoveContainer" containerID="fd3a7ddd942b0bf7c993d8ec4b13420f86e53176d46ec11df8e6fb68de4c06e1" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.192537 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.213830 4947 scope.go:117] "RemoveContainer" 
containerID="1a5ff4a6cb09f9a251e9389d617500670f0e52741fc8c1ae750ba2e5427dcd2c" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.218931 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.264925 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:03 crc kubenswrapper[4947]: E1203 07:11:03.265410 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="ceilometer-central-agent" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.265432 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="ceilometer-central-agent" Dec 03 07:11:03 crc kubenswrapper[4947]: E1203 07:11:03.265456 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="proxy-httpd" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.265465 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="proxy-httpd" Dec 03 07:11:03 crc kubenswrapper[4947]: E1203 07:11:03.265506 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="ceilometer-notification-agent" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.265515 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="ceilometer-notification-agent" Dec 03 07:11:03 crc kubenswrapper[4947]: E1203 07:11:03.265532 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="sg-core" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.265540 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="sg-core" Dec 03 07:11:03 
crc kubenswrapper[4947]: I1203 07:11:03.265762 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="ceilometer-central-agent" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.265782 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="sg-core" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.265806 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="proxy-httpd" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.265824 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="86094256-7d4a-4efa-a139-6307919b49df" containerName="ceilometer-notification-agent" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.267869 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.275269 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.275678 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.276078 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.276165 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.277824 4947 scope.go:117] "RemoveContainer" containerID="4dd21e18ea9b766d173cac4a7cca8ab19e1f895a693d2aeb8930b570ca37b679" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.317275 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.317675 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-config-data\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.317823 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.317870 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-log-httpd\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.317932 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-scripts\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.317958 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pr2b\" (UniqueName: \"kubernetes.io/projected/de1bec54-7da3-41be-978e-2c1320f17696-kube-api-access-8pr2b\") pod \"ceilometer-0\" (UID: 
\"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.318046 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.318147 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-run-httpd\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.420001 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.420067 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-log-httpd\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.420132 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-scripts\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.420157 4947 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-8pr2b\" (UniqueName: \"kubernetes.io/projected/de1bec54-7da3-41be-978e-2c1320f17696-kube-api-access-8pr2b\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.420218 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.420682 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-run-httpd\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.420959 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-log-httpd\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.421122 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-run-httpd\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.420717 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 
07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.421233 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-config-data\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.426434 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.426982 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-config-data\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.428271 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.429998 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.432235 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-scripts\") pod \"ceilometer-0\" 
(UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.443331 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pr2b\" (UniqueName: \"kubernetes.io/projected/de1bec54-7da3-41be-978e-2c1320f17696-kube-api-access-8pr2b\") pod \"ceilometer-0\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.602482 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.748143 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.826526 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-combined-ca-bundle\") pod \"a05c53bc-13be-4dac-b112-11f05a9e5e64\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.826603 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-config-data\") pod \"a05c53bc-13be-4dac-b112-11f05a9e5e64\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.826745 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtdb8\" (UniqueName: \"kubernetes.io/projected/a05c53bc-13be-4dac-b112-11f05a9e5e64-kube-api-access-wtdb8\") pod \"a05c53bc-13be-4dac-b112-11f05a9e5e64\" (UID: \"a05c53bc-13be-4dac-b112-11f05a9e5e64\") " Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.832643 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a05c53bc-13be-4dac-b112-11f05a9e5e64-kube-api-access-wtdb8" (OuterVolumeSpecName: "kube-api-access-wtdb8") pod "a05c53bc-13be-4dac-b112-11f05a9e5e64" (UID: "a05c53bc-13be-4dac-b112-11f05a9e5e64"). InnerVolumeSpecName "kube-api-access-wtdb8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.852989 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-config-data" (OuterVolumeSpecName: "config-data") pod "a05c53bc-13be-4dac-b112-11f05a9e5e64" (UID: "a05c53bc-13be-4dac-b112-11f05a9e5e64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.870474 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a05c53bc-13be-4dac-b112-11f05a9e5e64" (UID: "a05c53bc-13be-4dac-b112-11f05a9e5e64"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.929111 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtdb8\" (UniqueName: \"kubernetes.io/projected/a05c53bc-13be-4dac-b112-11f05a9e5e64-kube-api-access-wtdb8\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.929141 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:03 crc kubenswrapper[4947]: I1203 07:11:03.929151 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a05c53bc-13be-4dac-b112-11f05a9e5e64-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:04 crc kubenswrapper[4947]: W1203 07:11:04.065662 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde1bec54_7da3_41be_978e_2c1320f17696.slice/crio-3c217a3aa79b3611329f731b091c6cc55beb28916aa11945d69960bb6abef16c WatchSource:0}: Error finding container 3c217a3aa79b3611329f731b091c6cc55beb28916aa11945d69960bb6abef16c: Status 404 returned error can't find the container with id 3c217a3aa79b3611329f731b091c6cc55beb28916aa11945d69960bb6abef16c Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.070600 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.156577 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerStarted","Data":"3c217a3aa79b3611329f731b091c6cc55beb28916aa11945d69960bb6abef16c"} Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.158763 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"a05c53bc-13be-4dac-b112-11f05a9e5e64","Type":"ContainerDied","Data":"b5033e057d24c1ec8ca5579c19beceab1a3ddd4dea9075f56d4983bf302231d2"} Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.159057 4947 scope.go:117] "RemoveContainer" containerID="c1361038662985e5b390ba5ddb3eb2331e23a0aef8af1d8f6b49870d5a02c952" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.158785 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.164577 4947 generic.go:334] "Generic (PLEG): container finished" podID="6079f692-0780-41f6-a551-399974561bd3" containerID="4dd4ebee4948d69a2730257fd4f989c2a863bba9bdfac93199d0f7632b8b35c5" exitCode=0 Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.164615 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6079f692-0780-41f6-a551-399974561bd3","Type":"ContainerDied","Data":"4dd4ebee4948d69a2730257fd4f989c2a863bba9bdfac93199d0f7632b8b35c5"} Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.202167 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.210033 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.218192 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:04 crc kubenswrapper[4947]: E1203 07:11:04.218655 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a05c53bc-13be-4dac-b112-11f05a9e5e64" containerName="nova-scheduler-scheduler" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.218672 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a05c53bc-13be-4dac-b112-11f05a9e5e64" containerName="nova-scheduler-scheduler" Dec 03 07:11:04 crc kubenswrapper[4947]: 
I1203 07:11:04.218918 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a05c53bc-13be-4dac-b112-11f05a9e5e64" containerName="nova-scheduler-scheduler" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.219550 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.221809 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.249076 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.345093 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-config-data\") pod \"nova-scheduler-0\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.345193 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg8mw\" (UniqueName: \"kubernetes.io/projected/19fed18a-18d1-43de-9e4c-12dd5973261f-kube-api-access-gg8mw\") pod \"nova-scheduler-0\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.345252 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.446498 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-config-data\") pod \"nova-scheduler-0\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.446597 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg8mw\" (UniqueName: \"kubernetes.io/projected/19fed18a-18d1-43de-9e4c-12dd5973261f-kube-api-access-gg8mw\") pod \"nova-scheduler-0\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.446651 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.455887 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.455926 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-config-data\") pod \"nova-scheduler-0\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.468284 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg8mw\" (UniqueName: \"kubernetes.io/projected/19fed18a-18d1-43de-9e4c-12dd5973261f-kube-api-access-gg8mw\") pod \"nova-scheduler-0\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " 
pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.553345 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.564414 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.650241 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6079f692-0780-41f6-a551-399974561bd3-logs\") pod \"6079f692-0780-41f6-a551-399974561bd3\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.650314 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-combined-ca-bundle\") pod \"6079f692-0780-41f6-a551-399974561bd3\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.650455 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsvxf\" (UniqueName: \"kubernetes.io/projected/6079f692-0780-41f6-a551-399974561bd3-kube-api-access-gsvxf\") pod \"6079f692-0780-41f6-a551-399974561bd3\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.650483 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-config-data\") pod \"6079f692-0780-41f6-a551-399974561bd3\" (UID: \"6079f692-0780-41f6-a551-399974561bd3\") " Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.650903 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6079f692-0780-41f6-a551-399974561bd3-logs" 
(OuterVolumeSpecName: "logs") pod "6079f692-0780-41f6-a551-399974561bd3" (UID: "6079f692-0780-41f6-a551-399974561bd3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.654920 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6079f692-0780-41f6-a551-399974561bd3-kube-api-access-gsvxf" (OuterVolumeSpecName: "kube-api-access-gsvxf") pod "6079f692-0780-41f6-a551-399974561bd3" (UID: "6079f692-0780-41f6-a551-399974561bd3"). InnerVolumeSpecName "kube-api-access-gsvxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.677996 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6079f692-0780-41f6-a551-399974561bd3" (UID: "6079f692-0780-41f6-a551-399974561bd3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.679628 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-config-data" (OuterVolumeSpecName: "config-data") pod "6079f692-0780-41f6-a551-399974561bd3" (UID: "6079f692-0780-41f6-a551-399974561bd3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.753342 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsvxf\" (UniqueName: \"kubernetes.io/projected/6079f692-0780-41f6-a551-399974561bd3-kube-api-access-gsvxf\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.753371 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.753380 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6079f692-0780-41f6-a551-399974561bd3-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:04 crc kubenswrapper[4947]: I1203 07:11:04.753389 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6079f692-0780-41f6-a551-399974561bd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.048472 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:05 crc kubenswrapper[4947]: W1203 07:11:05.058357 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19fed18a_18d1_43de_9e4c_12dd5973261f.slice/crio-fe95f89dee5449a8b8fa1a8c90670e587b42387ae3b6e6c99ee5e28f5389dcbf WatchSource:0}: Error finding container fe95f89dee5449a8b8fa1a8c90670e587b42387ae3b6e6c99ee5e28f5389dcbf: Status 404 returned error can't find the container with id fe95f89dee5449a8b8fa1a8c90670e587b42387ae3b6e6c99ee5e28f5389dcbf Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.096459 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86094256-7d4a-4efa-a139-6307919b49df" 
path="/var/lib/kubelet/pods/86094256-7d4a-4efa-a139-6307919b49df/volumes" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.097387 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a05c53bc-13be-4dac-b112-11f05a9e5e64" path="/var/lib/kubelet/pods/a05c53bc-13be-4dac-b112-11f05a9e5e64/volumes" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.173252 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"19fed18a-18d1-43de-9e4c-12dd5973261f","Type":"ContainerStarted","Data":"fe95f89dee5449a8b8fa1a8c90670e587b42387ae3b6e6c99ee5e28f5389dcbf"} Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.176224 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.176223 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6079f692-0780-41f6-a551-399974561bd3","Type":"ContainerDied","Data":"72c0c7750517c120614d966ae2db9c5e3eaf595a55140dc4d4180bcf2492edcc"} Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.176272 4947 scope.go:117] "RemoveContainer" containerID="4dd4ebee4948d69a2730257fd4f989c2a863bba9bdfac93199d0f7632b8b35c5" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.215864 4947 scope.go:117] "RemoveContainer" containerID="7744fa6687c156278f7c710a5cc4ece059686b1a297f3eb72a825c051b3e8c23" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.217947 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.255525 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.274748 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:05 crc kubenswrapper[4947]: E1203 07:11:05.275096 4947 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-api" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.275116 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-api" Dec 03 07:11:05 crc kubenswrapper[4947]: E1203 07:11:05.275124 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-log" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.275130 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-log" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.275419 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-log" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.275452 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6079f692-0780-41f6-a551-399974561bd3" containerName="nova-api-api" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.277262 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.280536 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.283160 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.414133 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.467037 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.467097 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-config-data\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.467139 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c5fc6f-a718-4732-ac68-9a615f00ae41-logs\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.467279 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk7cc\" (UniqueName: \"kubernetes.io/projected/c2c5fc6f-a718-4732-ac68-9a615f00ae41-kube-api-access-zk7cc\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " 
pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.518617 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.518668 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.568970 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.569580 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-config-data\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.569741 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c5fc6f-a718-4732-ac68-9a615f00ae41-logs\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.570182 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c5fc6f-a718-4732-ac68-9a615f00ae41-logs\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.570626 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk7cc\" (UniqueName: \"kubernetes.io/projected/c2c5fc6f-a718-4732-ac68-9a615f00ae41-kube-api-access-zk7cc\") pod 
\"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.582297 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-config-data\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.583585 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.588634 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk7cc\" (UniqueName: \"kubernetes.io/projected/c2c5fc6f-a718-4732-ac68-9a615f00ae41-kube-api-access-zk7cc\") pod \"nova-api-0\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " pod="openstack/nova-api-0" Dec 03 07:11:05 crc kubenswrapper[4947]: I1203 07:11:05.743540 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:06 crc kubenswrapper[4947]: I1203 07:11:06.186211 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:06 crc kubenswrapper[4947]: I1203 07:11:06.197154 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"19fed18a-18d1-43de-9e4c-12dd5973261f","Type":"ContainerStarted","Data":"74a0ff0d36c6b7b16656e7c33d29da7739023bf5d90f7d5f48f54ec9614687a6"} Dec 03 07:11:06 crc kubenswrapper[4947]: I1203 07:11:06.217048 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.217024261 podStartE2EDuration="2.217024261s" podCreationTimestamp="2025-12-03 07:11:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:06.212947969 +0000 UTC m=+1327.473902395" watchObservedRunningTime="2025-12-03 07:11:06.217024261 +0000 UTC m=+1327.477978697" Dec 03 07:11:07 crc kubenswrapper[4947]: I1203 07:11:07.102117 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6079f692-0780-41f6-a551-399974561bd3" path="/var/lib/kubelet/pods/6079f692-0780-41f6-a551-399974561bd3/volumes" Dec 03 07:11:07 crc kubenswrapper[4947]: I1203 07:11:07.207925 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2c5fc6f-a718-4732-ac68-9a615f00ae41","Type":"ContainerStarted","Data":"37e5173b78d3e090c6810c141ef73dcd9ea3eea79f5a9b4bef9a6e2c28992d44"} Dec 03 07:11:07 crc kubenswrapper[4947]: I1203 07:11:07.207978 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2c5fc6f-a718-4732-ac68-9a615f00ae41","Type":"ContainerStarted","Data":"1eed95fac85f9c373ae37147274997e62d9e99ca146385b5eb301eca4fe27913"} Dec 03 07:11:07 crc kubenswrapper[4947]: I1203 07:11:07.210674 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerStarted","Data":"11fd85caff5b752f632ced5bbcf8ee310de084b8ad7f078cf535c5808caca885"} Dec 03 07:11:08 crc kubenswrapper[4947]: I1203 07:11:08.235528 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerStarted","Data":"5e323d623f28108c32c9e48e5416f9570afdae02ae011b7879e21005051af10e"} Dec 03 07:11:08 crc kubenswrapper[4947]: I1203 07:11:08.241511 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2c5fc6f-a718-4732-ac68-9a615f00ae41","Type":"ContainerStarted","Data":"bb4180d81a2515133df4ec416520faecc286d9ee4e0f3d6a52cad3d89fda6049"} Dec 03 07:11:08 crc kubenswrapper[4947]: I1203 07:11:08.265779 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.265757114 podStartE2EDuration="3.265757114s" podCreationTimestamp="2025-12-03 07:11:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:08.258570327 +0000 UTC m=+1329.519524763" watchObservedRunningTime="2025-12-03 07:11:08.265757114 +0000 UTC m=+1329.526711550" Dec 03 07:11:09 crc kubenswrapper[4947]: I1203 07:11:09.257021 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerStarted","Data":"695ddbf2412109c51d1380548208b0db7dc07236da4dbfc486b84de170ffcc8a"} Dec 03 07:11:09 crc kubenswrapper[4947]: I1203 07:11:09.494661 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 07:11:09 crc kubenswrapper[4947]: I1203 07:11:09.565420 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 
07:11:10 crc kubenswrapper[4947]: I1203 07:11:10.268475 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerStarted","Data":"da2dfddfcc237bf188c2cef01436afbbd19c4b289f4f37f3f942330161aa6709"} Dec 03 07:11:10 crc kubenswrapper[4947]: I1203 07:11:10.269195 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:11:10 crc kubenswrapper[4947]: I1203 07:11:10.296699 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6589502189999998 podStartE2EDuration="7.296678047s" podCreationTimestamp="2025-12-03 07:11:03 +0000 UTC" firstStartedPulling="2025-12-03 07:11:04.06852961 +0000 UTC m=+1325.329484036" lastFinishedPulling="2025-12-03 07:11:09.706257438 +0000 UTC m=+1330.967211864" observedRunningTime="2025-12-03 07:11:10.288353078 +0000 UTC m=+1331.549307514" watchObservedRunningTime="2025-12-03 07:11:10.296678047 +0000 UTC m=+1331.557632473" Dec 03 07:11:10 crc kubenswrapper[4947]: I1203 07:11:10.533998 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 07:11:10 crc kubenswrapper[4947]: I1203 07:11:10.534291 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 07:11:11 crc kubenswrapper[4947]: I1203 07:11:11.545686 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:11:11 crc kubenswrapper[4947]: I1203 07:11:11.546426 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" 
containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:11:14 crc kubenswrapper[4947]: I1203 07:11:14.565687 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 07:11:14 crc kubenswrapper[4947]: I1203 07:11:14.604522 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 07:11:15 crc kubenswrapper[4947]: I1203 07:11:15.363678 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 07:11:15 crc kubenswrapper[4947]: I1203 07:11:15.744725 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:11:15 crc kubenswrapper[4947]: I1203 07:11:15.744813 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:11:16 crc kubenswrapper[4947]: I1203 07:11:16.827798 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:11:16 crc kubenswrapper[4947]: I1203 07:11:16.827861 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:11:20 crc kubenswrapper[4947]: I1203 07:11:20.551159 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 07:11:20 crc kubenswrapper[4947]: I1203 07:11:20.553342 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 07:11:20 crc kubenswrapper[4947]: I1203 07:11:20.563155 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 07:11:21 crc kubenswrapper[4947]: I1203 07:11:21.398981 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.400703 4947 generic.go:334] "Generic (PLEG): container finished" podID="39f17cf5-7ef1-4773-818d-02f42a6ae7fe" containerID="00aa7a80e06314abdb440924c5edafb9eaef6ba88ad895856328a1ead1571418" exitCode=137 Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.401605 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"39f17cf5-7ef1-4773-818d-02f42a6ae7fe","Type":"ContainerDied","Data":"00aa7a80e06314abdb440924c5edafb9eaef6ba88ad895856328a1ead1571418"} Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.401651 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"39f17cf5-7ef1-4773-818d-02f42a6ae7fe","Type":"ContainerDied","Data":"d091e0f001c42b13a3b8150bc4b33ecb88ddb74f0efb1878d59c811f4cc52597"} Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.401662 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d091e0f001c42b13a3b8150bc4b33ecb88ddb74f0efb1878d59c811f4cc52597" Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.462480 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.523208 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr7dc\" (UniqueName: \"kubernetes.io/projected/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-kube-api-access-rr7dc\") pod \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.523845 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-config-data\") pod \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.523973 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-combined-ca-bundle\") pod \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\" (UID: \"39f17cf5-7ef1-4773-818d-02f42a6ae7fe\") " Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.533978 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-kube-api-access-rr7dc" (OuterVolumeSpecName: "kube-api-access-rr7dc") pod "39f17cf5-7ef1-4773-818d-02f42a6ae7fe" (UID: "39f17cf5-7ef1-4773-818d-02f42a6ae7fe"). InnerVolumeSpecName "kube-api-access-rr7dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.553637 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-config-data" (OuterVolumeSpecName: "config-data") pod "39f17cf5-7ef1-4773-818d-02f42a6ae7fe" (UID: "39f17cf5-7ef1-4773-818d-02f42a6ae7fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.559610 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "39f17cf5-7ef1-4773-818d-02f42a6ae7fe" (UID: "39f17cf5-7ef1-4773-818d-02f42a6ae7fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.625190 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.625225 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:22 crc kubenswrapper[4947]: I1203 07:11:22.625237 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr7dc\" (UniqueName: \"kubernetes.io/projected/39f17cf5-7ef1-4773-818d-02f42a6ae7fe-kube-api-access-rr7dc\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.409906 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.457043 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.478486 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.494356 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:11:23 crc kubenswrapper[4947]: E1203 07:11:23.494739 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f17cf5-7ef1-4773-818d-02f42a6ae7fe" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.494755 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f17cf5-7ef1-4773-818d-02f42a6ae7fe" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.494960 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f17cf5-7ef1-4773-818d-02f42a6ae7fe" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.495531 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.497447 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.497881 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.498136 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.520673 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.649056 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.649430 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.649464 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmjvz\" (UniqueName: \"kubernetes.io/projected/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-kube-api-access-fmjvz\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 
03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.649537 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.649789 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.752659 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.752816 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.752880 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 
07:11:23.752912 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmjvz\" (UniqueName: \"kubernetes.io/projected/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-kube-api-access-fmjvz\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.752993 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.758562 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.758773 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.758789 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.760426 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.776065 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmjvz\" (UniqueName: \"kubernetes.io/projected/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-kube-api-access-fmjvz\") pod \"nova-cell1-novncproxy-0\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:23 crc kubenswrapper[4947]: I1203 07:11:23.816713 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:24 crc kubenswrapper[4947]: I1203 07:11:24.251970 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:11:24 crc kubenswrapper[4947]: I1203 07:11:24.425451 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e80985eb-c6e0-4ffc-9b98-b1c92be266eb","Type":"ContainerStarted","Data":"2de63a9f7da24cf13720666eadfee51c31ec63a3d57b90de0b7a6a039cfb7ff6"} Dec 03 07:11:25 crc kubenswrapper[4947]: I1203 07:11:25.092556 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39f17cf5-7ef1-4773-818d-02f42a6ae7fe" path="/var/lib/kubelet/pods/39f17cf5-7ef1-4773-818d-02f42a6ae7fe/volumes" Dec 03 07:11:25 crc kubenswrapper[4947]: I1203 07:11:25.435586 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e80985eb-c6e0-4ffc-9b98-b1c92be266eb","Type":"ContainerStarted","Data":"7ddbee771d7d031c4a9d787606308301ec00a05893e9401d6d27e942db47b236"} Dec 03 07:11:25 crc kubenswrapper[4947]: I1203 07:11:25.457757 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.457734102 podStartE2EDuration="2.457734102s" podCreationTimestamp="2025-12-03 07:11:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:25.45327746 +0000 UTC m=+1346.714231896" watchObservedRunningTime="2025-12-03 07:11:25.457734102 +0000 UTC m=+1346.718688528" Dec 03 07:11:25 crc kubenswrapper[4947]: I1203 07:11:25.748479 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 07:11:25 crc kubenswrapper[4947]: I1203 07:11:25.749023 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 07:11:25 crc kubenswrapper[4947]: I1203 07:11:25.754584 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 07:11:25 crc kubenswrapper[4947]: I1203 07:11:25.758877 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.445955 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.452224 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.665390 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-6xgx7"] Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.667296 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.679799 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-6xgx7"] Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.711244 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5thm\" (UniqueName: \"kubernetes.io/projected/b4a227e4-8c2a-4880-9944-877640627cd0-kube-api-access-p5thm\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.711685 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.711875 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.711984 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-svc\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.712519 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.712676 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-config\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.814559 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.814604 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-svc\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.814623 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.814640 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-config\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.814729 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5thm\" (UniqueName: \"kubernetes.io/projected/b4a227e4-8c2a-4880-9944-877640627cd0-kube-api-access-p5thm\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.814773 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.815662 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.816143 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.816754 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-svc\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.816986 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-config\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.817096 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:26 crc kubenswrapper[4947]: I1203 07:11:26.837550 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5thm\" (UniqueName: \"kubernetes.io/projected/b4a227e4-8c2a-4880-9944-877640627cd0-kube-api-access-p5thm\") pod \"dnsmasq-dns-5c9cbcb645-6xgx7\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:27 crc kubenswrapper[4947]: I1203 07:11:27.004428 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:27 crc kubenswrapper[4947]: I1203 07:11:27.534628 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-6xgx7"] Dec 03 07:11:27 crc kubenswrapper[4947]: W1203 07:11:27.535241 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4a227e4_8c2a_4880_9944_877640627cd0.slice/crio-c5bae347e6279d9024cb5005defb36680899b4dc6be37e8fdddd2e0da6c64681 WatchSource:0}: Error finding container c5bae347e6279d9024cb5005defb36680899b4dc6be37e8fdddd2e0da6c64681: Status 404 returned error can't find the container with id c5bae347e6279d9024cb5005defb36680899b4dc6be37e8fdddd2e0da6c64681 Dec 03 07:11:28 crc kubenswrapper[4947]: I1203 07:11:28.465746 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" event={"ID":"b4a227e4-8c2a-4880-9944-877640627cd0","Type":"ContainerDied","Data":"9903cf2e6781315a233e9ee018d5a8e37a88e216dc112a5608259f1c8c2f8932"} Dec 03 07:11:28 crc kubenswrapper[4947]: I1203 07:11:28.466324 4947 generic.go:334] "Generic (PLEG): container finished" podID="b4a227e4-8c2a-4880-9944-877640627cd0" containerID="9903cf2e6781315a233e9ee018d5a8e37a88e216dc112a5608259f1c8c2f8932" exitCode=0 Dec 03 07:11:28 crc kubenswrapper[4947]: I1203 07:11:28.466524 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" event={"ID":"b4a227e4-8c2a-4880-9944-877640627cd0","Type":"ContainerStarted","Data":"c5bae347e6279d9024cb5005defb36680899b4dc6be37e8fdddd2e0da6c64681"} Dec 03 07:11:28 crc kubenswrapper[4947]: I1203 07:11:28.817847 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.002724 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:29 crc 
kubenswrapper[4947]: I1203 07:11:29.003241 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="ceilometer-central-agent" containerID="cri-o://11fd85caff5b752f632ced5bbcf8ee310de084b8ad7f078cf535c5808caca885" gracePeriod=30 Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.003359 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="ceilometer-notification-agent" containerID="cri-o://5e323d623f28108c32c9e48e5416f9570afdae02ae011b7879e21005051af10e" gracePeriod=30 Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.003294 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="sg-core" containerID="cri-o://695ddbf2412109c51d1380548208b0db7dc07236da4dbfc486b84de170ffcc8a" gracePeriod=30 Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.003324 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="proxy-httpd" containerID="cri-o://da2dfddfcc237bf188c2cef01436afbbd19c4b289f4f37f3f942330161aa6709" gracePeriod=30 Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.011235 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.194:3000/\": EOF" Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.272350 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.486075 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" 
event={"ID":"b4a227e4-8c2a-4880-9944-877640627cd0","Type":"ContainerStarted","Data":"cdbab280e2ed4809a2997438db415a2fa098c89060067db08594d46673b283c3"} Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.486178 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.496388 4947 generic.go:334] "Generic (PLEG): container finished" podID="de1bec54-7da3-41be-978e-2c1320f17696" containerID="da2dfddfcc237bf188c2cef01436afbbd19c4b289f4f37f3f942330161aa6709" exitCode=0 Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.496426 4947 generic.go:334] "Generic (PLEG): container finished" podID="de1bec54-7da3-41be-978e-2c1320f17696" containerID="695ddbf2412109c51d1380548208b0db7dc07236da4dbfc486b84de170ffcc8a" exitCode=2 Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.496436 4947 generic.go:334] "Generic (PLEG): container finished" podID="de1bec54-7da3-41be-978e-2c1320f17696" containerID="11fd85caff5b752f632ced5bbcf8ee310de084b8ad7f078cf535c5808caca885" exitCode=0 Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.496440 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerDied","Data":"da2dfddfcc237bf188c2cef01436afbbd19c4b289f4f37f3f942330161aa6709"} Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.496597 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerDied","Data":"695ddbf2412109c51d1380548208b0db7dc07236da4dbfc486b84de170ffcc8a"} Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.496610 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerDied","Data":"11fd85caff5b752f632ced5bbcf8ee310de084b8ad7f078cf535c5808caca885"} Dec 03 
07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.496682 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-log" containerID="cri-o://37e5173b78d3e090c6810c141ef73dcd9ea3eea79f5a9b4bef9a6e2c28992d44" gracePeriod=30 Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.496721 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-api" containerID="cri-o://bb4180d81a2515133df4ec416520faecc286d9ee4e0f3d6a52cad3d89fda6049" gracePeriod=30 Dec 03 07:11:29 crc kubenswrapper[4947]: I1203 07:11:29.517104 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" podStartSLOduration=3.5170854179999997 podStartE2EDuration="3.517085418s" podCreationTimestamp="2025-12-03 07:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:29.50408903 +0000 UTC m=+1350.765043476" watchObservedRunningTime="2025-12-03 07:11:29.517085418 +0000 UTC m=+1350.778039834" Dec 03 07:11:30 crc kubenswrapper[4947]: I1203 07:11:30.512446 4947 generic.go:334] "Generic (PLEG): container finished" podID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerID="37e5173b78d3e090c6810c141ef73dcd9ea3eea79f5a9b4bef9a6e2c28992d44" exitCode=143 Dec 03 07:11:30 crc kubenswrapper[4947]: I1203 07:11:30.512537 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2c5fc6f-a718-4732-ac68-9a615f00ae41","Type":"ContainerDied","Data":"37e5173b78d3e090c6810c141ef73dcd9ea3eea79f5a9b4bef9a6e2c28992d44"} Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.543654 4947 generic.go:334] "Generic (PLEG): container finished" podID="de1bec54-7da3-41be-978e-2c1320f17696" 
containerID="5e323d623f28108c32c9e48e5416f9570afdae02ae011b7879e21005051af10e" exitCode=0 Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.544020 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerDied","Data":"5e323d623f28108c32c9e48e5416f9570afdae02ae011b7879e21005051af10e"} Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.550827 4947 generic.go:334] "Generic (PLEG): container finished" podID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerID="bb4180d81a2515133df4ec416520faecc286d9ee4e0f3d6a52cad3d89fda6049" exitCode=0 Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.550850 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2c5fc6f-a718-4732-ac68-9a615f00ae41","Type":"ContainerDied","Data":"bb4180d81a2515133df4ec416520faecc286d9ee4e0f3d6a52cad3d89fda6049"} Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.604001 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.194:3000/\": dial tcp 10.217.0.194:3000: connect: connection refused" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.695791 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.816907 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.843527 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.864077 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-config-data\") pod \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.864126 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c5fc6f-a718-4732-ac68-9a615f00ae41-logs\") pod \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.864189 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk7cc\" (UniqueName: \"kubernetes.io/projected/c2c5fc6f-a718-4732-ac68-9a615f00ae41-kube-api-access-zk7cc\") pod \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.864241 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-combined-ca-bundle\") pod \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\" (UID: \"c2c5fc6f-a718-4732-ac68-9a615f00ae41\") " Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.865859 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c2c5fc6f-a718-4732-ac68-9a615f00ae41-logs" (OuterVolumeSpecName: "logs") pod "c2c5fc6f-a718-4732-ac68-9a615f00ae41" (UID: "c2c5fc6f-a718-4732-ac68-9a615f00ae41"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.870771 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c5fc6f-a718-4732-ac68-9a615f00ae41-kube-api-access-zk7cc" (OuterVolumeSpecName: "kube-api-access-zk7cc") pod "c2c5fc6f-a718-4732-ac68-9a615f00ae41" (UID: "c2c5fc6f-a718-4732-ac68-9a615f00ae41"). InnerVolumeSpecName "kube-api-access-zk7cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.902450 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2c5fc6f-a718-4732-ac68-9a615f00ae41" (UID: "c2c5fc6f-a718-4732-ac68-9a615f00ae41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.914030 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-config-data" (OuterVolumeSpecName: "config-data") pod "c2c5fc6f-a718-4732-ac68-9a615f00ae41" (UID: "c2c5fc6f-a718-4732-ac68-9a615f00ae41"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.967288 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.967331 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c2c5fc6f-a718-4732-ac68-9a615f00ae41-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.967341 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk7cc\" (UniqueName: \"kubernetes.io/projected/c2c5fc6f-a718-4732-ac68-9a615f00ae41-kube-api-access-zk7cc\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.967349 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2c5fc6f-a718-4732-ac68-9a615f00ae41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:33 crc kubenswrapper[4947]: I1203 07:11:33.988509 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.068000 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-ceilometer-tls-certs\") pod \"de1bec54-7da3-41be-978e-2c1320f17696\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.068065 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-config-data\") pod \"de1bec54-7da3-41be-978e-2c1320f17696\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.068097 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pr2b\" (UniqueName: \"kubernetes.io/projected/de1bec54-7da3-41be-978e-2c1320f17696-kube-api-access-8pr2b\") pod \"de1bec54-7da3-41be-978e-2c1320f17696\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.068198 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-sg-core-conf-yaml\") pod \"de1bec54-7da3-41be-978e-2c1320f17696\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.068233 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-scripts\") pod \"de1bec54-7da3-41be-978e-2c1320f17696\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.068276 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-run-httpd\") pod \"de1bec54-7da3-41be-978e-2c1320f17696\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.068342 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-log-httpd\") pod \"de1bec54-7da3-41be-978e-2c1320f17696\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.068419 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-combined-ca-bundle\") pod \"de1bec54-7da3-41be-978e-2c1320f17696\" (UID: \"de1bec54-7da3-41be-978e-2c1320f17696\") " Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.068904 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de1bec54-7da3-41be-978e-2c1320f17696" (UID: "de1bec54-7da3-41be-978e-2c1320f17696"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.069090 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de1bec54-7da3-41be-978e-2c1320f17696" (UID: "de1bec54-7da3-41be-978e-2c1320f17696"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.071911 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-scripts" (OuterVolumeSpecName: "scripts") pod "de1bec54-7da3-41be-978e-2c1320f17696" (UID: "de1bec54-7da3-41be-978e-2c1320f17696"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.072650 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1bec54-7da3-41be-978e-2c1320f17696-kube-api-access-8pr2b" (OuterVolumeSpecName: "kube-api-access-8pr2b") pod "de1bec54-7da3-41be-978e-2c1320f17696" (UID: "de1bec54-7da3-41be-978e-2c1320f17696"). InnerVolumeSpecName "kube-api-access-8pr2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.103268 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de1bec54-7da3-41be-978e-2c1320f17696" (UID: "de1bec54-7da3-41be-978e-2c1320f17696"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.118231 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "de1bec54-7da3-41be-978e-2c1320f17696" (UID: "de1bec54-7da3-41be-978e-2c1320f17696"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.152735 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de1bec54-7da3-41be-978e-2c1320f17696" (UID: "de1bec54-7da3-41be-978e-2c1320f17696"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.174000 4947 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.174034 4947 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de1bec54-7da3-41be-978e-2c1320f17696-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.174049 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.174073 4947 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.174086 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pr2b\" (UniqueName: \"kubernetes.io/projected/de1bec54-7da3-41be-978e-2c1320f17696-kube-api-access-8pr2b\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.174096 4947 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.174110 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.182167 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-config-data" (OuterVolumeSpecName: "config-data") pod "de1bec54-7da3-41be-978e-2c1320f17696" (UID: "de1bec54-7da3-41be-978e-2c1320f17696"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.276514 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de1bec54-7da3-41be-978e-2c1320f17696-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.560627 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c2c5fc6f-a718-4732-ac68-9a615f00ae41","Type":"ContainerDied","Data":"1eed95fac85f9c373ae37147274997e62d9e99ca146385b5eb301eca4fe27913"} Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.561084 4947 scope.go:117] "RemoveContainer" containerID="bb4180d81a2515133df4ec416520faecc286d9ee4e0f3d6a52cad3d89fda6049" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.560793 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.565608 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.565659 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de1bec54-7da3-41be-978e-2c1320f17696","Type":"ContainerDied","Data":"3c217a3aa79b3611329f731b091c6cc55beb28916aa11945d69960bb6abef16c"} Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.587086 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.614883 4947 scope.go:117] "RemoveContainer" containerID="37e5173b78d3e090c6810c141ef73dcd9ea3eea79f5a9b4bef9a6e2c28992d44" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.660356 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.664825 4947 scope.go:117] "RemoveContainer" containerID="da2dfddfcc237bf188c2cef01436afbbd19c4b289f4f37f3f942330161aa6709" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.674806 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.686140 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:34 crc kubenswrapper[4947]: E1203 07:11:34.686641 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="proxy-httpd" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.686658 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="proxy-httpd" Dec 03 07:11:34 crc kubenswrapper[4947]: E1203 07:11:34.686675 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-api" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.686682 4947 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-api" Dec 03 07:11:34 crc kubenswrapper[4947]: E1203 07:11:34.686714 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="ceilometer-central-agent" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.686722 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="ceilometer-central-agent" Dec 03 07:11:34 crc kubenswrapper[4947]: E1203 07:11:34.686733 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-log" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.686739 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-log" Dec 03 07:11:34 crc kubenswrapper[4947]: E1203 07:11:34.686757 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="ceilometer-notification-agent" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.686763 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="ceilometer-notification-agent" Dec 03 07:11:34 crc kubenswrapper[4947]: E1203 07:11:34.686779 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="sg-core" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.686786 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="sg-core" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.686984 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-log" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.687000 4947 
memory_manager.go:354] "RemoveStaleState removing state" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="ceilometer-central-agent" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.687017 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="ceilometer-notification-agent" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.687033 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" containerName="nova-api-api" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.687053 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="sg-core" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.687068 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1bec54-7da3-41be-978e-2c1320f17696" containerName="proxy-httpd" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.688386 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.690687 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.692744 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.692962 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.693673 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.698426 4947 scope.go:117] "RemoveContainer" containerID="695ddbf2412109c51d1380548208b0db7dc07236da4dbfc486b84de170ffcc8a" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.704775 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.712596 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.741228 4947 scope.go:117] "RemoveContainer" containerID="5e323d623f28108c32c9e48e5416f9570afdae02ae011b7879e21005051af10e" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.768888 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.771144 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.777223 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.777241 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.777280 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.786828 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.787644 4947 scope.go:117] "RemoveContainer" containerID="11fd85caff5b752f632ced5bbcf8ee310de084b8ad7f078cf535c5808caca885" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.799085 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-logs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.799326 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.799450 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 
07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.799667 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-public-tls-certs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.799799 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5kfs\" (UniqueName: \"kubernetes.io/projected/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-kube-api-access-d5kfs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.799889 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-config-data\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.818789 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-dtngb"] Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.819915 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.823817 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.823967 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 07:11:34 crc kubenswrapper[4947]: E1203 07:11:34.835936 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2c5fc6f_a718_4732_ac68_9a615f00ae41.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde1bec54_7da3_41be_978e_2c1320f17696.slice\": RecentStats: unable to find data in memory cache]" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.865004 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dtngb"] Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.901528 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-run-httpd\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.901612 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902539 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-scripts\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902580 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902610 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902630 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-public-tls-certs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902675 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-config-data\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902701 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " 
pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902753 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5kfs\" (UniqueName: \"kubernetes.io/projected/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-kube-api-access-d5kfs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902771 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-config-data\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902826 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902846 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-logs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902865 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-log-httpd\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.902912 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4nhw5\" (UniqueName: \"kubernetes.io/projected/33bdabb7-a612-499f-855c-74da636d845a-kube-api-access-4nhw5\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.903383 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-logs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.908176 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.908347 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-config-data\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.908749 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.908982 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-public-tls-certs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:34 crc kubenswrapper[4947]: I1203 07:11:34.919835 
4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5kfs\" (UniqueName: \"kubernetes.io/projected/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-kube-api-access-d5kfs\") pod \"nova-api-0\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " pod="openstack/nova-api-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.004397 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-config-data\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.004831 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.004881 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.004922 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svzf7\" (UniqueName: \"kubernetes.io/projected/a182aa69-32cc-4496-a551-d5e0fafda4af-kube-api-access-svzf7\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.004962 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.005057 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-log-httpd\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.005138 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-scripts\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.005298 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nhw5\" (UniqueName: \"kubernetes.io/projected/33bdabb7-a612-499f-855c-74da636d845a-kube-api-access-4nhw5\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.005358 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-config-data\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.005437 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-run-httpd\") pod \"ceilometer-0\" 
(UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.005458 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-log-httpd\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.005552 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-scripts\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.005591 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.005689 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-run-httpd\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.008370 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.008955 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-config-data\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.009037 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.009678 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.011223 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-scripts\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.025167 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.038011 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nhw5\" (UniqueName: \"kubernetes.io/projected/33bdabb7-a612-499f-855c-74da636d845a-kube-api-access-4nhw5\") pod \"ceilometer-0\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.107344 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.107399 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svzf7\" (UniqueName: \"kubernetes.io/projected/a182aa69-32cc-4496-a551-d5e0fafda4af-kube-api-access-svzf7\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.107445 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-scripts\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.107483 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-config-data\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc 
kubenswrapper[4947]: I1203 07:11:35.111631 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.112945 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-scripts\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.113236 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.114413 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-config-data\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.114468 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c5fc6f-a718-4732-ac68-9a615f00ae41" path="/var/lib/kubelet/pods/c2c5fc6f-a718-4732-ac68-9a615f00ae41/volumes" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.115139 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de1bec54-7da3-41be-978e-2c1320f17696" path="/var/lib/kubelet/pods/de1bec54-7da3-41be-978e-2c1320f17696/volumes" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.126267 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svzf7\" (UniqueName: 
\"kubernetes.io/projected/a182aa69-32cc-4496-a551-d5e0fafda4af-kube-api-access-svzf7\") pod \"nova-cell1-cell-mapping-dtngb\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.182728 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.509823 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.582323 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c","Type":"ContainerStarted","Data":"f91676b474eb4a6002981ad330b162e4ac0ec6992f6951f3deba0ad8e0f185df"} Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.614826 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.626663 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:11:35 crc kubenswrapper[4947]: I1203 07:11:35.778679 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-dtngb"] Dec 03 07:11:36 crc kubenswrapper[4947]: I1203 07:11:36.592574 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerStarted","Data":"f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01"} Dec 03 07:11:36 crc kubenswrapper[4947]: I1203 07:11:36.592994 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerStarted","Data":"5714ddceec7271190cc5133b35c79321785ae793eb33df5212a52ff5283aad95"} Dec 03 07:11:36 crc kubenswrapper[4947]: I1203 07:11:36.594662 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c","Type":"ContainerStarted","Data":"3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b"} Dec 03 07:11:36 crc kubenswrapper[4947]: I1203 07:11:36.594685 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c","Type":"ContainerStarted","Data":"8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec"} Dec 03 07:11:36 crc kubenswrapper[4947]: I1203 07:11:36.597636 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dtngb" event={"ID":"a182aa69-32cc-4496-a551-d5e0fafda4af","Type":"ContainerStarted","Data":"2fcba5d7c876bd3b90e8851cfc63ebb2e3bd539f891a545b8488e8ea2168ca5f"} Dec 03 07:11:36 crc kubenswrapper[4947]: I1203 07:11:36.597662 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dtngb" event={"ID":"a182aa69-32cc-4496-a551-d5e0fafda4af","Type":"ContainerStarted","Data":"0114d2b3c0ac97aa56575f159e496524176d41548f1d0cb26d42d1360e08a9b5"} Dec 03 07:11:36 crc kubenswrapper[4947]: I1203 07:11:36.622913 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.62289391 podStartE2EDuration="2.62289391s" podCreationTimestamp="2025-12-03 07:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:36.61602589 +0000 UTC m=+1357.876980316" watchObservedRunningTime="2025-12-03 07:11:36.62289391 +0000 UTC m=+1357.883848336" Dec 03 07:11:36 crc kubenswrapper[4947]: I1203 07:11:36.648355 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-dtngb" podStartSLOduration=2.648338141 podStartE2EDuration="2.648338141s" podCreationTimestamp="2025-12-03 07:11:34 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:36.642046258 +0000 UTC m=+1357.903000674" watchObservedRunningTime="2025-12-03 07:11:36.648338141 +0000 UTC m=+1357.909292567" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.005666 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.132256 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-mc4tg"] Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.132962 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" podUID="fb3dcc0e-36f7-40d7-97a4-81120af23608" containerName="dnsmasq-dns" containerID="cri-o://47543c62ad6a1a8db1e3df0f70a0cf3357416e4167bbed925cb86aecffd099af" gracePeriod=10 Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.605990 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerStarted","Data":"f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014"} Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.608215 4947 generic.go:334] "Generic (PLEG): container finished" podID="fb3dcc0e-36f7-40d7-97a4-81120af23608" containerID="47543c62ad6a1a8db1e3df0f70a0cf3357416e4167bbed925cb86aecffd099af" exitCode=0 Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.608285 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" event={"ID":"fb3dcc0e-36f7-40d7-97a4-81120af23608","Type":"ContainerDied","Data":"47543c62ad6a1a8db1e3df0f70a0cf3357416e4167bbed925cb86aecffd099af"} Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.608315 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" event={"ID":"fb3dcc0e-36f7-40d7-97a4-81120af23608","Type":"ContainerDied","Data":"af00954a0f0e7ac18b91114b334be8feeb6fd15b01ace78c528ac044195bf51d"} Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.608328 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af00954a0f0e7ac18b91114b334be8feeb6fd15b01ace78c528ac044195bf51d" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.637663 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.778428 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-sb\") pod \"fb3dcc0e-36f7-40d7-97a4-81120af23608\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.778483 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-nb\") pod \"fb3dcc0e-36f7-40d7-97a4-81120af23608\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.778625 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-svc\") pod \"fb3dcc0e-36f7-40d7-97a4-81120af23608\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.778715 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-config\") pod \"fb3dcc0e-36f7-40d7-97a4-81120af23608\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " Dec 03 
07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.778765 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-swift-storage-0\") pod \"fb3dcc0e-36f7-40d7-97a4-81120af23608\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.778813 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8zlp\" (UniqueName: \"kubernetes.io/projected/fb3dcc0e-36f7-40d7-97a4-81120af23608-kube-api-access-x8zlp\") pod \"fb3dcc0e-36f7-40d7-97a4-81120af23608\" (UID: \"fb3dcc0e-36f7-40d7-97a4-81120af23608\") " Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.784417 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3dcc0e-36f7-40d7-97a4-81120af23608-kube-api-access-x8zlp" (OuterVolumeSpecName: "kube-api-access-x8zlp") pod "fb3dcc0e-36f7-40d7-97a4-81120af23608" (UID: "fb3dcc0e-36f7-40d7-97a4-81120af23608"). InnerVolumeSpecName "kube-api-access-x8zlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.826552 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-config" (OuterVolumeSpecName: "config") pod "fb3dcc0e-36f7-40d7-97a4-81120af23608" (UID: "fb3dcc0e-36f7-40d7-97a4-81120af23608"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.828256 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb3dcc0e-36f7-40d7-97a4-81120af23608" (UID: "fb3dcc0e-36f7-40d7-97a4-81120af23608"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.832380 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fb3dcc0e-36f7-40d7-97a4-81120af23608" (UID: "fb3dcc0e-36f7-40d7-97a4-81120af23608"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.848179 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb3dcc0e-36f7-40d7-97a4-81120af23608" (UID: "fb3dcc0e-36f7-40d7-97a4-81120af23608"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.870973 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb3dcc0e-36f7-40d7-97a4-81120af23608" (UID: "fb3dcc0e-36f7-40d7-97a4-81120af23608"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.880684 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.880722 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.880732 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.880741 4947 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.880750 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8zlp\" (UniqueName: \"kubernetes.io/projected/fb3dcc0e-36f7-40d7-97a4-81120af23608-kube-api-access-x8zlp\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:37 crc kubenswrapper[4947]: I1203 07:11:37.880761 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb3dcc0e-36f7-40d7-97a4-81120af23608-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:38 crc kubenswrapper[4947]: I1203 07:11:38.619529 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerStarted","Data":"1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032"} Dec 03 07:11:38 crc kubenswrapper[4947]: 
I1203 07:11:38.619563 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" Dec 03 07:11:38 crc kubenswrapper[4947]: I1203 07:11:38.681740 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-mc4tg"] Dec 03 07:11:38 crc kubenswrapper[4947]: I1203 07:11:38.699485 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c4475fdfc-mc4tg"] Dec 03 07:11:39 crc kubenswrapper[4947]: I1203 07:11:39.113205 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3dcc0e-36f7-40d7-97a4-81120af23608" path="/var/lib/kubelet/pods/fb3dcc0e-36f7-40d7-97a4-81120af23608/volumes" Dec 03 07:11:39 crc kubenswrapper[4947]: I1203 07:11:39.630520 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerStarted","Data":"f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d"} Dec 03 07:11:39 crc kubenswrapper[4947]: I1203 07:11:39.631820 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 07:11:39 crc kubenswrapper[4947]: I1203 07:11:39.662425 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.272463726 podStartE2EDuration="5.662404586s" podCreationTimestamp="2025-12-03 07:11:34 +0000 UTC" firstStartedPulling="2025-12-03 07:11:35.625313892 +0000 UTC m=+1356.886268318" lastFinishedPulling="2025-12-03 07:11:39.015254752 +0000 UTC m=+1360.276209178" observedRunningTime="2025-12-03 07:11:39.656351582 +0000 UTC m=+1360.917306008" watchObservedRunningTime="2025-12-03 07:11:39.662404586 +0000 UTC m=+1360.923359012" Dec 03 07:11:41 crc kubenswrapper[4947]: I1203 07:11:41.648524 4947 generic.go:334] "Generic (PLEG): container finished" podID="a182aa69-32cc-4496-a551-d5e0fafda4af" 
containerID="2fcba5d7c876bd3b90e8851cfc63ebb2e3bd539f891a545b8488e8ea2168ca5f" exitCode=0 Dec 03 07:11:41 crc kubenswrapper[4947]: I1203 07:11:41.648581 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dtngb" event={"ID":"a182aa69-32cc-4496-a551-d5e0fafda4af","Type":"ContainerDied","Data":"2fcba5d7c876bd3b90e8851cfc63ebb2e3bd539f891a545b8488e8ea2168ca5f"} Dec 03 07:11:42 crc kubenswrapper[4947]: I1203 07:11:42.539475 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c4475fdfc-mc4tg" podUID="fb3dcc0e-36f7-40d7-97a4-81120af23608" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.188:5353: i/o timeout" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.096224 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.280980 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-combined-ca-bundle\") pod \"a182aa69-32cc-4496-a551-d5e0fafda4af\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.281289 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svzf7\" (UniqueName: \"kubernetes.io/projected/a182aa69-32cc-4496-a551-d5e0fafda4af-kube-api-access-svzf7\") pod \"a182aa69-32cc-4496-a551-d5e0fafda4af\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.281436 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-scripts\") pod \"a182aa69-32cc-4496-a551-d5e0fafda4af\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " Dec 03 07:11:43 crc 
kubenswrapper[4947]: I1203 07:11:43.281659 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-config-data\") pod \"a182aa69-32cc-4496-a551-d5e0fafda4af\" (UID: \"a182aa69-32cc-4496-a551-d5e0fafda4af\") " Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.288434 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-scripts" (OuterVolumeSpecName: "scripts") pod "a182aa69-32cc-4496-a551-d5e0fafda4af" (UID: "a182aa69-32cc-4496-a551-d5e0fafda4af"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.290799 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a182aa69-32cc-4496-a551-d5e0fafda4af-kube-api-access-svzf7" (OuterVolumeSpecName: "kube-api-access-svzf7") pod "a182aa69-32cc-4496-a551-d5e0fafda4af" (UID: "a182aa69-32cc-4496-a551-d5e0fafda4af"). InnerVolumeSpecName "kube-api-access-svzf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.320210 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-config-data" (OuterVolumeSpecName: "config-data") pod "a182aa69-32cc-4496-a551-d5e0fafda4af" (UID: "a182aa69-32cc-4496-a551-d5e0fafda4af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.337056 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a182aa69-32cc-4496-a551-d5e0fafda4af" (UID: "a182aa69-32cc-4496-a551-d5e0fafda4af"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.383994 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.384036 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svzf7\" (UniqueName: \"kubernetes.io/projected/a182aa69-32cc-4496-a551-d5e0fafda4af-kube-api-access-svzf7\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.384050 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.384060 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a182aa69-32cc-4496-a551-d5e0fafda4af-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.706483 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-dtngb" event={"ID":"a182aa69-32cc-4496-a551-d5e0fafda4af","Type":"ContainerDied","Data":"0114d2b3c0ac97aa56575f159e496524176d41548f1d0cb26d42d1360e08a9b5"} Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.706584 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0114d2b3c0ac97aa56575f159e496524176d41548f1d0cb26d42d1360e08a9b5" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.706680 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-dtngb" Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.877641 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.877911 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="19fed18a-18d1-43de-9e4c-12dd5973261f" containerName="nova-scheduler-scheduler" containerID="cri-o://74a0ff0d36c6b7b16656e7c33d29da7739023bf5d90f7d5f48f54ec9614687a6" gracePeriod=30 Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.894931 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.895203 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerName="nova-api-log" containerID="cri-o://8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec" gracePeriod=30 Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.895306 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerName="nova-api-api" containerID="cri-o://3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b" gracePeriod=30 Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.920654 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.922366 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-metadata" containerID="cri-o://dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787" gracePeriod=30 Dec 03 07:11:43 crc kubenswrapper[4947]: I1203 07:11:43.922563 4947 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-log" containerID="cri-o://cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab" gracePeriod=30 Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.486997 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 07:11:44.566739 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a0ff0d36c6b7b16656e7c33d29da7739023bf5d90f7d5f48f54ec9614687a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 07:11:44.567982 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a0ff0d36c6b7b16656e7c33d29da7739023bf5d90f7d5f48f54ec9614687a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 07:11:44.569551 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="74a0ff0d36c6b7b16656e7c33d29da7739023bf5d90f7d5f48f54ec9614687a6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 07:11:44.569612 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="19fed18a-18d1-43de-9e4c-12dd5973261f" 
containerName="nova-scheduler-scheduler" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.624611 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5kfs\" (UniqueName: \"kubernetes.io/projected/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-kube-api-access-d5kfs\") pod \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.625115 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-config-data\") pod \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.625167 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-internal-tls-certs\") pod \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.625222 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-logs\") pod \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.625246 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-combined-ca-bundle\") pod \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.625271 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-public-tls-certs\") pod \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\" (UID: \"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c\") " Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.625783 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-logs" (OuterVolumeSpecName: "logs") pod "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" (UID: "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.626151 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.634932 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-kube-api-access-d5kfs" (OuterVolumeSpecName: "kube-api-access-d5kfs") pod "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" (UID: "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c"). InnerVolumeSpecName "kube-api-access-d5kfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.653157 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" (UID: "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.658484 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-config-data" (OuterVolumeSpecName: "config-data") pod "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" (UID: "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.678580 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" (UID: "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.678920 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" (UID: "75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.717623 4947 generic.go:334] "Generic (PLEG): container finished" podID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerID="cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab" exitCode=143 Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.717725 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb","Type":"ContainerDied","Data":"cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab"} Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.719964 4947 generic.go:334] "Generic (PLEG): container finished" podID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerID="3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b" exitCode=0 Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.719992 4947 generic.go:334] "Generic (PLEG): container finished" podID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerID="8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec" exitCode=143 Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.719993 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c","Type":"ContainerDied","Data":"3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b"} Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.720029 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.720046 4947 scope.go:117] "RemoveContainer" containerID="3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.720034 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c","Type":"ContainerDied","Data":"8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec"} Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.720146 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c","Type":"ContainerDied","Data":"f91676b474eb4a6002981ad330b162e4ac0ec6992f6951f3deba0ad8e0f185df"} Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.728163 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.728188 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.728199 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.728208 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.728218 4947 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-d5kfs\" (UniqueName: \"kubernetes.io/projected/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c-kube-api-access-d5kfs\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.744582 4947 scope.go:117] "RemoveContainer" containerID="8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.758764 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.770655 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790058 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 07:11:44.790452 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerName="nova-api-api" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790467 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerName="nova-api-api" Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 07:11:44.790512 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3dcc0e-36f7-40d7-97a4-81120af23608" containerName="init" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790519 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3dcc0e-36f7-40d7-97a4-81120af23608" containerName="init" Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 07:11:44.790525 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a182aa69-32cc-4496-a551-d5e0fafda4af" containerName="nova-manage" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790531 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a182aa69-32cc-4496-a551-d5e0fafda4af" containerName="nova-manage" Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 
07:11:44.790544 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerName="nova-api-log" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790550 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerName="nova-api-log" Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 07:11:44.790576 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3dcc0e-36f7-40d7-97a4-81120af23608" containerName="dnsmasq-dns" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790581 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3dcc0e-36f7-40d7-97a4-81120af23608" containerName="dnsmasq-dns" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790751 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerName="nova-api-log" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790769 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3dcc0e-36f7-40d7-97a4-81120af23608" containerName="dnsmasq-dns" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790792 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" containerName="nova-api-api" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.790827 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a182aa69-32cc-4496-a551-d5e0fafda4af" containerName="nova-manage" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.791936 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.794231 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.794441 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.794860 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.800223 4947 scope.go:117] "RemoveContainer" containerID="3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b" Dec 03 07:11:44 crc kubenswrapper[4947]: E1203 07:11:44.801474 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b\": container with ID starting with 3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b not found: ID does not exist" containerID="3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.801520 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b"} err="failed to get container status \"3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b\": rpc error: code = NotFound desc = could not find container \"3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b\": container with ID starting with 3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b not found: ID does not exist" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.801547 4947 scope.go:117] "RemoveContainer" containerID="8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec" Dec 03 07:11:44 crc 
kubenswrapper[4947]: E1203 07:11:44.802943 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec\": container with ID starting with 8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec not found: ID does not exist" containerID="8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.802983 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec"} err="failed to get container status \"8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec\": rpc error: code = NotFound desc = could not find container \"8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec\": container with ID starting with 8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec not found: ID does not exist" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.803003 4947 scope.go:117] "RemoveContainer" containerID="3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.803603 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b"} err="failed to get container status \"3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b\": rpc error: code = NotFound desc = could not find container \"3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b\": container with ID starting with 3bbd20eee7d855044093490e7574e086cb90bfddfad3913813e9517af68f605b not found: ID does not exist" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.803640 4947 scope.go:117] "RemoveContainer" containerID="8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec" Dec 03 
07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.804079 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec"} err="failed to get container status \"8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec\": rpc error: code = NotFound desc = could not find container \"8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec\": container with ID starting with 8c3db10286ca926d6e5d7238ea3ecd67c112283fafc23e6b9d135c3d043cd7ec not found: ID does not exist" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.814372 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.931128 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jm4h\" (UniqueName: \"kubernetes.io/projected/98d05028-c68a-4438-afcd-161c4f974b08-kube-api-access-8jm4h\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.931241 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-config-data\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.931338 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98d05028-c68a-4438-afcd-161c4f974b08-logs\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.931444 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-public-tls-certs\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.931679 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-internal-tls-certs\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:44 crc kubenswrapper[4947]: I1203 07:11:44.931945 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.033588 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.033711 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jm4h\" (UniqueName: \"kubernetes.io/projected/98d05028-c68a-4438-afcd-161c4f974b08-kube-api-access-8jm4h\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.033764 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-config-data\") pod \"nova-api-0\" (UID: 
\"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.033796 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98d05028-c68a-4438-afcd-161c4f974b08-logs\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.033859 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-public-tls-certs\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.033931 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-internal-tls-certs\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.038615 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-internal-tls-certs\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.038973 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98d05028-c68a-4438-afcd-161c4f974b08-logs\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.039879 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.041195 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-config-data\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.042148 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-public-tls-certs\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.062432 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jm4h\" (UniqueName: \"kubernetes.io/projected/98d05028-c68a-4438-afcd-161c4f974b08-kube-api-access-8jm4h\") pod \"nova-api-0\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.096123 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c" path="/var/lib/kubelet/pods/75deb1d1-d1a3-43fd-88bb-15ef0d15ca1c/volumes" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.109597 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.617580 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:11:45 crc kubenswrapper[4947]: I1203 07:11:45.734826 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98d05028-c68a-4438-afcd-161c4f974b08","Type":"ContainerStarted","Data":"46a0c85d673476dcf28c18dc595980d0048f2463589a11107160fb8331c9aeab"} Dec 03 07:11:46 crc kubenswrapper[4947]: I1203 07:11:46.746612 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98d05028-c68a-4438-afcd-161c4f974b08","Type":"ContainerStarted","Data":"6e0c48cd3228e4b00d96c837f40857c671662d7b07ee44989563b07cafc4c386"} Dec 03 07:11:46 crc kubenswrapper[4947]: I1203 07:11:46.746957 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98d05028-c68a-4438-afcd-161c4f974b08","Type":"ContainerStarted","Data":"1ce62f53e6a80cc9f688e02da11287f9cc62f54ad243c0af9a53ea71ef8a6f49"} Dec 03 07:11:46 crc kubenswrapper[4947]: I1203 07:11:46.766546 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.766526898 podStartE2EDuration="2.766526898s" podCreationTimestamp="2025-12-03 07:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:46.765834868 +0000 UTC m=+1368.026789314" watchObservedRunningTime="2025-12-03 07:11:46.766526898 +0000 UTC m=+1368.027481344" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.056111 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:60830->10.217.0.193:8775: read: 
connection reset by peer" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.056204 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:60832->10.217.0.193:8775: read: connection reset by peer" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.482925 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.582569 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-combined-ca-bundle\") pod \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.582643 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-config-data\") pod \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.582824 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgr5w\" (UniqueName: \"kubernetes.io/projected/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-kube-api-access-rgr5w\") pod \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.582869 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-logs\") pod \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " Dec 03 
07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.582905 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-nova-metadata-tls-certs\") pod \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\" (UID: \"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb\") " Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.583612 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-logs" (OuterVolumeSpecName: "logs") pod "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" (UID: "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.601096 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-kube-api-access-rgr5w" (OuterVolumeSpecName: "kube-api-access-rgr5w") pod "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" (UID: "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb"). InnerVolumeSpecName "kube-api-access-rgr5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.612451 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" (UID: "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.623704 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-config-data" (OuterVolumeSpecName: "config-data") pod "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" (UID: "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.644253 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" (UID: "9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.684852 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgr5w\" (UniqueName: \"kubernetes.io/projected/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-kube-api-access-rgr5w\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.684915 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.684929 4947 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.684941 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-combined-ca-bundle\") on 
node \"crc\" DevicePath \"\"" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.684954 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.758027 4947 generic.go:334] "Generic (PLEG): container finished" podID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerID="dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787" exitCode=0 Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.758100 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb","Type":"ContainerDied","Data":"dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787"} Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.758165 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb","Type":"ContainerDied","Data":"674be2f90e601d970ee9042a032f26841eb718e2becd084ed2f6b2dcfeb35cc4"} Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.758190 4947 scope.go:117] "RemoveContainer" containerID="dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.759545 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.790754 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.792179 4947 scope.go:117] "RemoveContainer" containerID="cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.798319 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.830917 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:47 crc kubenswrapper[4947]: E1203 07:11:47.831354 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-metadata" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.831373 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-metadata" Dec 03 07:11:47 crc kubenswrapper[4947]: E1203 07:11:47.831424 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-log" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.831435 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-log" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.831673 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-metadata" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.831712 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" containerName="nova-metadata-log" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.832982 4947 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.835025 4947 scope.go:117] "RemoveContainer" containerID="dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.835677 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.835845 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 07:11:47 crc kubenswrapper[4947]: E1203 07:11:47.838435 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787\": container with ID starting with dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787 not found: ID does not exist" containerID="dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.838475 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787"} err="failed to get container status \"dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787\": rpc error: code = NotFound desc = could not find container \"dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787\": container with ID starting with dc591bbee2d5e3cc3a14003e50f03a3ae91573cb0b598e3ed489f246342d7787 not found: ID does not exist" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.838585 4947 scope.go:117] "RemoveContainer" containerID="cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab" Dec 03 07:11:47 crc kubenswrapper[4947]: E1203 07:11:47.842039 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab\": container with ID starting with cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab not found: ID does not exist" containerID="cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.842076 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab"} err="failed to get container status \"cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab\": rpc error: code = NotFound desc = could not find container \"cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab\": container with ID starting with cf3c99366a69b3b27ef00f1b5932b90a4fb81ed8d32e2d8c5167ae5a5a24d5ab not found: ID does not exist" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.847624 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.994748 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.995037 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnv2p\" (UniqueName: \"kubernetes.io/projected/899d3d67-ec63-4d5f-ad93-c40003578347-kube-api-access-jnv2p\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.995057 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-config-data\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.995109 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899d3d67-ec63-4d5f-ad93-c40003578347-logs\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:47 crc kubenswrapper[4947]: I1203 07:11:47.995127 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.096732 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899d3d67-ec63-4d5f-ad93-c40003578347-logs\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.096796 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.096904 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.096975 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnv2p\" (UniqueName: \"kubernetes.io/projected/899d3d67-ec63-4d5f-ad93-c40003578347-kube-api-access-jnv2p\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.097000 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-config-data\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.097224 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899d3d67-ec63-4d5f-ad93-c40003578347-logs\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.100731 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.101076 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-config-data\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.101261 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.123333 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnv2p\" (UniqueName: \"kubernetes.io/projected/899d3d67-ec63-4d5f-ad93-c40003578347-kube-api-access-jnv2p\") pod \"nova-metadata-0\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.152724 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.634051 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:11:48 crc kubenswrapper[4947]: W1203 07:11:48.640629 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod899d3d67_ec63_4d5f_ad93_c40003578347.slice/crio-37b9c944600f7edad3c33294e138cb44fc8c7f386c32f123235cf07899642232 WatchSource:0}: Error finding container 37b9c944600f7edad3c33294e138cb44fc8c7f386c32f123235cf07899642232: Status 404 returned error can't find the container with id 37b9c944600f7edad3c33294e138cb44fc8c7f386c32f123235cf07899642232 Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.775789 4947 generic.go:334] "Generic (PLEG): container finished" podID="19fed18a-18d1-43de-9e4c-12dd5973261f" containerID="74a0ff0d36c6b7b16656e7c33d29da7739023bf5d90f7d5f48f54ec9614687a6" exitCode=0 Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.775894 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"19fed18a-18d1-43de-9e4c-12dd5973261f","Type":"ContainerDied","Data":"74a0ff0d36c6b7b16656e7c33d29da7739023bf5d90f7d5f48f54ec9614687a6"} Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.778135 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899d3d67-ec63-4d5f-ad93-c40003578347","Type":"ContainerStarted","Data":"37b9c944600f7edad3c33294e138cb44fc8c7f386c32f123235cf07899642232"} Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.862956 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.973911 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-combined-ca-bundle\") pod \"19fed18a-18d1-43de-9e4c-12dd5973261f\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.974039 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg8mw\" (UniqueName: \"kubernetes.io/projected/19fed18a-18d1-43de-9e4c-12dd5973261f-kube-api-access-gg8mw\") pod \"19fed18a-18d1-43de-9e4c-12dd5973261f\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.974262 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-config-data\") pod \"19fed18a-18d1-43de-9e4c-12dd5973261f\" (UID: \"19fed18a-18d1-43de-9e4c-12dd5973261f\") " Dec 03 07:11:48 crc kubenswrapper[4947]: I1203 07:11:48.982223 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19fed18a-18d1-43de-9e4c-12dd5973261f-kube-api-access-gg8mw" (OuterVolumeSpecName: "kube-api-access-gg8mw") pod 
"19fed18a-18d1-43de-9e4c-12dd5973261f" (UID: "19fed18a-18d1-43de-9e4c-12dd5973261f"). InnerVolumeSpecName "kube-api-access-gg8mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.011842 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-config-data" (OuterVolumeSpecName: "config-data") pod "19fed18a-18d1-43de-9e4c-12dd5973261f" (UID: "19fed18a-18d1-43de-9e4c-12dd5973261f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.018805 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19fed18a-18d1-43de-9e4c-12dd5973261f" (UID: "19fed18a-18d1-43de-9e4c-12dd5973261f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.076020 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.076051 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg8mw\" (UniqueName: \"kubernetes.io/projected/19fed18a-18d1-43de-9e4c-12dd5973261f-kube-api-access-gg8mw\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.076062 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19fed18a-18d1-43de-9e4c-12dd5973261f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.097612 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb" path="/var/lib/kubelet/pods/9482fd2f-dfeb-46ba-9bc5-fb0f6cf83afb/volumes" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.809719 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899d3d67-ec63-4d5f-ad93-c40003578347","Type":"ContainerStarted","Data":"a8b3ef7eb0ba7d8af170edb23f786fe99d24451a36fd652d4dbc94d3f46220ba"} Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.809772 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899d3d67-ec63-4d5f-ad93-c40003578347","Type":"ContainerStarted","Data":"96cdcb461d21fa3b9f6f2564295436608c7aad6f8b2a1e0421630332a70976f2"} Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.813116 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"19fed18a-18d1-43de-9e4c-12dd5973261f","Type":"ContainerDied","Data":"fe95f89dee5449a8b8fa1a8c90670e587b42387ae3b6e6c99ee5e28f5389dcbf"} Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.813176 4947 scope.go:117] "RemoveContainer" containerID="74a0ff0d36c6b7b16656e7c33d29da7739023bf5d90f7d5f48f54ec9614687a6" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.813445 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.846467 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.846438345 podStartE2EDuration="2.846438345s" podCreationTimestamp="2025-12-03 07:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:49.837222715 +0000 UTC m=+1371.098177181" watchObservedRunningTime="2025-12-03 07:11:49.846438345 +0000 UTC m=+1371.107392811" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.877046 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.890555 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.903336 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:49 crc kubenswrapper[4947]: E1203 07:11:49.903733 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19fed18a-18d1-43de-9e4c-12dd5973261f" containerName="nova-scheduler-scheduler" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.903749 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="19fed18a-18d1-43de-9e4c-12dd5973261f" containerName="nova-scheduler-scheduler" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.903979 4947 
memory_manager.go:354] "RemoveStaleState removing state" podUID="19fed18a-18d1-43de-9e4c-12dd5973261f" containerName="nova-scheduler-scheduler" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.905115 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.909013 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 07:11:49 crc kubenswrapper[4947]: I1203 07:11:49.915979 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.098978 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.099112 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-config-data\") pod \"nova-scheduler-0\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.099147 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6kr\" (UniqueName: \"kubernetes.io/projected/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-kube-api-access-9r6kr\") pod \"nova-scheduler-0\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.201022 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-config-data\") pod \"nova-scheduler-0\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.201068 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6kr\" (UniqueName: \"kubernetes.io/projected/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-kube-api-access-9r6kr\") pod \"nova-scheduler-0\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.201217 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.210561 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.211305 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-config-data\") pod \"nova-scheduler-0\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.230915 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6kr\" (UniqueName: \"kubernetes.io/projected/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-kube-api-access-9r6kr\") pod \"nova-scheduler-0\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " 
pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.255262 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:11:50 crc kubenswrapper[4947]: W1203 07:11:50.747396 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3735b7db_e9a7_4be6_9c74_cad0131f2c0b.slice/crio-dfc7a7d790de3be7195a0e499989a47fad20dcf7e377c93900a96c4b9f7ad988 WatchSource:0}: Error finding container dfc7a7d790de3be7195a0e499989a47fad20dcf7e377c93900a96c4b9f7ad988: Status 404 returned error can't find the container with id dfc7a7d790de3be7195a0e499989a47fad20dcf7e377c93900a96c4b9f7ad988 Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.747695 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:11:50 crc kubenswrapper[4947]: I1203 07:11:50.822130 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3735b7db-e9a7-4be6-9c74-cad0131f2c0b","Type":"ContainerStarted","Data":"dfc7a7d790de3be7195a0e499989a47fad20dcf7e377c93900a96c4b9f7ad988"} Dec 03 07:11:51 crc kubenswrapper[4947]: I1203 07:11:51.096677 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19fed18a-18d1-43de-9e4c-12dd5973261f" path="/var/lib/kubelet/pods/19fed18a-18d1-43de-9e4c-12dd5973261f/volumes" Dec 03 07:11:51 crc kubenswrapper[4947]: I1203 07:11:51.836340 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3735b7db-e9a7-4be6-9c74-cad0131f2c0b","Type":"ContainerStarted","Data":"d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa"} Dec 03 07:11:51 crc kubenswrapper[4947]: I1203 07:11:51.859086 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.859069856 podStartE2EDuration="2.859069856s" 
podCreationTimestamp="2025-12-03 07:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:11:51.855346314 +0000 UTC m=+1373.116300740" watchObservedRunningTime="2025-12-03 07:11:51.859069856 +0000 UTC m=+1373.120024292" Dec 03 07:11:53 crc kubenswrapper[4947]: I1203 07:11:53.153154 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:11:53 crc kubenswrapper[4947]: I1203 07:11:53.153234 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 07:11:55 crc kubenswrapper[4947]: I1203 07:11:55.109910 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:11:55 crc kubenswrapper[4947]: I1203 07:11:55.110261 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 07:11:55 crc kubenswrapper[4947]: I1203 07:11:55.255593 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 07:11:56 crc kubenswrapper[4947]: I1203 07:11:56.122767 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:11:56 crc kubenswrapper[4947]: I1203 07:11:56.122810 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.202:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:11:58 crc kubenswrapper[4947]: I1203 07:11:58.153651 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 07:11:58 crc kubenswrapper[4947]: I1203 07:11:58.153974 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 07:11:59 crc kubenswrapper[4947]: I1203 07:11:59.169724 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:11:59 crc kubenswrapper[4947]: I1203 07:11:59.169726 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 07:12:00 crc kubenswrapper[4947]: I1203 07:12:00.255504 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 07:12:00 crc kubenswrapper[4947]: I1203 07:12:00.284278 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 07:12:00 crc kubenswrapper[4947]: I1203 07:12:00.953522 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 07:12:05 crc kubenswrapper[4947]: I1203 07:12:05.115602 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 07:12:05 crc kubenswrapper[4947]: I1203 07:12:05.116337 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 07:12:05 crc kubenswrapper[4947]: I1203 07:12:05.116662 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 07:12:05 crc 
kubenswrapper[4947]: I1203 07:12:05.121053 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 07:12:05 crc kubenswrapper[4947]: I1203 07:12:05.121392 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 07:12:05 crc kubenswrapper[4947]: I1203 07:12:05.976413 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 07:12:05 crc kubenswrapper[4947]: I1203 07:12:05.983506 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 07:12:08 crc kubenswrapper[4947]: I1203 07:12:08.160172 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 07:12:08 crc kubenswrapper[4947]: I1203 07:12:08.161641 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 07:12:08 crc kubenswrapper[4947]: I1203 07:12:08.174258 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 07:12:09 crc kubenswrapper[4947]: I1203 07:12:09.009162 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.029587 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.075718 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.075909 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="28892119-e165-46ea-a903-08207e491378" containerName="openstackclient" containerID="cri-o://e942659fea577abdb40bc1687013b7988fc7bb6ef1815a0d789c1541253c9e6a" gracePeriod=2 Dec 03 07:12:30 crc 
kubenswrapper[4947]: I1203 07:12:30.145425 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.193654 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.193960 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6c475475-916c-4267-8064-f932c04d0df2" containerName="openstack-network-exporter" containerID="cri-o://909ef5b644a4ef4623502b6cd9ac440ea6dcc840b5372f80ecaa44d612f7650b" gracePeriod=300 Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.199481 4947 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.199591 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data podName:5367165f-75ec-4633-8042-edfe91e3be60 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:30.699574696 +0000 UTC m=+1411.960529122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data") pod "rabbitmq-cell1-server-0" (UID: "5367165f-75ec-4633-8042-edfe91e3be60") : configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.199987 4947 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.200008 4947 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6ddbf97597-l6hz9: secret "swift-conf" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.200032 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift podName:1f87802c-3846-486e-a131-39a7fe336c96 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:30.700024488 +0000 UTC m=+1411.960978914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift") pod "swift-proxy-6ddbf97597-l6hz9" (UID: "1f87802c-3846-486e-a131-39a7fe336c96") : secret "swift-conf" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.201160 4947 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.201323 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data podName:49caa6da-3c98-4c49-ab22-62121ff908cf nodeName:}" failed. No retries permitted until 2025-12-03 07:12:30.701286993 +0000 UTC m=+1411.962241419 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data") pod "cinder-scheduler-0" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf") : secret "cinder-config-data" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.201369 4947 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.201626 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts podName:49caa6da-3c98-4c49-ab22-62121ff908cf nodeName:}" failed. No retries permitted until 2025-12-03 07:12:30.701617322 +0000 UTC m=+1411.962571748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts") pod "cinder-scheduler-0" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf") : secret "cinder-scripts" not found Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.272514 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder0e6b-account-delete-kmbkl"] Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.272921 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28892119-e165-46ea-a903-08207e491378" containerName="openstackclient" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.272937 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="28892119-e165-46ea-a903-08207e491378" containerName="openstackclient" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.273112 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="28892119-e165-46ea-a903-08207e491378" containerName="openstackclient" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.274660 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.301032 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts\") pod \"cinder0e6b-account-delete-kmbkl\" (UID: \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\") " pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.301191 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9smc\" (UniqueName: \"kubernetes.io/projected/8c259ef0-fa34-4ef1-a954-a15bd61ed120-kube-api-access-z9smc\") pod \"cinder0e6b-account-delete-kmbkl\" (UID: \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\") " pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.319975 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder0e6b-account-delete-kmbkl"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.381830 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican0363-account-delete-22cjb"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.383042 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.402561 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.402977 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerName="openstack-network-exporter" containerID="cri-o://a00403c9e5c9c0356f8cda0bb2f5ea33101133e03ecec9269cd5a9a058bc1298" gracePeriod=300 Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.403830 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts\") pod \"cinder0e6b-account-delete-kmbkl\" (UID: \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\") " pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.403964 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9smc\" (UniqueName: \"kubernetes.io/projected/8c259ef0-fa34-4ef1-a954-a15bd61ed120-kube-api-access-z9smc\") pod \"cinder0e6b-account-delete-kmbkl\" (UID: \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\") " pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.404833 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts\") pod \"cinder0e6b-account-delete-kmbkl\" (UID: \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\") " pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.418007 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.418257 
4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="ovn-northd" containerID="cri-o://9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e" gracePeriod=30 Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.418400 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="openstack-network-exporter" containerID="cri-o://9d943d955b4387689a8d712809d10af11da1db79e92a195ba57d31cce0773125" gracePeriod=30 Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.443253 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6c475475-916c-4267-8064-f932c04d0df2" containerName="ovsdbserver-nb" containerID="cri-o://759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4" gracePeriod=300 Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.477229 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican0363-account-delete-22cjb"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.556001 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmmgg\" (UniqueName: \"kubernetes.io/projected/44c0f438-8d06-433e-af68-49ccacd9a017-kube-api-access-jmmgg\") pod \"barbican0363-account-delete-22cjb\" (UID: \"44c0f438-8d06-433e-af68-49ccacd9a017\") " pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.561978 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44c0f438-8d06-433e-af68-49ccacd9a017-operator-scripts\") pod \"barbican0363-account-delete-22cjb\" (UID: \"44c0f438-8d06-433e-af68-49ccacd9a017\") " 
pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.599320 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerName="ovsdbserver-sb" containerID="cri-o://a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27" gracePeriod=300 Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.644942 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9smc\" (UniqueName: \"kubernetes.io/projected/8c259ef0-fa34-4ef1-a954-a15bd61ed120-kube-api-access-z9smc\") pod \"cinder0e6b-account-delete-kmbkl\" (UID: \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\") " pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.695981 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6bhnb"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.736556 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6bhnb"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.777478 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-nv68l"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.798243 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44c0f438-8d06-433e-af68-49ccacd9a017-operator-scripts\") pod \"barbican0363-account-delete-22cjb\" (UID: \"44c0f438-8d06-433e-af68-49ccacd9a017\") " pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.798668 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmmgg\" (UniqueName: \"kubernetes.io/projected/44c0f438-8d06-433e-af68-49ccacd9a017-kube-api-access-jmmgg\") pod \"barbican0363-account-delete-22cjb\" (UID: 
\"44c0f438-8d06-433e-af68-49ccacd9a017\") " pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.798869 4947 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.798976 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data podName:49caa6da-3c98-4c49-ab22-62121ff908cf nodeName:}" failed. No retries permitted until 2025-12-03 07:12:31.79896108 +0000 UTC m=+1413.059915506 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data") pod "cinder-scheduler-0" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf") : secret "cinder-config-data" not found Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.800478 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44c0f438-8d06-433e-af68-49ccacd9a017-operator-scripts\") pod \"barbican0363-account-delete-22cjb\" (UID: \"44c0f438-8d06-433e-af68-49ccacd9a017\") " pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.801437 4947 projected.go:263] Couldn't get secret openstack/swift-proxy-config-data: secret "swift-proxy-config-data" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.801551 4947 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.801635 4947 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.801705 4947 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6ddbf97597-l6hz9: [secret 
"swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.801793 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift podName:1f87802c-3846-486e-a131-39a7fe336c96 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:31.801780067 +0000 UTC m=+1413.062734493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift") pod "swift-proxy-6ddbf97597-l6hz9" (UID: "1f87802c-3846-486e-a131-39a7fe336c96") : [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.801892 4947 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.801962 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data podName:5367165f-75ec-4633-8042-edfe91e3be60 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:31.801955292 +0000 UTC m=+1413.062909718 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data") pod "rabbitmq-cell1-server-0" (UID: "5367165f-75ec-4633-8042-edfe91e3be60") : configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.803033 4947 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Dec 03 07:12:30 crc kubenswrapper[4947]: E1203 07:12:30.803148 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts podName:49caa6da-3c98-4c49-ab22-62121ff908cf nodeName:}" failed. No retries permitted until 2025-12-03 07:12:31.803139224 +0000 UTC m=+1413.064093650 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts") pod "cinder-scheduler-0" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf") : secret "cinder-scripts" not found Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.837441 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-nv68l"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.876825 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmmgg\" (UniqueName: \"kubernetes.io/projected/44c0f438-8d06-433e-af68-49ccacd9a017-kube-api-access-jmmgg\") pod \"barbican0363-account-delete-22cjb\" (UID: \"44c0f438-8d06-433e-af68-49ccacd9a017\") " pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.891962 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementbdf2-account-delete-qmjm2"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.893346 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.927713 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.928581 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementbdf2-account-delete-qmjm2"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.938869 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance93c8-account-delete-j67wz"] Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.949559 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:30 crc kubenswrapper[4947]: I1203 07:12:30.984994 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance93c8-account-delete-j67wz"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.004947 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a94dde-b399-408b-93f2-488d02be7f07-operator-scripts\") pod \"glance93c8-account-delete-j67wz\" (UID: \"c6a94dde-b399-408b-93f2-488d02be7f07\") " pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.005045 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmq74\" (UniqueName: \"kubernetes.io/projected/1256396c-ee12-469e-864f-c87983516079-kube-api-access-xmq74\") pod \"placementbdf2-account-delete-qmjm2\" (UID: \"1256396c-ee12-469e-864f-c87983516079\") " pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.005129 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n9xw\" 
(UniqueName: \"kubernetes.io/projected/c6a94dde-b399-408b-93f2-488d02be7f07-kube-api-access-2n9xw\") pod \"glance93c8-account-delete-j67wz\" (UID: \"c6a94dde-b399-408b-93f2-488d02be7f07\") " pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.005165 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1256396c-ee12-469e-864f-c87983516079-operator-scripts\") pod \"placementbdf2-account-delete-qmjm2\" (UID: \"1256396c-ee12-469e-864f-c87983516079\") " pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.017934 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-gl6zw"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.108806 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmq74\" (UniqueName: \"kubernetes.io/projected/1256396c-ee12-469e-864f-c87983516079-kube-api-access-xmq74\") pod \"placementbdf2-account-delete-qmjm2\" (UID: \"1256396c-ee12-469e-864f-c87983516079\") " pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.108944 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n9xw\" (UniqueName: \"kubernetes.io/projected/c6a94dde-b399-408b-93f2-488d02be7f07-kube-api-access-2n9xw\") pod \"glance93c8-account-delete-j67wz\" (UID: \"c6a94dde-b399-408b-93f2-488d02be7f07\") " pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.108998 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1256396c-ee12-469e-864f-c87983516079-operator-scripts\") pod \"placementbdf2-account-delete-qmjm2\" (UID: \"1256396c-ee12-469e-864f-c87983516079\") " 
pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.109430 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a94dde-b399-408b-93f2-488d02be7f07-operator-scripts\") pod \"glance93c8-account-delete-j67wz\" (UID: \"c6a94dde-b399-408b-93f2-488d02be7f07\") " pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.110097 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1256396c-ee12-469e-864f-c87983516079-operator-scripts\") pod \"placementbdf2-account-delete-qmjm2\" (UID: \"1256396c-ee12-469e-864f-c87983516079\") " pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.110799 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a94dde-b399-408b-93f2-488d02be7f07-operator-scripts\") pod \"glance93c8-account-delete-j67wz\" (UID: \"c6a94dde-b399-408b-93f2-488d02be7f07\") " pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.116411 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.125515 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5abc234f-80ac-4e2e-a43d-2a6fe3453f8c" path="/var/lib/kubelet/pods/5abc234f-80ac-4e2e-a43d-2a6fe3453f8c/volumes" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.126223 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af29daa3-e143-4ea0-bfe0-284fd103f8b3" path="/var/lib/kubelet/pods/af29daa3-e143-4ea0-bfe0-284fd103f8b3/volumes" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.127010 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-gl6zw"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.127156 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-7tpzk"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.141943 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n9xw\" (UniqueName: \"kubernetes.io/projected/c6a94dde-b399-408b-93f2-488d02be7f07-kube-api-access-2n9xw\") pod \"glance93c8-account-delete-j67wz\" (UID: \"c6a94dde-b399-408b-93f2-488d02be7f07\") " pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.144305 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmq74\" (UniqueName: \"kubernetes.io/projected/1256396c-ee12-469e-864f-c87983516079-kube-api-access-xmq74\") pod \"placementbdf2-account-delete-qmjm2\" (UID: \"1256396c-ee12-469e-864f-c87983516079\") " pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.182554 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-7tpzk"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.214078 4947 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron896b-account-delete-ddxm6"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.215674 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.237575 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.259799 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron896b-account-delete-ddxm6"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.274047 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-jp2pf"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.296569 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-jp2pf"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.322458 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rkcc\" (UniqueName: \"kubernetes.io/projected/e4b56c27-0089-46d0-8cc3-c5788833f135-kube-api-access-6rkcc\") pod \"neutron896b-account-delete-ddxm6\" (UID: \"e4b56c27-0089-46d0-8cc3-c5788833f135\") " pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.322547 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts\") pod \"neutron896b-account-delete-ddxm6\" (UID: \"e4b56c27-0089-46d0-8cc3-c5788833f135\") " pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.345656 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.380604 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.403948 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapib13d-account-delete-k5vqn"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.405189 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.425561 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rkcc\" (UniqueName: \"kubernetes.io/projected/e4b56c27-0089-46d0-8cc3-c5788833f135-kube-api-access-6rkcc\") pod \"neutron896b-account-delete-ddxm6\" (UID: \"e4b56c27-0089-46d0-8cc3-c5788833f135\") " pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.425641 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts\") pod \"neutron896b-account-delete-ddxm6\" (UID: \"e4b56c27-0089-46d0-8cc3-c5788833f135\") " pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.426518 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts\") pod \"neutron896b-account-delete-ddxm6\" (UID: \"e4b56c27-0089-46d0-8cc3-c5788833f135\") " pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.429103 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_90b682d1-68e6-49a4-83a6-51b1b40b7e99/ovsdbserver-sb/0.log" 
Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.429173 4947 generic.go:334] "Generic (PLEG): container finished" podID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerID="a00403c9e5c9c0356f8cda0bb2f5ea33101133e03ecec9269cd5a9a058bc1298" exitCode=2 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.429191 4947 generic.go:334] "Generic (PLEG): container finished" podID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerID="a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27" exitCode=143 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.429256 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"90b682d1-68e6-49a4-83a6-51b1b40b7e99","Type":"ContainerDied","Data":"a00403c9e5c9c0356f8cda0bb2f5ea33101133e03ecec9269cd5a9a058bc1298"} Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.429282 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"90b682d1-68e6-49a4-83a6-51b1b40b7e99","Type":"ContainerDied","Data":"a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27"} Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.440660 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapib13d-account-delete-k5vqn"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.458970 4947 generic.go:334] "Generic (PLEG): container finished" podID="855f6436-68f0-42d8-a12a-bf25632440c1" containerID="9d943d955b4387689a8d712809d10af11da1db79e92a195ba57d31cce0773125" exitCode=2 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.459058 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"855f6436-68f0-42d8-a12a-bf25632440c1","Type":"ContainerDied","Data":"9d943d955b4387689a8d712809d10af11da1db79e92a195ba57d31cce0773125"} Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.459910 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rkcc\" 
(UniqueName: \"kubernetes.io/projected/e4b56c27-0089-46d0-8cc3-c5788833f135-kube-api-access-6rkcc\") pod \"neutron896b-account-delete-ddxm6\" (UID: \"e4b56c27-0089-46d0-8cc3-c5788833f135\") " pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.470889 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-lb94d"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.487589 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84gbj"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.514063 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6cg4w"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.514535 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-6cg4w" podUID="3496bac7-6b31-4ba8-a490-14bff1522b8c" containerName="openstack-network-exporter" containerID="cri-o://4768422c21501889e7e1951ef8537cbf19f01b02b829b137e7fb9d8dc5766658" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.525778 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0c18d-account-delete-458q7"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.526960 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.528518 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h9rj\" (UniqueName: \"kubernetes.io/projected/62cff038-cc99-44ca-ba48-a05d16a96b26-kube-api-access-6h9rj\") pod \"novaapib13d-account-delete-k5vqn\" (UID: \"62cff038-cc99-44ca-ba48-a05d16a96b26\") " pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.528588 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts\") pod \"novaapib13d-account-delete-k5vqn\" (UID: \"62cff038-cc99-44ca-ba48-a05d16a96b26\") " pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.530885 4947 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.531029 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data podName:51d7ef1d-a0bf-465f-baad-1bc3a71618ff nodeName:}" failed. No retries permitted until 2025-12-03 07:12:32.031014949 +0000 UTC m=+1413.291969375 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data") pod "rabbitmq-server-0" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff") : configmap "rabbitmq-config-data" not found Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.531539 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6c475475-916c-4267-8064-f932c04d0df2/ovsdbserver-nb/0.log" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.531577 4947 generic.go:334] "Generic (PLEG): container finished" podID="6c475475-916c-4267-8064-f932c04d0df2" containerID="909ef5b644a4ef4623502b6cd9ac440ea6dcc840b5372f80ecaa44d612f7650b" exitCode=2 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.531591 4947 generic.go:334] "Generic (PLEG): container finished" podID="6c475475-916c-4267-8064-f932c04d0df2" containerID="759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4" exitCode=143 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.531610 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6c475475-916c-4267-8064-f932c04d0df2","Type":"ContainerDied","Data":"909ef5b644a4ef4623502b6cd9ac440ea6dcc840b5372f80ecaa44d612f7650b"} Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.531633 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6c475475-916c-4267-8064-f932c04d0df2","Type":"ContainerDied","Data":"759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4"} Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.563209 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0c18d-account-delete-458q7"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.600840 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.600899 4947 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.601364 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-server" containerID="cri-o://f49bb955ad69cca770cf0f549937dcdd2e2098f40b0afb681932a0b678968068" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.601583 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerName="cinder-scheduler" containerID="cri-o://23fd79e84a8a0d20a65d598b706fa955381b97ec7814c70275bf4b3eb633dcce" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.601908 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="swift-recon-cron" containerID="cri-o://52672ead9b6ba2c835fb5ca4f95054db2a9ce1fa6df424a727875fabb4ce0dbc" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.601949 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="rsync" containerID="cri-o://f69a782b89a00f114b21b586623c5d8f0e73109af52bd1c6676ae4209fab1573" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.601980 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-expirer" containerID="cri-o://d87586511f96368d02332320b8070531b8e4c9823a90eea78876b046f2488da5" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602008 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" 
podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-updater" containerID="cri-o://e4a196d70b59a52f0de0490ca863f45fe4efbfa9a908b2cfde82cec941e4b30f" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602252 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-auditor" containerID="cri-o://bea9b5a2830e45fb27873f63fdf3c2659562adb3f096b0f77513dea99befbef1" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602322 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-replicator" containerID="cri-o://7579cc3029f2cccd754569c3be548ada24f443eaf1602cc6cdbee7d860630040" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602361 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-server" containerID="cri-o://66517c20f9bd5a79033d770b7fe6acd20f04d2ccb83413adddf9ce2d92f48b06" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602413 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-updater" containerID="cri-o://d934198eccfc8bc4a5f9d891474951c5c3b59ba02e3e53ad44a55f4895165461" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602473 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-auditor" containerID="cri-o://9f55040331ca3d502b5c2088f0718ed298df96d80034f2b8e129b91f1d02388f" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602522 4947 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-replicator" containerID="cri-o://e8d2991a67aac964ce17db312462ba4a29fb541e49ec2b41398e5be186932d4d" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602577 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-replicator" containerID="cri-o://9c72d4dda83bcdd5aa991a2138d85cb2117515fc20f7262bda9d4cf7cbb24de8" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602676 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-reaper" containerID="cri-o://d8b3c8f44f2e233d211d7dc57ffed40c7ab6c7b15d021bbb35a7dbf134eff941" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602713 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-auditor" containerID="cri-o://d3eb07996d43bedac1f1d1fa489d736d81b2c8f48876740c85c31fdcd4f49d77" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602743 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-server" containerID="cri-o://d09ae4508992ad6594ab0cf49b18c9dde1d758bb9614d6152f7610d5543b8ba4" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.602712 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerName="probe" 
containerID="cri-o://208ecb2e4a9580e581c1630c83f89daf84de2499a774647eff59bb9623f3cf65" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.624468 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.624949 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api-log" containerID="cri-o://97036c64cfae8886767f870b26986f84c7ae69a1a7110b3ff8a8e15f7e00dd75" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.625391 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api" containerID="cri-o://737f46e0f99576fb923247d82fc1eef29b5d49123f805ead5797b64495cf9e63" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.637648 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.645608 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g85vw\" (UniqueName: \"kubernetes.io/projected/8c562c87-9f66-447f-83ec-05165c95ca25-kube-api-access-g85vw\") pod \"novacell0c18d-account-delete-458q7\" (UID: \"8c562c87-9f66-447f-83ec-05165c95ca25\") " pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.645758 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts\") pod \"novacell0c18d-account-delete-458q7\" (UID: \"8c562c87-9f66-447f-83ec-05165c95ca25\") " pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.646201 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h9rj\" (UniqueName: \"kubernetes.io/projected/62cff038-cc99-44ca-ba48-a05d16a96b26-kube-api-access-6h9rj\") pod \"novaapib13d-account-delete-k5vqn\" (UID: \"62cff038-cc99-44ca-ba48-a05d16a96b26\") " pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.646265 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts\") pod \"novaapib13d-account-delete-k5vqn\" (UID: \"62cff038-cc99-44ca-ba48-a05d16a96b26\") " pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.658201 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts\") 
pod \"novaapib13d-account-delete-k5vqn\" (UID: \"62cff038-cc99-44ca-ba48-a05d16a96b26\") " pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.670983 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-774d6d4878-7x6tj"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.671442 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-774d6d4878-7x6tj" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerName="placement-log" containerID="cri-o://845cd1a32bd6e9532c9423654e785193c73bf6e3b1aea2e7e18eadd318301cec" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.678725 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4 is running failed: container process not found" containerID="759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.679438 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4 is running failed: container process not found" containerID="759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.688105 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4 is running failed: container process not found" containerID="759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4" 
cmd=["/usr/bin/pidof","ovsdb-server"] Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.688187 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="6c475475-916c-4267-8064-f932c04d0df2" containerName="ovsdbserver-nb" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.702209 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-774d6d4878-7x6tj" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerName="placement-api" containerID="cri-o://ce7f8de092c5631abd0150a07a948c24b407959e654aa1d9ce964ed828bb05fa" gracePeriod=30 Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.713432 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h9rj\" (UniqueName: \"kubernetes.io/projected/62cff038-cc99-44ca-ba48-a05d16a96b26-kube-api-access-6h9rj\") pod \"novaapib13d-account-delete-k5vqn\" (UID: \"62cff038-cc99-44ca-ba48-a05d16a96b26\") " pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.761927 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts\") pod \"novacell0c18d-account-delete-458q7\" (UID: \"8c562c87-9f66-447f-83ec-05165c95ca25\") " pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.762155 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g85vw\" (UniqueName: \"kubernetes.io/projected/8c562c87-9f66-447f-83ec-05165c95ca25-kube-api-access-g85vw\") pod \"novacell0c18d-account-delete-458q7\" (UID: 
\"8c562c87-9f66-447f-83ec-05165c95ca25\") " pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.763161 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts\") pod \"novacell0c18d-account-delete-458q7\" (UID: \"8c562c87-9f66-447f-83ec-05165c95ca25\") " pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.790113 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g85vw\" (UniqueName: \"kubernetes.io/projected/8c562c87-9f66-447f-83ec-05165c95ca25-kube-api-access-g85vw\") pod \"novacell0c18d-account-delete-458q7\" (UID: \"8c562c87-9f66-447f-83ec-05165c95ca25\") " pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.806323 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-7bxn2"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.862973 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-7bxn2"] Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.865542 4947 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.865606 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts podName:49caa6da-3c98-4c49-ab22-62121ff908cf nodeName:}" failed. No retries permitted until 2025-12-03 07:12:33.86559088 +0000 UTC m=+1415.126545296 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts") pod "cinder-scheduler-0" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf") : secret "cinder-scripts" not found Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.865906 4947 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.865930 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data podName:49caa6da-3c98-4c49-ab22-62121ff908cf nodeName:}" failed. No retries permitted until 2025-12-03 07:12:33.865922679 +0000 UTC m=+1415.126877105 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data") pod "cinder-scheduler-0" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf") : secret "cinder-config-data" not found Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.865972 4947 projected.go:263] Couldn't get secret openstack/swift-proxy-config-data: secret "swift-proxy-config-data" not found Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.865979 4947 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.865989 4947 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.865999 4947 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6ddbf97597-l6hz9: [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.866018 4947 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift podName:1f87802c-3846-486e-a131-39a7fe336c96 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:33.866012862 +0000 UTC m=+1415.126967278 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift") pod "swift-proxy-6ddbf97597-l6hz9" (UID: "1f87802c-3846-486e-a131-39a7fe336c96") : [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.866044 4947 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:31 crc kubenswrapper[4947]: E1203 07:12:31.866059 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data podName:5367165f-75ec-4633-8042-edfe91e3be60 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:33.866054853 +0000 UTC m=+1415.127009279 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data") pod "rabbitmq-cell1-server-0" (UID: "5367165f-75ec-4633-8042-edfe91e3be60") : configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.906467 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-dtngb"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.920657 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-dtngb"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.933318 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-b6497"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.941140 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-b6497"] Dec 03 07:12:31 crc kubenswrapper[4947]: I1203 07:12:31.954545 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.011985 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.019719 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6c475475-916c-4267-8064-f932c04d0df2/ovsdbserver-nb/0.log" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.019801 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.039417 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-6xgx7"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.039749 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" podUID="b4a227e4-8c2a-4880-9944-877640627cd0" containerName="dnsmasq-dns" containerID="cri-o://cdbab280e2ed4809a2997438db415a2fa098c89060067db08594d46673b283c3" gracePeriod=10 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.075038 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-metrics-certs-tls-certs\") pod \"6c475475-916c-4267-8064-f932c04d0df2\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.075088 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-ovsdbserver-nb-tls-certs\") pod \"6c475475-916c-4267-8064-f932c04d0df2\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.075173 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-combined-ca-bundle\") pod \"6c475475-916c-4267-8064-f932c04d0df2\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.075231 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-config\") pod \"6c475475-916c-4267-8064-f932c04d0df2\" (UID: 
\"6c475475-916c-4267-8064-f932c04d0df2\") " Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.075267 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c475475-916c-4267-8064-f932c04d0df2-ovsdb-rundir\") pod \"6c475475-916c-4267-8064-f932c04d0df2\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.075282 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"6c475475-916c-4267-8064-f932c04d0df2\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.075298 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-scripts\") pod \"6c475475-916c-4267-8064-f932c04d0df2\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.075354 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ww8p\" (UniqueName: \"kubernetes.io/projected/6c475475-916c-4267-8064-f932c04d0df2-kube-api-access-6ww8p\") pod \"6c475475-916c-4267-8064-f932c04d0df2\" (UID: \"6c475475-916c-4267-8064-f932c04d0df2\") " Dec 03 07:12:32 crc kubenswrapper[4947]: E1203 07:12:32.075760 4947 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 07:12:32 crc kubenswrapper[4947]: E1203 07:12:32.075802 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data podName:51d7ef1d-a0bf-465f-baad-1bc3a71618ff nodeName:}" failed. No retries permitted until 2025-12-03 07:12:33.07578785 +0000 UTC m=+1414.336742276 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data") pod "rabbitmq-server-0" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff") : configmap "rabbitmq-config-data" not found Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.082318 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.089447 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-scripts" (OuterVolumeSpecName: "scripts") pod "6c475475-916c-4267-8064-f932c04d0df2" (UID: "6c475475-916c-4267-8064-f932c04d0df2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.089711 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6ddbf97597-l6hz9"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.090098 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerName="glance-log" containerID="cri-o://a1759e709aa0048498574524e67b384d6d64c42bed6642269b86d9285c67cdf0" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.090577 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6ddbf97597-l6hz9" podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-httpd" containerID="cri-o://1803fa6075e7e974f61cf6cd7ebcdef0211618286da42aad5c1687fe396a9e93" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.090661 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerName="glance-httpd" 
containerID="cri-o://64b3099ce4ae64f657b3ba6a9d2a21ac15dc33c2c3dddccc593d4d83045820bf" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.090715 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6ddbf97597-l6hz9" podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-server" containerID="cri-o://a6e1ead0fefc1c6dc26fdc188da65352aea353653dca0f86b588f1fdd21857ed" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.096108 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-config" (OuterVolumeSpecName: "config") pod "6c475475-916c-4267-8064-f932c04d0df2" (UID: "6c475475-916c-4267-8064-f932c04d0df2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.096294 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c475475-916c-4267-8064-f932c04d0df2-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6c475475-916c-4267-8064-f932c04d0df2" (UID: "6c475475-916c-4267-8064-f932c04d0df2"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.123064 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.123352 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bdb45354-43cd-41e7-a511-95357eb656e5" containerName="glance-log" containerID="cri-o://1661b453aa4675cf59e5985d9eef19957b032c863d92cb6f01e34413e202e848" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.123806 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bdb45354-43cd-41e7-a511-95357eb656e5" containerName="glance-httpd" containerID="cri-o://7a42de092f2c4832987606718c90461323cdaca36bc998fcfa88de29b59c873d" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.132565 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6c489f678-crqhz"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.132798 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" podUID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerName="barbican-keystone-listener-log" containerID="cri-o://aac3920ae7cd9e0b2c5f777e7aa7e5d8fbbea4f5ca8c93fe99023a44910f66a8" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.133100 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" podUID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerName="barbican-keystone-listener" containerID="cri-o://a809b23cd1a249dbc9331e612526f60ed5bdd18fa6472bcbf28960c85be485be" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.163733 4947 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/neutron-7dd7c69-hhnlg"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.163980 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7dd7c69-hhnlg" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerName="neutron-api" containerID="cri-o://4f26987218545ce4b3ac012485238ac52ce1b620c3025dd4686d6e203a028593" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.164650 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7dd7c69-hhnlg" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerName="neutron-httpd" containerID="cri-o://d1c453acbdb699e13cb35f456f04fa20cb22c76c9d9304778639baf320f5cf98" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.177714 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.177755 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c475475-916c-4267-8064-f932c04d0df2-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.177768 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c475475-916c-4267-8064-f932c04d0df2-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.178704 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5db58dc49-hq6px"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.178911 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5db58dc49-hq6px" podUID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerName="barbican-worker-log" 
containerID="cri-o://73b7a501be5ba2d71f1c1dd2ceb4c2ccc770dcda00e913d4135212d4d5b47c39" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.179017 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5db58dc49-hq6px" podUID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerName="barbican-worker" containerID="cri-o://e0dfb3c4add7777ef4ada7896099eaab6442a879387a5254d9c5547a18dee051" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.191534 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59d779d8d8-6jl5c"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.191767 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59d779d8d8-6jl5c" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api-log" containerID="cri-o://0df9670d5726b37e2f523e33898d50880cf6beeea48f329f234fc69c23009757" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.192134 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-59d779d8d8-6jl5c" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api" containerID="cri-o://c60ae6e938e54908332dbaa6ddd689e443ba56a7fa23dbd3d3249373512ca680" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.207142 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.228196 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "6c475475-916c-4267-8064-f932c04d0df2" (UID: "6c475475-916c-4267-8064-f932c04d0df2"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.230751 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c475475-916c-4267-8064-f932c04d0df2-kube-api-access-6ww8p" (OuterVolumeSpecName: "kube-api-access-6ww8p") pod "6c475475-916c-4267-8064-f932c04d0df2" (UID: "6c475475-916c-4267-8064-f932c04d0df2"). InnerVolumeSpecName "kube-api-access-6ww8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.251146 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.251362 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-log" containerID="cri-o://96cdcb461d21fa3b9f6f2564295436608c7aad6f8b2a1e0421630332a70976f2" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.251754 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-metadata" containerID="cri-o://a8b3ef7eb0ba7d8af170edb23f786fe99d24451a36fd652d4dbc94d3f46220ba" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: E1203 07:12:32.258675 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27 is running failed: container process not found" containerID="a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 03 07:12:32 crc kubenswrapper[4947]: E1203 07:12:32.259248 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container 
is not created or running: checking if PID of a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27 is running failed: container process not found" containerID="a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 03 07:12:32 crc kubenswrapper[4947]: E1203 07:12:32.260397 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27 is running failed: container process not found" containerID="a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 03 07:12:32 crc kubenswrapper[4947]: E1203 07:12:32.260425 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerName="ovsdbserver-sb" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.269736 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c475475-916c-4267-8064-f932c04d0df2" (UID: "6c475475-916c-4267-8064-f932c04d0df2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.288939 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.288996 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.289007 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ww8p\" (UniqueName: \"kubernetes.io/projected/6c475475-916c-4267-8064-f932c04d0df2-kube-api-access-6ww8p\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.302044 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.302237 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3735b7db-e9a7-4be6-9c74-cad0131f2c0b" containerName="nova-scheduler-scheduler" containerID="cri-o://d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.311349 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.320796 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.321043 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-log" 
containerID="cri-o://1ce62f53e6a80cc9f688e02da11287f9cc62f54ad243c0af9a53ea71ef8a6f49" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.321192 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-api" containerID="cri-o://6e0c48cd3228e4b00d96c837f40857c671662d7b07ee44989563b07cafc4c386" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.336050 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.336110 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-kgjrr"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.383549 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-kgjrr"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.398538 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-56ad-account-create-update-d94hj"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.401559 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.418002 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-56ad-account-create-update-d94hj"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.431566 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.431840 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e80985eb-c6e0-4ffc-9b98-b1c92be266eb" 
containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7ddbee771d7d031c4a9d787606308301ec00a05893e9401d6d27e942db47b236" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.436287 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5367165f-75ec-4633-8042-edfe91e3be60" containerName="rabbitmq" containerID="cri-o://de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26" gracePeriod=604800 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.441612 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.476255 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6ddbf97597-l6hz9" podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": read tcp 10.217.0.2:36344->10.217.0.169:8080: read: connection reset by peer" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.476599 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6ddbf97597-l6hz9" podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": read tcp 10.217.0.2:36350->10.217.0.169:8080: read: connection reset by peer" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.509478 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tjs6k"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.524595 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tjs6k"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.544042 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-6cg4w_3496bac7-6b31-4ba8-a490-14bff1522b8c/openstack-network-exporter/0.log" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.544083 4947 generic.go:334] "Generic (PLEG): container finished" podID="3496bac7-6b31-4ba8-a490-14bff1522b8c" containerID="4768422c21501889e7e1951ef8537cbf19f01b02b829b137e7fb9d8dc5766658" exitCode=2 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.544126 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6cg4w" event={"ID":"3496bac7-6b31-4ba8-a490-14bff1522b8c","Type":"ContainerDied","Data":"4768422c21501889e7e1951ef8537cbf19f01b02b829b137e7fb9d8dc5766658"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.550794 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.550962 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="9f0a57ca-d063-4d27-ac54-f5431cca2971" containerName="nova-cell0-conductor-conductor" containerID="cri-o://334c2441513d95f243b52e76544d548f66b710f9a5c7ddb3358d9df97ecd5f57" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564519 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="d87586511f96368d02332320b8070531b8e4c9823a90eea78876b046f2488da5" exitCode=0 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564536 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="e4a196d70b59a52f0de0490ca863f45fe4efbfa9a908b2cfde82cec941e4b30f" exitCode=0 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564543 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="bea9b5a2830e45fb27873f63fdf3c2659562adb3f096b0f77513dea99befbef1" exitCode=0 Dec 
03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564549 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="7579cc3029f2cccd754569c3be548ada24f443eaf1602cc6cdbee7d860630040" exitCode=0 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564555 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="d934198eccfc8bc4a5f9d891474951c5c3b59ba02e3e53ad44a55f4895165461" exitCode=0 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564562 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="9f55040331ca3d502b5c2088f0718ed298df96d80034f2b8e129b91f1d02388f" exitCode=0 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564568 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="e8d2991a67aac964ce17db312462ba4a29fb541e49ec2b41398e5be186932d4d" exitCode=0 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564574 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="d8b3c8f44f2e233d211d7dc57ffed40c7ab6c7b15d021bbb35a7dbf134eff941" exitCode=0 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564581 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="d3eb07996d43bedac1f1d1fa489d736d81b2c8f48876740c85c31fdcd4f49d77" exitCode=0 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564588 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="9c72d4dda83bcdd5aa991a2138d85cb2117515fc20f7262bda9d4cf7cbb24de8" exitCode=0 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564616 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"d87586511f96368d02332320b8070531b8e4c9823a90eea78876b046f2488da5"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564635 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"e4a196d70b59a52f0de0490ca863f45fe4efbfa9a908b2cfde82cec941e4b30f"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564644 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"bea9b5a2830e45fb27873f63fdf3c2659562adb3f096b0f77513dea99befbef1"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564653 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"7579cc3029f2cccd754569c3be548ada24f443eaf1602cc6cdbee7d860630040"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564660 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"d934198eccfc8bc4a5f9d891474951c5c3b59ba02e3e53ad44a55f4895165461"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564669 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"9f55040331ca3d502b5c2088f0718ed298df96d80034f2b8e129b91f1d02388f"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564677 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"e8d2991a67aac964ce17db312462ba4a29fb541e49ec2b41398e5be186932d4d"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 
07:12:32.564685 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"d8b3c8f44f2e233d211d7dc57ffed40c7ab6c7b15d021bbb35a7dbf134eff941"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564693 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"d3eb07996d43bedac1f1d1fa489d736d81b2c8f48876740c85c31fdcd4f49d77"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.564701 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"9c72d4dda83bcdd5aa991a2138d85cb2117515fc20f7262bda9d4cf7cbb24de8"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.573447 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.573860 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" containerName="nova-cell1-conductor-conductor" containerID="cri-o://05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4" gracePeriod=30 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.585022 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h5z8j"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.585560 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6c475475-916c-4267-8064-f932c04d0df2/ovsdbserver-nb/0.log" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.585638 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"6c475475-916c-4267-8064-f932c04d0df2","Type":"ContainerDied","Data":"1a2c780252bf9565c240dbb58e23c3b76dd581500932bae059cebded2f499d39"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.585669 4947 scope.go:117] "RemoveContainer" containerID="909ef5b644a4ef4623502b6cd9ac440ea6dcc840b5372f80ecaa44d612f7650b" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.585669 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.592002 4947 generic.go:334] "Generic (PLEG): container finished" podID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerID="97036c64cfae8886767f870b26986f84c7ae69a1a7110b3ff8a8e15f7e00dd75" exitCode=143 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.592144 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9a879185-9b9a-45a5-a211-c61faf308cbb","Type":"ContainerDied","Data":"97036c64cfae8886767f870b26986f84c7ae69a1a7110b3ff8a8e15f7e00dd75"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.594176 4947 generic.go:334] "Generic (PLEG): container finished" podID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerID="845cd1a32bd6e9532c9423654e785193c73bf6e3b1aea2e7e18eadd318301cec" exitCode=143 Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.594557 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774d6d4878-7x6tj" event={"ID":"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998","Type":"ContainerDied","Data":"845cd1a32bd6e9532c9423654e785193c73bf6e3b1aea2e7e18eadd318301cec"} Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.602220 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-h5z8j"] Dec 03 07:12:32 crc kubenswrapper[4947]: I1203 07:12:32.682165 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" 
podUID="51d7ef1d-a0bf-465f-baad-1bc3a71618ff" containerName="rabbitmq" containerID="cri-o://23a38b8a1f4cbbe4e0977fa37f88e153dc49ca4d190b8ca3be90add4e540bbe0" gracePeriod=604800 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.104602 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="000a9c13-4796-4ef4-ba6e-5e57e567dc57" path="/var/lib/kubelet/pods/000a9c13-4796-4ef4-ba6e-5e57e567dc57/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.105787 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58224663-92bc-4143-ad66-3ce51e606d86" path="/var/lib/kubelet/pods/58224663-92bc-4143-ad66-3ce51e606d86/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.106768 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72d77a00-5a8b-4ac6-a764-c558f079bd04" path="/var/lib/kubelet/pods/72d77a00-5a8b-4ac6-a764-c558f079bd04/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.108463 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c410b2-0cdc-45f3-b06d-c67fd543e76c" path="/var/lib/kubelet/pods/73c410b2-0cdc-45f3-b06d-c67fd543e76c/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.109314 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a182aa69-32cc-4496-a551-d5e0fafda4af" path="/var/lib/kubelet/pods/a182aa69-32cc-4496-a551-d5e0fafda4af/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.110196 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2da706f-4332-49a8-9c85-2d90186bd708" path="/var/lib/kubelet/pods/a2da706f-4332-49a8-9c85-2d90186bd708/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.111192 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e0b77d-5e1e-4a81-abad-d35da9b42aaa" path="/var/lib/kubelet/pods/b6e0b77d-5e1e-4a81-abad-d35da9b42aaa/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.112617 4947 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf" path="/var/lib/kubelet/pods/c50ee2c1-e6b6-45b8-8378-2c1f7a99bbdf/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.113399 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d069b8e8-8592-4fa1-a5e7-74d91e239c0a" path="/var/lib/kubelet/pods/d069b8e8-8592-4fa1-a5e7-74d91e239c0a/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.114309 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5883886-a832-4366-be78-449b4559e8d2" path="/var/lib/kubelet/pods/e5883886-a832-4366-be78-449b4559e8d2/volumes" Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.143938 4947 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.149080 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data podName:51d7ef1d-a0bf-465f-baad-1bc3a71618ff nodeName:}" failed. No retries permitted until 2025-12-03 07:12:35.14905846 +0000 UTC m=+1416.410012886 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data") pod "rabbitmq-server-0" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff") : configmap "rabbitmq-config-data" not found Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.181655 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "6c475475-916c-4267-8064-f932c04d0df2" (UID: "6c475475-916c-4267-8064-f932c04d0df2"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.193268 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6c475475-916c-4267-8064-f932c04d0df2" (UID: "6c475475-916c-4267-8064-f932c04d0df2"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.205635 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementbdf2-account-delete-qmjm2"] Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.221694 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.230865 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.240149 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican0363-account-delete-22cjb"] Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.242445 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e" 
cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.248062 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="ovn-northd" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.246023 4947 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.250066 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c475475-916c-4267-8064-f932c04d0df2-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.420171 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder0e6b-account-delete-kmbkl"] Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.449451 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance93c8-account-delete-j67wz"] Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.502003 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" containerName="galera" containerID="cri-o://a97cfb598af771fadb5726f8c4ca3b03aba98f58c7aa78e79de0f3b339e5fad7" gracePeriod=29 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.644065 4947 generic.go:334] "Generic (PLEG): container finished" podID="1f87802c-3846-486e-a131-39a7fe336c96" containerID="a6e1ead0fefc1c6dc26fdc188da65352aea353653dca0f86b588f1fdd21857ed" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.644091 4947 
generic.go:334] "Generic (PLEG): container finished" podID="1f87802c-3846-486e-a131-39a7fe336c96" containerID="1803fa6075e7e974f61cf6cd7ebcdef0211618286da42aad5c1687fe396a9e93" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.644131 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6ddbf97597-l6hz9" event={"ID":"1f87802c-3846-486e-a131-39a7fe336c96","Type":"ContainerDied","Data":"a6e1ead0fefc1c6dc26fdc188da65352aea353653dca0f86b588f1fdd21857ed"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.644155 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6ddbf97597-l6hz9" event={"ID":"1f87802c-3846-486e-a131-39a7fe336c96","Type":"ContainerDied","Data":"1803fa6075e7e974f61cf6cd7ebcdef0211618286da42aad5c1687fe396a9e93"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.646358 4947 generic.go:334] "Generic (PLEG): container finished" podID="b4a227e4-8c2a-4880-9944-877640627cd0" containerID="cdbab280e2ed4809a2997438db415a2fa098c89060067db08594d46673b283c3" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.646411 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" event={"ID":"b4a227e4-8c2a-4880-9944-877640627cd0","Type":"ContainerDied","Data":"cdbab280e2ed4809a2997438db415a2fa098c89060067db08594d46673b283c3"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.647830 4947 generic.go:334] "Generic (PLEG): container finished" podID="e4e9d6dc-e814-485d-842b-9266732c7924" containerID="0df9670d5726b37e2f523e33898d50880cf6beeea48f329f234fc69c23009757" exitCode=143 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.647869 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d779d8d8-6jl5c" event={"ID":"e4e9d6dc-e814-485d-842b-9266732c7924","Type":"ContainerDied","Data":"0df9670d5726b37e2f523e33898d50880cf6beeea48f329f234fc69c23009757"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 
07:12:33.649944 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_90b682d1-68e6-49a4-83a6-51b1b40b7e99/ovsdbserver-sb/0.log" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.650019 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"90b682d1-68e6-49a4-83a6-51b1b40b7e99","Type":"ContainerDied","Data":"579cbb8fdc4c6b9e04a1e0d47193bfce34fc2131c0319a5c4dd0313964672ec8"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.650036 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="579cbb8fdc4c6b9e04a1e0d47193bfce34fc2131c0319a5c4dd0313964672ec8" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.659462 4947 generic.go:334] "Generic (PLEG): container finished" podID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerID="aac3920ae7cd9e0b2c5f777e7aa7e5d8fbbea4f5ca8c93fe99023a44910f66a8" exitCode=143 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.659534 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" event={"ID":"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3","Type":"ContainerDied","Data":"aac3920ae7cd9e0b2c5f777e7aa7e5d8fbbea4f5ca8c93fe99023a44910f66a8"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.662549 4947 generic.go:334] "Generic (PLEG): container finished" podID="e80985eb-c6e0-4ffc-9b98-b1c92be266eb" containerID="7ddbee771d7d031c4a9d787606308301ec00a05893e9401d6d27e942db47b236" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.662602 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e80985eb-c6e0-4ffc-9b98-b1c92be266eb","Type":"ContainerDied","Data":"7ddbee771d7d031c4a9d787606308301ec00a05893e9401d6d27e942db47b236"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.686756 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron896b-account-delete-ddxm6"] 
Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.714466 4947 generic.go:334] "Generic (PLEG): container finished" podID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerID="208ecb2e4a9580e581c1630c83f89daf84de2499a774647eff59bb9623f3cf65" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.714659 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49caa6da-3c98-4c49-ab22-62121ff908cf","Type":"ContainerDied","Data":"208ecb2e4a9580e581c1630c83f89daf84de2499a774647eff59bb9623f3cf65"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.731350 4947 generic.go:334] "Generic (PLEG): container finished" podID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerID="e0dfb3c4add7777ef4ada7896099eaab6442a879387a5254d9c5547a18dee051" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.731380 4947 generic.go:334] "Generic (PLEG): container finished" podID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerID="73b7a501be5ba2d71f1c1dd2ceb4c2ccc770dcda00e913d4135212d4d5b47c39" exitCode=143 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.731421 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db58dc49-hq6px" event={"ID":"ed654f44-78c3-4118-83d5-e2a5d917c4f4","Type":"ContainerDied","Data":"e0dfb3c4add7777ef4ada7896099eaab6442a879387a5254d9c5547a18dee051"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.731446 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db58dc49-hq6px" event={"ID":"ed654f44-78c3-4118-83d5-e2a5d917c4f4","Type":"ContainerDied","Data":"73b7a501be5ba2d71f1c1dd2ceb4c2ccc770dcda00e913d4135212d4d5b47c39"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.733380 4947 generic.go:334] "Generic (PLEG): container finished" podID="98d05028-c68a-4438-afcd-161c4f974b08" containerID="1ce62f53e6a80cc9f688e02da11287f9cc62f54ad243c0af9a53ea71ef8a6f49" exitCode=143 Dec 03 07:12:33 crc 
kubenswrapper[4947]: I1203 07:12:33.733422 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98d05028-c68a-4438-afcd-161c4f974b08","Type":"ContainerDied","Data":"1ce62f53e6a80cc9f688e02da11287f9cc62f54ad243c0af9a53ea71ef8a6f49"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.735269 4947 generic.go:334] "Generic (PLEG): container finished" podID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerID="a1759e709aa0048498574524e67b384d6d64c42bed6642269b86d9285c67cdf0" exitCode=143 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.735313 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4","Type":"ContainerDied","Data":"a1759e709aa0048498574524e67b384d6d64c42bed6642269b86d9285c67cdf0"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.738167 4947 generic.go:334] "Generic (PLEG): container finished" podID="899d3d67-ec63-4d5f-ad93-c40003578347" containerID="96cdcb461d21fa3b9f6f2564295436608c7aad6f8b2a1e0421630332a70976f2" exitCode=143 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.738205 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899d3d67-ec63-4d5f-ad93-c40003578347","Type":"ContainerDied","Data":"96cdcb461d21fa3b9f6f2564295436608c7aad6f8b2a1e0421630332a70976f2"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.745540 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="f69a782b89a00f114b21b586623c5d8f0e73109af52bd1c6676ae4209fab1573" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.745567 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="66517c20f9bd5a79033d770b7fe6acd20f04d2ccb83413adddf9ce2d92f48b06" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.745574 4947 generic.go:334] 
"Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="d09ae4508992ad6594ab0cf49b18c9dde1d758bb9614d6152f7610d5543b8ba4" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.745580 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="f49bb955ad69cca770cf0f549937dcdd2e2098f40b0afb681932a0b678968068" exitCode=0 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.745612 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"f69a782b89a00f114b21b586623c5d8f0e73109af52bd1c6676ae4209fab1573"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.745651 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"66517c20f9bd5a79033d770b7fe6acd20f04d2ccb83413adddf9ce2d92f48b06"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.745678 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"d09ae4508992ad6594ab0cf49b18c9dde1d758bb9614d6152f7610d5543b8ba4"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.745687 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"f49bb955ad69cca770cf0f549937dcdd2e2098f40b0afb681932a0b678968068"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.749939 4947 generic.go:334] "Generic (PLEG): container finished" podID="bdb45354-43cd-41e7-a511-95357eb656e5" containerID="1661b453aa4675cf59e5985d9eef19957b032c863d92cb6f01e34413e202e848" exitCode=143 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.750019 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"bdb45354-43cd-41e7-a511-95357eb656e5","Type":"ContainerDied","Data":"1661b453aa4675cf59e5985d9eef19957b032c863d92cb6f01e34413e202e848"} Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.751351 4947 generic.go:334] "Generic (PLEG): container finished" podID="28892119-e165-46ea-a903-08207e491378" containerID="e942659fea577abdb40bc1687013b7988fc7bb6ef1815a0d789c1541253c9e6a" exitCode=137 Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.754025 4947 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 03 07:12:33 crc kubenswrapper[4947]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 03 07:12:33 crc kubenswrapper[4947]: + source /usr/local/bin/container-scripts/functions Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNBridge=br-int Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNRemote=tcp:localhost:6642 Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNEncapType=geneve Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNAvailabilityZones= Dec 03 07:12:33 crc kubenswrapper[4947]: ++ EnableChassisAsGateway=true Dec 03 07:12:33 crc kubenswrapper[4947]: ++ PhysicalNetworks= Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNHostName= Dec 03 07:12:33 crc kubenswrapper[4947]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 03 07:12:33 crc kubenswrapper[4947]: ++ ovs_dir=/var/lib/openvswitch Dec 03 07:12:33 crc kubenswrapper[4947]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 03 07:12:33 crc kubenswrapper[4947]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 03 07:12:33 crc kubenswrapper[4947]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + sleep 0.5 Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + sleep 0.5 Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + sleep 0.5 Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + cleanup_ovsdb_server_semaphore Dec 03 07:12:33 crc kubenswrapper[4947]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 07:12:33 crc kubenswrapper[4947]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 03 07:12:33 crc kubenswrapper[4947]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-lb94d" message=< Dec 03 07:12:33 crc kubenswrapper[4947]: Exiting ovsdb-server (5) [ OK ] Dec 03 07:12:33 crc kubenswrapper[4947]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 03 07:12:33 crc kubenswrapper[4947]: + source /usr/local/bin/container-scripts/functions Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNBridge=br-int Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNRemote=tcp:localhost:6642 Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNEncapType=geneve Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNAvailabilityZones= Dec 03 07:12:33 crc kubenswrapper[4947]: ++ EnableChassisAsGateway=true Dec 03 07:12:33 crc kubenswrapper[4947]: ++ PhysicalNetworks= Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNHostName= Dec 03 07:12:33 crc kubenswrapper[4947]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 03 07:12:33 crc kubenswrapper[4947]: ++ ovs_dir=/var/lib/openvswitch Dec 03 07:12:33 crc kubenswrapper[4947]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 03 
07:12:33 crc kubenswrapper[4947]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 03 07:12:33 crc kubenswrapper[4947]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + sleep 0.5 Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + sleep 0.5 Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + sleep 0.5 Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + cleanup_ovsdb_server_semaphore Dec 03 07:12:33 crc kubenswrapper[4947]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 07:12:33 crc kubenswrapper[4947]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 03 07:12:33 crc kubenswrapper[4947]: > Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.754063 4947 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 03 07:12:33 crc kubenswrapper[4947]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 03 07:12:33 crc kubenswrapper[4947]: + source /usr/local/bin/container-scripts/functions Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNBridge=br-int Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNRemote=tcp:localhost:6642 Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNEncapType=geneve Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNAvailabilityZones= Dec 03 07:12:33 crc kubenswrapper[4947]: ++ EnableChassisAsGateway=true Dec 03 07:12:33 crc kubenswrapper[4947]: ++ PhysicalNetworks= Dec 03 07:12:33 crc kubenswrapper[4947]: ++ OVNHostName= Dec 03 
07:12:33 crc kubenswrapper[4947]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 03 07:12:33 crc kubenswrapper[4947]: ++ ovs_dir=/var/lib/openvswitch Dec 03 07:12:33 crc kubenswrapper[4947]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 03 07:12:33 crc kubenswrapper[4947]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 03 07:12:33 crc kubenswrapper[4947]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + sleep 0.5 Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + sleep 0.5 Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + sleep 0.5 Dec 03 07:12:33 crc kubenswrapper[4947]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 07:12:33 crc kubenswrapper[4947]: + cleanup_ovsdb_server_semaphore Dec 03 07:12:33 crc kubenswrapper[4947]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 07:12:33 crc kubenswrapper[4947]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 03 07:12:33 crc kubenswrapper[4947]: > pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" containerID="cri-o://f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.754100 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" containerID="cri-o://f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" gracePeriod=28 Dec 03 07:12:33 crc kubenswrapper[4947]: W1203 
07:12:33.811108 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c259ef0_fa34_4ef1_a954_a15bd61ed120.slice/crio-9b33eb1e0fff543ba65dcf88a2708847c122e0faf1fea3f4daf790529fab53f9 WatchSource:0}: Error finding container 9b33eb1e0fff543ba65dcf88a2708847c122e0faf1fea3f4daf790529fab53f9: Status 404 returned error can't find the container with id 9b33eb1e0fff543ba65dcf88a2708847c122e0faf1fea3f4daf790529fab53f9 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.841804 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-cell1-novncproxy-0" podUID="e80985eb-c6e0-4ffc-9b98-b1c92be266eb" containerName="nova-cell1-novncproxy-novncproxy" probeResult="failure" output="Get \"https://10.217.0.197:6080/vnc_lite.html\": dial tcp 10.217.0.197:6080: connect: connection refused" Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881306 4947 projected.go:263] Couldn't get secret openstack/swift-proxy-config-data: secret "swift-proxy-config-data" not found Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881316 4947 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881328 4947 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881344 4947 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881355 4947 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6ddbf97597-l6hz9: [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881373 4947 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data podName:5367165f-75ec-4633-8042-edfe91e3be60 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:37.881356246 +0000 UTC m=+1419.142310672 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data") pod "rabbitmq-cell1-server-0" (UID: "5367165f-75ec-4633-8042-edfe91e3be60") : configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881393 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift podName:1f87802c-3846-486e-a131-39a7fe336c96 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:37.881379447 +0000 UTC m=+1419.142333873 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift") pod "swift-proxy-6ddbf97597-l6hz9" (UID: "1f87802c-3846-486e-a131-39a7fe336c96") : [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881428 4947 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881444 4947 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881470 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data podName:49caa6da-3c98-4c49-ab22-62121ff908cf nodeName:}" failed. 
No retries permitted until 2025-12-03 07:12:37.881459119 +0000 UTC m=+1419.142413625 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data") pod "cinder-scheduler-0" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf") : secret "cinder-config-data" not found Dec 03 07:12:33 crc kubenswrapper[4947]: E1203 07:12:33.881530 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts podName:49caa6da-3c98-4c49-ab22-62121ff908cf nodeName:}" failed. No retries permitted until 2025-12-03 07:12:37.881480399 +0000 UTC m=+1419.142434935 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts") pod "cinder-scheduler-0" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf") : secret "cinder-scripts" not found Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.892148 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" containerID="cri-o://3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" gracePeriod=28 Dec 03 07:12:33 crc kubenswrapper[4947]: I1203 07:12:33.903740 4947 scope.go:117] "RemoveContainer" containerID="759d1c8703ad59941336de48361002019ed4b13b124229b7732fe2bcf89eedf4" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.075114 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_90b682d1-68e6-49a4-83a6-51b1b40b7e99/ovsdbserver-sb/0.log" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.075206 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.086291 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-metrics-certs-tls-certs\") pod \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.086355 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdbserver-sb-tls-certs\") pod \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.086430 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-combined-ca-bundle\") pod \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.086464 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdb-rundir\") pod \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.086522 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54rrj\" (UniqueName: \"kubernetes.io/projected/90b682d1-68e6-49a4-83a6-51b1b40b7e99-kube-api-access-54rrj\") pod \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.086609 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-scripts\") pod \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.086633 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.086670 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-config\") pod \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\" (UID: \"90b682d1-68e6-49a4-83a6-51b1b40b7e99\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.087486 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "90b682d1-68e6-49a4-83a6-51b1b40b7e99" (UID: "90b682d1-68e6-49a4-83a6-51b1b40b7e99"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.091090 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-config" (OuterVolumeSpecName: "config") pod "90b682d1-68e6-49a4-83a6-51b1b40b7e99" (UID: "90b682d1-68e6-49a4-83a6-51b1b40b7e99"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.092913 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-scripts" (OuterVolumeSpecName: "scripts") pod "90b682d1-68e6-49a4-83a6-51b1b40b7e99" (UID: "90b682d1-68e6-49a4-83a6-51b1b40b7e99"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.095680 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.096527 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6cg4w_3496bac7-6b31-4ba8-a490-14bff1522b8c/openstack-network-exporter/0.log" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.096724 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.103870 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "90b682d1-68e6-49a4-83a6-51b1b40b7e99" (UID: "90b682d1-68e6-49a4-83a6-51b1b40b7e99"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.107177 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.116661 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90b682d1-68e6-49a4-83a6-51b1b40b7e99-kube-api-access-54rrj" (OuterVolumeSpecName: "kube-api-access-54rrj") pod "90b682d1-68e6-49a4-83a6-51b1b40b7e99" (UID: "90b682d1-68e6-49a4-83a6-51b1b40b7e99"). InnerVolumeSpecName "kube-api-access-54rrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.124741 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.188874 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-metrics-certs-tls-certs\") pod \"3496bac7-6b31-4ba8-a490-14bff1522b8c\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.188918 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl5hq\" (UniqueName: \"kubernetes.io/projected/28892119-e165-46ea-a903-08207e491378-kube-api-access-bl5hq\") pod \"28892119-e165-46ea-a903-08207e491378\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.188940 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-combined-ca-bundle\") pod \"28892119-e165-46ea-a903-08207e491378\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.189000 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/28892119-e165-46ea-a903-08207e491378-openstack-config\") pod \"28892119-e165-46ea-a903-08207e491378\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.189068 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovn-rundir\") pod \"3496bac7-6b31-4ba8-a490-14bff1522b8c\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.189116 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nts44\" (UniqueName: \"kubernetes.io/projected/3496bac7-6b31-4ba8-a490-14bff1522b8c-kube-api-access-nts44\") pod \"3496bac7-6b31-4ba8-a490-14bff1522b8c\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.189188 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-openstack-config-secret\") pod \"28892119-e165-46ea-a903-08207e491378\" (UID: \"28892119-e165-46ea-a903-08207e491378\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.189847 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-combined-ca-bundle\") pod \"3496bac7-6b31-4ba8-a490-14bff1522b8c\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.189922 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3496bac7-6b31-4ba8-a490-14bff1522b8c-config\") pod \"3496bac7-6b31-4ba8-a490-14bff1522b8c\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " Dec 03 07:12:34 crc kubenswrapper[4947]: 
I1203 07:12:34.189990 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovs-rundir\") pod \"3496bac7-6b31-4ba8-a490-14bff1522b8c\" (UID: \"3496bac7-6b31-4ba8-a490-14bff1522b8c\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.190075 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "3496bac7-6b31-4ba8-a490-14bff1522b8c" (UID: "3496bac7-6b31-4ba8-a490-14bff1522b8c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.190690 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.190704 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54rrj\" (UniqueName: \"kubernetes.io/projected/90b682d1-68e6-49a4-83a6-51b1b40b7e99-kube-api-access-54rrj\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.190715 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.190762 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.190772 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/90b682d1-68e6-49a4-83a6-51b1b40b7e99-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.190783 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.193652 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3496bac7-6b31-4ba8-a490-14bff1522b8c-config" (OuterVolumeSpecName: "config") pod "3496bac7-6b31-4ba8-a490-14bff1522b8c" (UID: "3496bac7-6b31-4ba8-a490-14bff1522b8c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.194674 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "3496bac7-6b31-4ba8-a490-14bff1522b8c" (UID: "3496bac7-6b31-4ba8-a490-14bff1522b8c"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.228205 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28892119-e165-46ea-a903-08207e491378-kube-api-access-bl5hq" (OuterVolumeSpecName: "kube-api-access-bl5hq") pod "28892119-e165-46ea-a903-08207e491378" (UID: "28892119-e165-46ea-a903-08207e491378"). InnerVolumeSpecName "kube-api-access-bl5hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.228643 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90b682d1-68e6-49a4-83a6-51b1b40b7e99" (UID: "90b682d1-68e6-49a4-83a6-51b1b40b7e99"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.228765 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3496bac7-6b31-4ba8-a490-14bff1522b8c-kube-api-access-nts44" (OuterVolumeSpecName: "kube-api-access-nts44") pod "3496bac7-6b31-4ba8-a490-14bff1522b8c" (UID: "3496bac7-6b31-4ba8-a490-14bff1522b8c"). InnerVolumeSpecName "kube-api-access-nts44". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.232397 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.270261 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28892119-e165-46ea-a903-08207e491378-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "28892119-e165-46ea-a903-08207e491378" (UID: "28892119-e165-46ea-a903-08207e491378"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.270557 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.285913 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28892119-e165-46ea-a903-08207e491378" (UID: "28892119-e165-46ea-a903-08207e491378"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.292388 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5thm\" (UniqueName: \"kubernetes.io/projected/b4a227e4-8c2a-4880-9944-877640627cd0-kube-api-access-p5thm\") pod \"b4a227e4-8c2a-4880-9944-877640627cd0\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.292474 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-config\") pod \"b4a227e4-8c2a-4880-9944-877640627cd0\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.292627 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-sb\") pod \"b4a227e4-8c2a-4880-9944-877640627cd0\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.292712 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-nb\") pod \"b4a227e4-8c2a-4880-9944-877640627cd0\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.292760 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-swift-storage-0\") pod \"b4a227e4-8c2a-4880-9944-877640627cd0\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.292828 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-svc\") pod \"b4a227e4-8c2a-4880-9944-877640627cd0\" (UID: \"b4a227e4-8c2a-4880-9944-877640627cd0\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.293364 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl5hq\" (UniqueName: \"kubernetes.io/projected/28892119-e165-46ea-a903-08207e491378-kube-api-access-bl5hq\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.293381 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.293390 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.293399 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/28892119-e165-46ea-a903-08207e491378-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 
07:12:34.293407 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nts44\" (UniqueName: \"kubernetes.io/projected/3496bac7-6b31-4ba8-a490-14bff1522b8c-kube-api-access-nts44\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.293417 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3496bac7-6b31-4ba8-a490-14bff1522b8c-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.293426 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.293435 4947 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3496bac7-6b31-4ba8-a490-14bff1522b8c-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.311831 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a227e4-8c2a-4880-9944-877640627cd0-kube-api-access-p5thm" (OuterVolumeSpecName: "kube-api-access-p5thm") pod "b4a227e4-8c2a-4880-9944-877640627cd0" (UID: "b4a227e4-8c2a-4880-9944-877640627cd0"). InnerVolumeSpecName "kube-api-access-p5thm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.397805 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5thm\" (UniqueName: \"kubernetes.io/projected/b4a227e4-8c2a-4880-9944-877640627cd0-kube-api-access-p5thm\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.399039 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.400133 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.400595 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.400624 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.404203 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.406864 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.422653 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.422724 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.466053 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of 05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4 is running failed: container process not found" containerID="05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.466309 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4 is running failed: container process not found" containerID="05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.466483 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4 is running failed: container process not found" containerID="05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 07:12:34 crc kubenswrapper[4947]: E1203 07:12:34.466518 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" containerName="nova-cell1-conductor-conductor" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.537357 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.583993 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.607889 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-combined-ca-bundle\") pod \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.607947 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed654f44-78c3-4118-83d5-e2a5d917c4f4-logs\") pod \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.607976 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-nova-novncproxy-tls-certs\") pod \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.607999 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-combined-ca-bundle\") pod \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.608027 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data\") pod \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.608050 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fmjvz\" (UniqueName: \"kubernetes.io/projected/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-kube-api-access-fmjvz\") pod \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.608166 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hff2z\" (UniqueName: \"kubernetes.io/projected/ed654f44-78c3-4118-83d5-e2a5d917c4f4-kube-api-access-hff2z\") pod \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.608240 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-vencrypt-tls-certs\") pod \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.608276 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data-custom\") pod \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\" (UID: \"ed654f44-78c3-4118-83d5-e2a5d917c4f4\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.608378 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-config-data\") pod \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\" (UID: \"e80985eb-c6e0-4ffc-9b98-b1c92be266eb\") " Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.610011 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed654f44-78c3-4118-83d5-e2a5d917c4f4-logs" (OuterVolumeSpecName: "logs") pod "ed654f44-78c3-4118-83d5-e2a5d917c4f4" (UID: "ed654f44-78c3-4118-83d5-e2a5d917c4f4"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.635938 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-kube-api-access-fmjvz" (OuterVolumeSpecName: "kube-api-access-fmjvz") pod "e80985eb-c6e0-4ffc-9b98-b1c92be266eb" (UID: "e80985eb-c6e0-4ffc-9b98-b1c92be266eb"). InnerVolumeSpecName "kube-api-access-fmjvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.684032 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed654f44-78c3-4118-83d5-e2a5d917c4f4-kube-api-access-hff2z" (OuterVolumeSpecName: "kube-api-access-hff2z") pod "ed654f44-78c3-4118-83d5-e2a5d917c4f4" (UID: "ed654f44-78c3-4118-83d5-e2a5d917c4f4"). InnerVolumeSpecName "kube-api-access-hff2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.689381 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ed654f44-78c3-4118-83d5-e2a5d917c4f4" (UID: "ed654f44-78c3-4118-83d5-e2a5d917c4f4"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.710424 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.710823 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed654f44-78c3-4118-83d5-e2a5d917c4f4-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.710925 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmjvz\" (UniqueName: \"kubernetes.io/projected/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-kube-api-access-fmjvz\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.710988 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hff2z\" (UniqueName: \"kubernetes.io/projected/ed654f44-78c3-4118-83d5-e2a5d917c4f4-kube-api-access-hff2z\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.779433 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5db58dc49-hq6px" event={"ID":"ed654f44-78c3-4118-83d5-e2a5d917c4f4","Type":"ContainerDied","Data":"72a7f211b5a75b4a185ba853cf9914e224dc0742880a659bebae781e9648aaf2"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.779532 4947 scope.go:117] "RemoveContainer" containerID="e0dfb3c4add7777ef4ada7896099eaab6442a879387a5254d9c5547a18dee051" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.779481 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5db58dc49-hq6px" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.781564 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron896b-account-delete-ddxm6" event={"ID":"e4b56c27-0089-46d0-8cc3-c5788833f135","Type":"ContainerStarted","Data":"362768caa5c7d95583b0b35cd93dccddd58e68fbc728c4967b010b78a7193957"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.785698 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6ddbf97597-l6hz9" event={"ID":"1f87802c-3846-486e-a131-39a7fe336c96","Type":"ContainerDied","Data":"4a3594817c6134092e02431f08ce019e2382e93c314324d8cb6c1e3448a88cdd"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.785728 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a3594817c6134092e02431f08ce019e2382e93c314324d8cb6c1e3448a88cdd" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.787074 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican0363-account-delete-22cjb" event={"ID":"44c0f438-8d06-433e-af68-49ccacd9a017","Type":"ContainerStarted","Data":"e9f5bb21882477cf0c9d62bd9aa094c8f2b5fe84275fe15e39ccd3e500a108e3"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.791408 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance93c8-account-delete-j67wz" event={"ID":"c6a94dde-b399-408b-93f2-488d02be7f07","Type":"ContainerStarted","Data":"1844d37f809640ea8effa099c546a30716c8ad0bbab318d8e046a3e937475817"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.791484 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance93c8-account-delete-j67wz" event={"ID":"c6a94dde-b399-408b-93f2-488d02be7f07","Type":"ContainerStarted","Data":"6f3e7195f5a72482222bd69507f214cd995af856e5e4d177929552e138e91ef6"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.795408 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerID="a809b23cd1a249dbc9331e612526f60ed5bdd18fa6472bcbf28960c85be485be" exitCode=0 Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.795464 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" event={"ID":"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3","Type":"ContainerDied","Data":"a809b23cd1a249dbc9331e612526f60ed5bdd18fa6472bcbf28960c85be485be"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.800790 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.800800 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9cbcb645-6xgx7" event={"ID":"b4a227e4-8c2a-4880-9944-877640627cd0","Type":"ContainerDied","Data":"c5bae347e6279d9024cb5005defb36680899b4dc6be37e8fdddd2e0da6c64681"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.802469 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6cg4w_3496bac7-6b31-4ba8-a490-14bff1522b8c/openstack-network-exporter/0.log" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.802590 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6cg4w" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.802642 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6cg4w" event={"ID":"3496bac7-6b31-4ba8-a490-14bff1522b8c","Type":"ContainerDied","Data":"315d6d274559f1e5b1ce6fbe348451f3e8c32240b91c6ac76e8b6663da1e470b"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.809971 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.815599 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance93c8-account-delete-j67wz" podStartSLOduration=4.815583508 podStartE2EDuration="4.815583508s" podCreationTimestamp="2025-12-03 07:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:12:34.807663893 +0000 UTC m=+1416.068618319" watchObservedRunningTime="2025-12-03 07:12:34.815583508 +0000 UTC m=+1416.076537924" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.823512 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder0e6b-account-delete-kmbkl" event={"ID":"8c259ef0-fa34-4ef1-a954-a15bd61ed120","Type":"ContainerStarted","Data":"9b33eb1e0fff543ba65dcf88a2708847c122e0faf1fea3f4daf790529fab53f9"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.825059 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementbdf2-account-delete-qmjm2" event={"ID":"1256396c-ee12-469e-864f-c87983516079","Type":"ContainerStarted","Data":"f02f49131c5887d2f979b608f73395a77c9847d69c592fe2dbe726e9c70b8309"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.828232 4947 generic.go:334] "Generic (PLEG): container finished" podID="3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" containerID="05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4" exitCode=0 Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.828299 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4","Type":"ContainerDied","Data":"05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.828331 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4","Type":"ContainerDied","Data":"b2676120afc69da3f98f532e81b7734ae53d0aa77b47c4fe30d93bbbb6ab2a0a"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.828354 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2676120afc69da3f98f532e81b7734ae53d0aa77b47c4fe30d93bbbb6ab2a0a" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.831129 4947 generic.go:334] "Generic (PLEG): container finished" podID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerID="d1c453acbdb699e13cb35f456f04fa20cb22c76c9d9304778639baf320f5cf98" exitCode=0 Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.831191 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dd7c69-hhnlg" event={"ID":"ccb28a2e-a946-4407-be07-6ac8eaad8ab1","Type":"ContainerDied","Data":"d1c453acbdb699e13cb35f456f04fa20cb22c76c9d9304778639baf320f5cf98"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.847453 4947 generic.go:334] "Generic (PLEG): container finished" podID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" exitCode=0 Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.847549 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lb94d" event={"ID":"7b86a5d2-1933-4a2f-97de-f3b49985fbf8","Type":"ContainerDied","Data":"f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.849587 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.849744 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e80985eb-c6e0-4ffc-9b98-b1c92be266eb","Type":"ContainerDied","Data":"2de63a9f7da24cf13720666eadfee51c31ec63a3d57b90de0b7a6a039cfb7ff6"} Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.850123 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.916159 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3496bac7-6b31-4ba8-a490-14bff1522b8c" (UID: "3496bac7-6b31-4ba8-a490-14bff1522b8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.938129 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapib13d-account-delete-k5vqn"] Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.947246 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-774d6d4878-7x6tj" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.150:8778/\": read tcp 10.217.0.2:45304->10.217.0.150:8778: read: connection reset by peer" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.947806 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-774d6d4878-7x6tj" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.150:8778/\": read tcp 10.217.0.2:45294->10.217.0.150:8778: read: connection reset by peer" Dec 03 07:12:34 crc kubenswrapper[4947]: I1203 07:12:34.958806 4947 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0c18d-account-delete-458q7"] Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.015719 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.138938 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "28892119-e165-46ea-a903-08207e491378" (UID: "28892119-e165-46ea-a903-08207e491378"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.145971 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3496bac7-6b31-4ba8-a490-14bff1522b8c" (UID: "3496bac7-6b31-4ba8-a490-14bff1522b8c"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.147704 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-config-data" (OuterVolumeSpecName: "config-data") pod "e80985eb-c6e0-4ffc-9b98-b1c92be266eb" (UID: "e80985eb-c6e0-4ffc-9b98-b1c92be266eb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.157125 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28892119-e165-46ea-a903-08207e491378" path="/var/lib/kubelet/pods/28892119-e165-46ea-a903-08207e491378/volumes" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.157911 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c475475-916c-4267-8064-f932c04d0df2" path="/var/lib/kubelet/pods/6c475475-916c-4267-8064-f932c04d0df2/volumes" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.168209 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b4a227e4-8c2a-4880-9944-877640627cd0" (UID: "b4a227e4-8c2a-4880-9944-877640627cd0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.168584 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b4a227e4-8c2a-4880-9944-877640627cd0" (UID: "b4a227e4-8c2a-4880-9944-877640627cd0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.199712 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed654f44-78c3-4118-83d5-e2a5d917c4f4" (UID: "ed654f44-78c3-4118-83d5-e2a5d917c4f4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.216071 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-config" (OuterVolumeSpecName: "config") pod "b4a227e4-8c2a-4880-9944-877640627cd0" (UID: "b4a227e4-8c2a-4880-9944-877640627cd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.216564 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "90b682d1-68e6-49a4-83a6-51b1b40b7e99" (UID: "90b682d1-68e6-49a4-83a6-51b1b40b7e99"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.224832 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.165:8776/healthcheck\": read tcp 10.217.0.2:38922->10.217.0.165:8776: read: connection reset by peer" Dec 03 07:12:35 crc kubenswrapper[4947]: E1203 07:12:35.228307 4947 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.228329 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.228348 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.228358 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: E1203 07:12:35.228380 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data podName:51d7ef1d-a0bf-465f-baad-1bc3a71618ff nodeName:}" failed. No retries permitted until 2025-12-03 07:12:39.228363113 +0000 UTC m=+1420.489317529 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data") pod "rabbitmq-server-0" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff") : configmap "rabbitmq-config-data" not found Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.228411 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/28892119-e165-46ea-a903-08207e491378-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.228425 4947 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.228436 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.228446 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-dns-svc\") on 
node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.228458 4947 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3496bac7-6b31-4ba8-a490-14bff1522b8c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.232470 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e80985eb-c6e0-4ffc-9b98-b1c92be266eb" (UID: "e80985eb-c6e0-4ffc-9b98-b1c92be266eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: E1203 07:12:35.260886 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa is running failed: container process not found" containerID="d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:12:35 crc kubenswrapper[4947]: E1203 07:12:35.263475 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa is running failed: container process not found" containerID="d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:12:35 crc kubenswrapper[4947]: E1203 07:12:35.264422 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa is running failed: container 
process not found" containerID="d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 07:12:35 crc kubenswrapper[4947]: E1203 07:12:35.264451 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3735b7db-e9a7-4be6-9c74-cad0131f2c0b" containerName="nova-scheduler-scheduler" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.271347 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b4a227e4-8c2a-4880-9944-877640627cd0" (UID: "b4a227e4-8c2a-4880-9944-877640627cd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.299034 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b4a227e4-8c2a-4880-9944-877640627cd0" (UID: "b4a227e4-8c2a-4880-9944-877640627cd0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.315441 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "e80985eb-c6e0-4ffc-9b98-b1c92be266eb" (UID: "e80985eb-c6e0-4ffc-9b98-b1c92be266eb"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.318079 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "e80985eb-c6e0-4ffc-9b98-b1c92be266eb" (UID: "e80985eb-c6e0-4ffc-9b98-b1c92be266eb"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.333091 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.333126 4947 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.333137 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.333145 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b4a227e4-8c2a-4880-9944-877640627cd0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.333154 4947 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e80985eb-c6e0-4ffc-9b98-b1c92be266eb-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.362806 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "90b682d1-68e6-49a4-83a6-51b1b40b7e99" (UID: "90b682d1-68e6-49a4-83a6-51b1b40b7e99"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.427191 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data" (OuterVolumeSpecName: "config-data") pod "ed654f44-78c3-4118-83d5-e2a5d917c4f4" (UID: "ed654f44-78c3-4118-83d5-e2a5d917c4f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.439166 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed654f44-78c3-4118-83d5-e2a5d917c4f4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.439204 4947 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/90b682d1-68e6-49a4-83a6-51b1b40b7e99-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.506681 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:39592->10.217.0.203:8775: read: connection reset by peer" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.507764 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 
10.217.0.2:39584->10.217.0.203:8775: read: connection reset by peer" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.590168 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59d779d8d8-6jl5c" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:50788->10.217.0.164:9311: read: connection reset by peer" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.590181 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-59d779d8d8-6jl5c" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:50802->10.217.0.164:9311: read: connection reset by peer" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.603938 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7dd7c69-hhnlg" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": dial tcp 10.217.0.155:9696: connect: connection refused" Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.906520 4947 generic.go:334] "Generic (PLEG): container finished" podID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerID="737f46e0f99576fb923247d82fc1eef29b5d49123f805ead5797b64495cf9e63" exitCode=0 Dec 03 07:12:35 crc kubenswrapper[4947]: I1203 07:12:35.992765 4947 generic.go:334] "Generic (PLEG): container finished" podID="e4e9d6dc-e814-485d-842b-9266732c7924" containerID="c60ae6e938e54908332dbaa6ddd689e443ba56a7fa23dbd3d3249373512ca680" exitCode=0 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.043459 4947 generic.go:334] "Generic (PLEG): container finished" podID="fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" containerID="a97cfb598af771fadb5726f8c4ca3b03aba98f58c7aa78e79de0f3b339e5fad7" exitCode=0 Dec 03 
07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.094507 4947 generic.go:334] "Generic (PLEG): container finished" podID="44c0f438-8d06-433e-af68-49ccacd9a017" containerID="81c2ad2c09c8e03c68216f5fbecb08e40fa8cf252b49a0930e549c7e3e04e959" exitCode=0 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.112934 4947 generic.go:334] "Generic (PLEG): container finished" podID="1256396c-ee12-469e-864f-c87983516079" containerID="099259ef27c1dbed4aec87e28da3485dccbf8c02f2d379b599c19fcda6e2c957" exitCode=0 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.114695 4947 generic.go:334] "Generic (PLEG): container finished" podID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerID="64b3099ce4ae64f657b3ba6a9d2a21ac15dc33c2c3dddccc593d4d83045820bf" exitCode=0 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.117450 4947 generic.go:334] "Generic (PLEG): container finished" podID="899d3d67-ec63-4d5f-ad93-c40003578347" containerID="a8b3ef7eb0ba7d8af170edb23f786fe99d24451a36fd652d4dbc94d3f46220ba" exitCode=0 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.130324 4947 generic.go:334] "Generic (PLEG): container finished" podID="c6a94dde-b399-408b-93f2-488d02be7f07" containerID="1844d37f809640ea8effa099c546a30716c8ad0bbab318d8e046a3e937475817" exitCode=0 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.161834 4947 generic.go:334] "Generic (PLEG): container finished" podID="3735b7db-e9a7-4be6-9c74-cad0131f2c0b" containerID="d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa" exitCode=0 Dec 03 07:12:36 crc kubenswrapper[4947]: E1203 07:12:36.169556 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="334c2441513d95f243b52e76544d548f66b710f9a5c7ddb3358d9df97ecd5f57" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 07:12:36 crc kubenswrapper[4947]: E1203 07:12:36.180368 4947 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="334c2441513d95f243b52e76544d548f66b710f9a5c7ddb3358d9df97ecd5f57" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 07:12:36 crc kubenswrapper[4947]: E1203 07:12:36.187710 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="334c2441513d95f243b52e76544d548f66b710f9a5c7ddb3358d9df97ecd5f57" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 07:12:36 crc kubenswrapper[4947]: E1203 07:12:36.187773 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="9f0a57ca-d063-4d27-ac54-f5431cca2971" containerName="nova-cell0-conductor-conductor" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.187935 4947 generic.go:334] "Generic (PLEG): container finished" podID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerID="ce7f8de092c5631abd0150a07a948c24b407959e654aa1d9ce964ed828bb05fa" exitCode=0 Dec 03 07:12:36 crc kubenswrapper[4947]: E1203 07:12:36.227565 4947 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.146s" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227676 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9a879185-9b9a-45a5-a211-c61faf308cbb","Type":"ContainerDied","Data":"737f46e0f99576fb923247d82fc1eef29b5d49123f805ead5797b64495cf9e63"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227699 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"9a879185-9b9a-45a5-a211-c61faf308cbb","Type":"ContainerDied","Data":"4ae36dfda6ddd253abc4dbf1ff3fc0ae9aa5dbafaec8e55c9f72c05188ef29b9"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227708 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ae36dfda6ddd253abc4dbf1ff3fc0ae9aa5dbafaec8e55c9f72c05188ef29b9" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227717 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapib13d-account-delete-k5vqn" event={"ID":"62cff038-cc99-44ca-ba48-a05d16a96b26","Type":"ContainerStarted","Data":"15df791074ddfa5b8c46cf981fc2125134f8625a76d9d6cb4e5cc3fc4f7549e7"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227727 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d779d8d8-6jl5c" event={"ID":"e4e9d6dc-e814-485d-842b-9266732c7924","Type":"ContainerDied","Data":"c60ae6e938e54908332dbaa6ddd689e443ba56a7fa23dbd3d3249373512ca680"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227738 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8","Type":"ContainerDied","Data":"a97cfb598af771fadb5726f8c4ca3b03aba98f58c7aa78e79de0f3b339e5fad7"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227748 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8","Type":"ContainerDied","Data":"830c08616b73bd8b530b0cc2fc5c5768679ded583b14b43ef2e68f803802520a"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227756 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="830c08616b73bd8b530b0cc2fc5c5768679ded583b14b43ef2e68f803802520a" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227763 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/novacell0c18d-account-delete-458q7" event={"ID":"8c562c87-9f66-447f-83ec-05165c95ca25","Type":"ContainerStarted","Data":"ecba32580078a52bc491b554de34924f758e633ada59f370f7dda93c895da862"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227773 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" event={"ID":"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3","Type":"ContainerDied","Data":"8b7773436f1b032bb60a82d8c350c484575d97ff3f911f3e0eaad7489225d38f"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227785 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b7773436f1b032bb60a82d8c350c484575d97ff3f911f3e0eaad7489225d38f" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227793 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican0363-account-delete-22cjb" event={"ID":"44c0f438-8d06-433e-af68-49ccacd9a017","Type":"ContainerDied","Data":"81c2ad2c09c8e03c68216f5fbecb08e40fa8cf252b49a0930e549c7e3e04e959"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227802 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementbdf2-account-delete-qmjm2" event={"ID":"1256396c-ee12-469e-864f-c87983516079","Type":"ContainerDied","Data":"099259ef27c1dbed4aec87e28da3485dccbf8c02f2d379b599c19fcda6e2c957"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227811 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4","Type":"ContainerDied","Data":"64b3099ce4ae64f657b3ba6a9d2a21ac15dc33c2c3dddccc593d4d83045820bf"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227823 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899d3d67-ec63-4d5f-ad93-c40003578347","Type":"ContainerDied","Data":"a8b3ef7eb0ba7d8af170edb23f786fe99d24451a36fd652d4dbc94d3f46220ba"} Dec 
03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227833 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance93c8-account-delete-j67wz" event={"ID":"c6a94dde-b399-408b-93f2-488d02be7f07","Type":"ContainerDied","Data":"1844d37f809640ea8effa099c546a30716c8ad0bbab318d8e046a3e937475817"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227851 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3735b7db-e9a7-4be6-9c74-cad0131f2c0b","Type":"ContainerDied","Data":"d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227862 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3735b7db-e9a7-4be6-9c74-cad0131f2c0b","Type":"ContainerDied","Data":"dfc7a7d790de3be7195a0e499989a47fad20dcf7e377c93900a96c4b9f7ad988"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227870 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfc7a7d790de3be7195a0e499989a47fad20dcf7e377c93900a96c4b9f7ad988" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227877 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774d6d4878-7x6tj" event={"ID":"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998","Type":"ContainerDied","Data":"ce7f8de092c5631abd0150a07a948c24b407959e654aa1d9ce964ed828bb05fa"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227890 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-774d6d4878-7x6tj" event={"ID":"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998","Type":"ContainerDied","Data":"745d4286ac2a70e6f05d7d5b44c411083a41f923a1acaec569d1c69ef6935d05"} Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.227898 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="745d4286ac2a70e6f05d7d5b44c411083a41f923a1acaec569d1c69ef6935d05" Dec 03 07:12:36 crc 
kubenswrapper[4947]: I1203 07:12:36.290967 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.291206 4947 scope.go:117] "RemoveContainer" containerID="73b7a501be5ba2d71f1c1dd2ceb4c2ccc770dcda00e913d4135212d4d5b47c39" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.291396 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="ceilometer-central-agent" containerID="cri-o://f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01" gracePeriod=30 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.291627 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="proxy-httpd" containerID="cri-o://f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d" gracePeriod=30 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.291688 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="sg-core" containerID="cri-o://1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032" gracePeriod=30 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.291748 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="ceilometer-notification-agent" containerID="cri-o://f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014" gracePeriod=30 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.332877 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.333336 4947 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/kube-state-metrics-0" podUID="90736cf6-0db1-44a8-b285-4d319f0951f8" containerName="kube-state-metrics" containerID="cri-o://3fe0141e6c7dfad5f3686eb0461017dbced765ef02bc95a048da886e97ae70e3" gracePeriod=30 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.350168 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.350443 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="add50932-e8ea-4e7a-ab75-6fb1e0463499" containerName="memcached" containerID="cri-o://263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70" gracePeriod=30 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.445548 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6lcmv"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.462551 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6lcmv"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.477880 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ms4p8"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.490616 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ms4p8"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.509321 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.511773 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c779fbf97-r9gnn"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.512035 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-c779fbf97-r9gnn" podUID="acfeae50-8757-4baf-a16a-c33ae100fdf2" containerName="keystone-api" containerID="cri-o://36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668" gracePeriod=30 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.539988 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.555350 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7pbrr"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.556573 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-7pbrr"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.566260 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f419-account-create-update-9zkp8"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.579000 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f419-account-create-update-9zkp8"] Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.695424 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-kube-api-access-stpzs\") pod \"1f87802c-3846-486e-a131-39a7fe336c96\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.695787 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-config-data\") pod \"1f87802c-3846-486e-a131-39a7fe336c96\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.695859 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift\") pod \"1f87802c-3846-486e-a131-39a7fe336c96\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.695904 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-combined-ca-bundle\") pod \"1f87802c-3846-486e-a131-39a7fe336c96\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.695979 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-internal-tls-certs\") pod \"1f87802c-3846-486e-a131-39a7fe336c96\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.696009 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-run-httpd\") pod \"1f87802c-3846-486e-a131-39a7fe336c96\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.696028 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-log-httpd\") pod \"1f87802c-3846-486e-a131-39a7fe336c96\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.696052 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-public-tls-certs\") pod \"1f87802c-3846-486e-a131-39a7fe336c96\" (UID: \"1f87802c-3846-486e-a131-39a7fe336c96\") " Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.704625 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1f87802c-3846-486e-a131-39a7fe336c96" (UID: "1f87802c-3846-486e-a131-39a7fe336c96"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.705178 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1f87802c-3846-486e-a131-39a7fe336c96" (UID: "1f87802c-3846-486e-a131-39a7fe336c96"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.705685 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-kube-api-access-stpzs" (OuterVolumeSpecName: "kube-api-access-stpzs") pod "1f87802c-3846-486e-a131-39a7fe336c96" (UID: "1f87802c-3846-486e-a131-39a7fe336c96"). InnerVolumeSpecName "kube-api-access-stpzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.733603 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1f87802c-3846-486e-a131-39a7fe336c96" (UID: "1f87802c-3846-486e-a131-39a7fe336c96"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:36 crc kubenswrapper[4947]: E1203 07:12:36.770539 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dca7534_e6d7_4cd9_88bb_4d8f7dff75d3.slice/crio-conmon-a809b23cd1a249dbc9331e612526f60ed5bdd18fa6472bcbf28960c85be485be.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dca7534_e6d7_4cd9_88bb_4d8f7dff75d3.slice/crio-a809b23cd1a249dbc9331e612526f60ed5bdd18fa6472bcbf28960c85be485be.scope\": RecentStats: unable to find data in memory cache]" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.802563 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-config-data" (OuterVolumeSpecName: "config-data") pod "1f87802c-3846-486e-a131-39a7fe336c96" (UID: "1f87802c-3846-486e-a131-39a7fe336c96"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.803144 4947 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.803165 4947 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1f87802c-3846-486e-a131-39a7fe336c96-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.803175 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stpzs\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-kube-api-access-stpzs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.803184 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.803192 4947 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1f87802c-3846-486e-a131-39a7fe336c96-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.812548 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f87802c-3846-486e-a131-39a7fe336c96" (UID: "1f87802c-3846-486e-a131-39a7fe336c96"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.826808 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1f87802c-3846-486e-a131-39a7fe336c96" (UID: "1f87802c-3846-486e-a131-39a7fe336c96"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.838359 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1f87802c-3846-486e-a131-39a7fe336c96" (UID: "1f87802c-3846-486e-a131-39a7fe336c96"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.859932 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" containerName="galera" containerID="cri-o://58133114584f0b390c627b1710fda7f019565ce9036c5e41a7ab89f063b5168d" gracePeriod=30 Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.905723 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.905757 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:36 crc kubenswrapper[4947]: I1203 07:12:36.905766 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1f87802c-3846-486e-a131-39a7fe336c96-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.099361 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14e18665-c444-4a96-962a-cecea35695b1" path="/var/lib/kubelet/pods/14e18665-c444-4a96-962a-cecea35695b1/volumes" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.100937 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="883ad48f-95eb-48b1-932b-98e145d203ab" path="/var/lib/kubelet/pods/883ad48f-95eb-48b1-932b-98e145d203ab/volumes" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.101735 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a19d5562-e925-4567-94d0-001807fda043" path="/var/lib/kubelet/pods/a19d5562-e925-4567-94d0-001807fda043/volumes" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.102595 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b720d360-8afc-419d-b5e2-49259161a9ea" path="/var/lib/kubelet/pods/b720d360-8afc-419d-b5e2-49259161a9ea/volumes" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.197321 4947 scope.go:117] "RemoveContainer" containerID="cdbab280e2ed4809a2997438db415a2fa098c89060067db08594d46673b283c3" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.237158 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.270532 4947 generic.go:334] "Generic (PLEG): container finished" podID="bdb45354-43cd-41e7-a511-95357eb656e5" containerID="7a42de092f2c4832987606718c90461323cdaca36bc998fcfa88de29b59c873d" exitCode=0 Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.270589 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdb45354-43cd-41e7-a511-95357eb656e5","Type":"ContainerDied","Data":"7a42de092f2c4832987606718c90461323cdaca36bc998fcfa88de29b59c873d"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.270617 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdb45354-43cd-41e7-a511-95357eb656e5","Type":"ContainerDied","Data":"6e1b04e5cf8ca5937521ad7d5877ba5cd9a40faaa2d794fbaada6378c65d27c3"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.270627 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e1b04e5cf8ca5937521ad7d5877ba5cd9a40faaa2d794fbaada6378c65d27c3" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.275737 4947 scope.go:117] "RemoveContainer" containerID="9903cf2e6781315a233e9ee018d5a8e37a88e216dc112a5608259f1c8c2f8932" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.281957 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.298188 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59d779d8d8-6jl5c" event={"ID":"e4e9d6dc-e814-485d-842b-9266732c7924","Type":"ContainerDied","Data":"61e0344adc9512703ce97344b0be8ee9daf603e20b56691300046d1f21604dc5"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.298236 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61e0344adc9512703ce97344b0be8ee9daf603e20b56691300046d1f21604dc5" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.306056 4947 generic.go:334] "Generic (PLEG): container finished" podID="9f0a57ca-d063-4d27-ac54-f5431cca2971" containerID="334c2441513d95f243b52e76544d548f66b710f9a5c7ddb3358d9df97ecd5f57" exitCode=0 Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.306144 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9f0a57ca-d063-4d27-ac54-f5431cca2971","Type":"ContainerDied","Data":"334c2441513d95f243b52e76544d548f66b710f9a5c7ddb3358d9df97ecd5f57"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.307910 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.320538 4947 generic.go:334] "Generic (PLEG): container finished" podID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerID="23fd79e84a8a0d20a65d598b706fa955381b97ec7814c70275bf4b3eb633dcce" exitCode=0 Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.320942 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49caa6da-3c98-4c49-ab22-62121ff908cf","Type":"ContainerDied","Data":"23fd79e84a8a0d20a65d598b706fa955381b97ec7814c70275bf4b3eb633dcce"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.328133 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.345993 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4","Type":"ContainerDied","Data":"3eebaeb6a06911e5c12aced0d174869aeed0c91e99a4115f55a2036d926368db"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.346160 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eebaeb6a06911e5c12aced0d174869aeed0c91e99a4115f55a2036d926368db" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.354420 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.354761 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"899d3d67-ec63-4d5f-ad93-c40003578347","Type":"ContainerDied","Data":"37b9c944600f7edad3c33294e138cb44fc8c7f386c32f123235cf07899642232"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.354789 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37b9c944600f7edad3c33294e138cb44fc8c7f386c32f123235cf07899642232" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.354939 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.359012 4947 generic.go:334] "Generic (PLEG): container finished" podID="90736cf6-0db1-44a8-b285-4d319f0951f8" containerID="3fe0141e6c7dfad5f3686eb0461017dbced765ef02bc95a048da886e97ae70e3" exitCode=2 Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.359100 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"90736cf6-0db1-44a8-b285-4d319f0951f8","Type":"ContainerDied","Data":"3fe0141e6c7dfad5f3686eb0461017dbced765ef02bc95a048da886e97ae70e3"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.359127 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"90736cf6-0db1-44a8-b285-4d319f0951f8","Type":"ContainerDied","Data":"316b25cd33a810b4ce65bf015ec2ce9702d7d22b0a407e7f649a18f9f2d17e0e"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.359139 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="316b25cd33a810b4ce65bf015ec2ce9702d7d22b0a407e7f649a18f9f2d17e0e" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.362789 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.369686 4947 generic.go:334] "Generic (PLEG): container finished" podID="33bdabb7-a612-499f-855c-74da636d845a" containerID="f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d" exitCode=0 Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.369723 4947 generic.go:334] "Generic (PLEG): container finished" podID="33bdabb7-a612-499f-855c-74da636d845a" containerID="1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032" exitCode=2 Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.369731 4947 generic.go:334] "Generic (PLEG): container finished" podID="33bdabb7-a612-499f-855c-74da636d845a" containerID="f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01" exitCode=0 Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.369819 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6ddbf97597-l6hz9" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.369868 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerDied","Data":"f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.369893 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerDied","Data":"1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.369902 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerDied","Data":"f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01"} Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.369696 4947 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.387713 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.399160 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.401014 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.412716 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.413837 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415002 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-internal-tls-certs\") pod \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415054 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-config-data\") pod \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415079 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-logs\") pod \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\" (UID: 
\"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415119 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzwvr\" (UniqueName: \"kubernetes.io/projected/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-kube-api-access-qzwvr\") pod \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415142 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-public-tls-certs\") pod \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415186 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-combined-ca-bundle\") pod \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415215 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqvl4\" (UniqueName: \"kubernetes.io/projected/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-kube-api-access-mqvl4\") pod \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415267 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-scripts\") pod \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415314 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data\") pod \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415344 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-logs\") pod \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415363 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-combined-ca-bundle\") pod \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415388 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh77l\" (UniqueName: \"kubernetes.io/projected/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-kube-api-access-fh77l\") pod \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415419 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-combined-ca-bundle\") pod \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415436 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r6kr\" (UniqueName: \"kubernetes.io/projected/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-kube-api-access-9r6kr\") pod \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\" (UID: \"3735b7db-e9a7-4be6-9c74-cad0131f2c0b\") " Dec 03 07:12:37 crc 
kubenswrapper[4947]: I1203 07:12:37.415455 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-config-data\") pod \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\" (UID: \"3b4b931b-69ee-4ff2-b01a-85d45fc93ec4\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415480 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-config-data\") pod \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415548 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-combined-ca-bundle\") pod \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\" (UID: \"1c3c47d5-30e8-4c5f-93fe-e0d944cdc998\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.415572 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data-custom\") pod \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\" (UID: \"1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3\") " Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.420735 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-logs" (OuterVolumeSpecName: "logs") pod "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" (UID: "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.428224 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-logs" (OuterVolumeSpecName: "logs") pod "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" (UID: "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.434947 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-kube-api-access-mqvl4" (OuterVolumeSpecName: "kube-api-access-mqvl4") pod "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" (UID: "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3"). InnerVolumeSpecName "kube-api-access-mqvl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.435022 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-6cg4w"] Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.438338 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" (UID: "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.438464 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-kube-api-access-9r6kr" (OuterVolumeSpecName: "kube-api-access-9r6kr") pod "3735b7db-e9a7-4be6-9c74-cad0131f2c0b" (UID: "3735b7db-e9a7-4be6-9c74-cad0131f2c0b"). InnerVolumeSpecName "kube-api-access-9r6kr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.445395 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-6cg4w"] Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.452106 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-6xgx7"] Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.462211 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9cbcb645-6xgx7"] Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.462294 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-kube-api-access-qzwvr" (OuterVolumeSpecName: "kube-api-access-qzwvr") pod "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" (UID: "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998"). InnerVolumeSpecName "kube-api-access-qzwvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.463758 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.465212 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-scripts" (OuterVolumeSpecName: "scripts") pod "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" (UID: "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.473754 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5db58dc49-hq6px"] Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.482025 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-kube-api-access-fh77l" (OuterVolumeSpecName: "kube-api-access-fh77l") pod "3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" (UID: "3b4b931b-69ee-4ff2-b01a-85d45fc93ec4"). InnerVolumeSpecName "kube-api-access-fh77l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.485398 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:12:37 crc kubenswrapper[4947]: I1203 07:12:37.490743 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5db58dc49-hq6px"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.504616 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.504658 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.520987 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-httpd-run\") pod \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521037 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a879185-9b9a-45a5-a211-c61faf308cbb-etc-machine-id\") pod \"9a879185-9b9a-45a5-a211-c61faf308cbb\" 
(UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521063 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-combined-ca-bundle\") pod \"90736cf6-0db1-44a8-b285-4d319f0951f8\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521090 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-combined-ca-bundle\") pod \"899d3d67-ec63-4d5f-ad93-c40003578347\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521110 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-combined-ca-bundle\") pod \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521141 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-internal-tls-certs\") pod \"e4e9d6dc-e814-485d-842b-9266732c7924\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521179 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521208 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9a879185-9b9a-45a5-a211-c61faf308cbb-logs\") pod \"9a879185-9b9a-45a5-a211-c61faf308cbb\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521228 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-config-data\") pod \"bdb45354-43cd-41e7-a511-95357eb656e5\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521247 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kolla-config\") pod \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521280 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data-custom\") pod \"e4e9d6dc-e814-485d-842b-9266732c7924\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521298 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-generated\") pod \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521317 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-combined-ca-bundle\") pod \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 
07:12:37.521340 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-combined-ca-bundle\") pod \"9a879185-9b9a-45a5-a211-c61faf308cbb\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521359 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnv2p\" (UniqueName: \"kubernetes.io/projected/899d3d67-ec63-4d5f-ad93-c40003578347-kube-api-access-jnv2p\") pod \"899d3d67-ec63-4d5f-ad93-c40003578347\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521398 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-scripts\") pod \"9a879185-9b9a-45a5-a211-c61faf308cbb\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521415 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-galera-tls-certs\") pod \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521443 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e9d6dc-e814-485d-842b-9266732c7924-logs\") pod \"e4e9d6dc-e814-485d-842b-9266732c7924\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521465 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-combined-ca-bundle\") pod 
\"bdb45354-43cd-41e7-a511-95357eb656e5\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521510 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvkdq\" (UniqueName: \"kubernetes.io/projected/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kube-api-access-zvkdq\") pod \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521540 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data-custom\") pod \"9a879185-9b9a-45a5-a211-c61faf308cbb\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521576 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-certs\") pod \"90736cf6-0db1-44a8-b285-4d319f0951f8\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521592 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-public-tls-certs\") pod \"bdb45354-43cd-41e7-a511-95357eb656e5\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521606 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-public-tls-certs\") pod \"e4e9d6dc-e814-485d-842b-9266732c7924\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521622 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-config-data\") pod \"899d3d67-ec63-4d5f-ad93-c40003578347\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521643 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-httpd-run\") pod \"bdb45354-43cd-41e7-a511-95357eb656e5\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521663 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9xrb\" (UniqueName: \"kubernetes.io/projected/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-kube-api-access-j9xrb\") pod \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521680 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521698 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-config\") pod \"90736cf6-0db1-44a8-b285-4d319f0951f8\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521712 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"bdb45354-43cd-41e7-a511-95357eb656e5\" (UID: 
\"bdb45354-43cd-41e7-a511-95357eb656e5\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521750 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data\") pod \"e4e9d6dc-e814-485d-842b-9266732c7924\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521766 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-public-tls-certs\") pod \"9a879185-9b9a-45a5-a211-c61faf308cbb\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521785 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdgcj\" (UniqueName: \"kubernetes.io/projected/bdb45354-43cd-41e7-a511-95357eb656e5-kube-api-access-zdgcj\") pod \"bdb45354-43cd-41e7-a511-95357eb656e5\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521808 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-combined-ca-bundle\") pod \"e4e9d6dc-e814-485d-842b-9266732c7924\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521834 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-scripts\") pod \"bdb45354-43cd-41e7-a511-95357eb656e5\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521854 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-logs\") pod \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521876 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqk9b\" (UniqueName: \"kubernetes.io/projected/e4e9d6dc-e814-485d-842b-9266732c7924-kube-api-access-lqk9b\") pod \"e4e9d6dc-e814-485d-842b-9266732c7924\" (UID: \"e4e9d6dc-e814-485d-842b-9266732c7924\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521910 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-logs\") pod \"bdb45354-43cd-41e7-a511-95357eb656e5\" (UID: \"bdb45354-43cd-41e7-a511-95357eb656e5\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521931 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899d3d67-ec63-4d5f-ad93-c40003578347-logs\") pod \"899d3d67-ec63-4d5f-ad93-c40003578347\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521954 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5j9j5\" (UniqueName: \"kubernetes.io/projected/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-api-access-5j9j5\") pod \"90736cf6-0db1-44a8-b285-4d319f0951f8\" (UID: \"90736cf6-0db1-44a8-b285-4d319f0951f8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.521980 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-operator-scripts\") pod \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522023 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data\") pod \"9a879185-9b9a-45a5-a211-c61faf308cbb\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522045 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54k7t\" (UniqueName: \"kubernetes.io/projected/9a879185-9b9a-45a5-a211-c61faf308cbb-kube-api-access-54k7t\") pod \"9a879185-9b9a-45a5-a211-c61faf308cbb\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522062 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-internal-tls-certs\") pod \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522077 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-nova-metadata-tls-certs\") pod \"899d3d67-ec63-4d5f-ad93-c40003578347\" (UID: \"899d3d67-ec63-4d5f-ad93-c40003578347\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522094 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-internal-tls-certs\") pod \"9a879185-9b9a-45a5-a211-c61faf308cbb\" (UID: \"9a879185-9b9a-45a5-a211-c61faf308cbb\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522113 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-config-data\") pod 
\"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522134 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-scripts\") pod \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\" (UID: \"c7b9bf09-0d94-4520-b783-7eb3fb4b79d4\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522156 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-default\") pod \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\" (UID: \"fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522568 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522581 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522590 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzwvr\" (UniqueName: \"kubernetes.io/projected/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-kube-api-access-qzwvr\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522601 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqvl4\" (UniqueName: \"kubernetes.io/projected/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-kube-api-access-mqvl4\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522610 4947 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522618 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522628 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh77l\" (UniqueName: \"kubernetes.io/projected/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-kube-api-access-fh77l\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.522636 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r6kr\" (UniqueName: \"kubernetes.io/projected/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-kube-api-access-9r6kr\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.524279 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" (UID: "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.525333 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" (UID: "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.526206 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" (UID: "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.526249 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a879185-9b9a-45a5-a211-c61faf308cbb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9a879185-9b9a-45a5-a211-c61faf308cbb" (UID: "9a879185-9b9a-45a5-a211-c61faf308cbb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.543034 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" (UID: "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.545008 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e9d6dc-e814-485d-842b-9266732c7924-logs" (OuterVolumeSpecName: "logs") pod "e4e9d6dc-e814-485d-842b-9266732c7924" (UID: "e4e9d6dc-e814-485d-842b-9266732c7924"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.545202 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" (UID: "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.545286 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a879185-9b9a-45a5-a211-c61faf308cbb-logs" (OuterVolumeSpecName: "logs") pod "9a879185-9b9a-45a5-a211-c61faf308cbb" (UID: "9a879185-9b9a-45a5-a211-c61faf308cbb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.551014 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-logs" (OuterVolumeSpecName: "logs") pod "bdb45354-43cd-41e7-a511-95357eb656e5" (UID: "bdb45354-43cd-41e7-a511-95357eb656e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.551277 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/899d3d67-ec63-4d5f-ad93-c40003578347-logs" (OuterVolumeSpecName: "logs") pod "899d3d67-ec63-4d5f-ad93-c40003578347" (UID: "899d3d67-ec63-4d5f-ad93-c40003578347"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.561529 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bdb45354-43cd-41e7-a511-95357eb656e5" (UID: "bdb45354-43cd-41e7-a511-95357eb656e5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.564926 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" (UID: "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.567715 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6ddbf97597-l6hz9"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.569671 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e4e9d6dc-e814-485d-842b-9266732c7924" (UID: "e4e9d6dc-e814-485d-842b-9266732c7924"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.574523 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-logs" (OuterVolumeSpecName: "logs") pod "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" (UID: "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.578518 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6ddbf97597-l6hz9"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.580306 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kube-api-access-zvkdq" (OuterVolumeSpecName: "kube-api-access-zvkdq") pod "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" (UID: "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8"). InnerVolumeSpecName "kube-api-access-zvkdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.581804 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb45354-43cd-41e7-a511-95357eb656e5-kube-api-access-zdgcj" (OuterVolumeSpecName: "kube-api-access-zdgcj") pod "bdb45354-43cd-41e7-a511-95357eb656e5" (UID: "bdb45354-43cd-41e7-a511-95357eb656e5"). InnerVolumeSpecName "kube-api-access-zdgcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.582578 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-scripts" (OuterVolumeSpecName: "scripts") pod "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" (UID: "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.585103 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-scripts" (OuterVolumeSpecName: "scripts") pod "9a879185-9b9a-45a5-a211-c61faf308cbb" (UID: "9a879185-9b9a-45a5-a211-c61faf308cbb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.588220 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9a879185-9b9a-45a5-a211-c61faf308cbb" (UID: "9a879185-9b9a-45a5-a211-c61faf308cbb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.591330 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-config-data" (OuterVolumeSpecName: "config-data") pod "3735b7db-e9a7-4be6-9c74-cad0131f2c0b" (UID: "3735b7db-e9a7-4be6-9c74-cad0131f2c0b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.597219 4947 scope.go:117] "RemoveContainer" containerID="4768422c21501889e7e1951ef8537cbf19f01b02b829b137e7fb9d8dc5766658" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.603460 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" (UID: "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.607547 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-api-access-5j9j5" (OuterVolumeSpecName: "kube-api-access-5j9j5") pod "90736cf6-0db1-44a8-b285-4d319f0951f8" (UID: "90736cf6-0db1-44a8-b285-4d319f0951f8"). InnerVolumeSpecName "kube-api-access-5j9j5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.607651 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-kube-api-access-j9xrb" (OuterVolumeSpecName: "kube-api-access-j9xrb") pod "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" (UID: "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4"). InnerVolumeSpecName "kube-api-access-j9xrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.607652 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e9d6dc-e814-485d-842b-9266732c7924-kube-api-access-lqk9b" (OuterVolumeSpecName: "kube-api-access-lqk9b") pod "e4e9d6dc-e814-485d-842b-9266732c7924" (UID: "e4e9d6dc-e814-485d-842b-9266732c7924"). InnerVolumeSpecName "kube-api-access-lqk9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.609165 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/899d3d67-ec63-4d5f-ad93-c40003578347-kube-api-access-jnv2p" (OuterVolumeSpecName: "kube-api-access-jnv2p") pod "899d3d67-ec63-4d5f-ad93-c40003578347" (UID: "899d3d67-ec63-4d5f-ad93-c40003578347"). InnerVolumeSpecName "kube-api-access-jnv2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.609227 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a879185-9b9a-45a5-a211-c61faf308cbb-kube-api-access-54k7t" (OuterVolumeSpecName: "kube-api-access-54k7t") pod "9a879185-9b9a-45a5-a211-c61faf308cbb" (UID: "9a879185-9b9a-45a5-a211-c61faf308cbb"). InnerVolumeSpecName "kube-api-access-54k7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624171 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data\") pod \"49caa6da-3c98-4c49-ab22-62121ff908cf\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624230 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data-custom\") pod \"49caa6da-3c98-4c49-ab22-62121ff908cf\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624306 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-config-data\") pod \"9f0a57ca-d063-4d27-ac54-f5431cca2971\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624382 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-combined-ca-bundle\") pod \"49caa6da-3c98-4c49-ab22-62121ff908cf\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624423 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxjvw\" (UniqueName: \"kubernetes.io/projected/9f0a57ca-d063-4d27-ac54-f5431cca2971-kube-api-access-dxjvw\") pod \"9f0a57ca-d063-4d27-ac54-f5431cca2971\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624539 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-cvzxl\" (UniqueName: \"kubernetes.io/projected/49caa6da-3c98-4c49-ab22-62121ff908cf-kube-api-access-cvzxl\") pod \"49caa6da-3c98-4c49-ab22-62121ff908cf\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624583 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts\") pod \"49caa6da-3c98-4c49-ab22-62121ff908cf\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624609 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-combined-ca-bundle\") pod \"9f0a57ca-d063-4d27-ac54-f5431cca2971\" (UID: \"9f0a57ca-d063-4d27-ac54-f5431cca2971\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624680 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49caa6da-3c98-4c49-ab22-62121ff908cf-etc-machine-id\") pod \"49caa6da-3c98-4c49-ab22-62121ff908cf\" (UID: \"49caa6da-3c98-4c49-ab22-62121ff908cf\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624986 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.624998 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625006 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9xrb\" (UniqueName: 
\"kubernetes.io/projected/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-kube-api-access-j9xrb\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625017 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625026 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdgcj\" (UniqueName: \"kubernetes.io/projected/bdb45354-43cd-41e7-a511-95357eb656e5-kube-api-access-zdgcj\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625035 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625043 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqk9b\" (UniqueName: \"kubernetes.io/projected/e4e9d6dc-e814-485d-842b-9266732c7924-kube-api-access-lqk9b\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625051 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdb45354-43cd-41e7-a511-95357eb656e5-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625060 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899d3d67-ec63-4d5f-ad93-c40003578347-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625068 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5j9j5\" (UniqueName: \"kubernetes.io/projected/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-api-access-5j9j5\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc 
kubenswrapper[4947]: I1203 07:12:37.625077 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625086 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625094 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54k7t\" (UniqueName: \"kubernetes.io/projected/9a879185-9b9a-45a5-a211-c61faf308cbb-kube-api-access-54k7t\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625102 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625110 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625117 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625125 4947 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a879185-9b9a-45a5-a211-c61faf308cbb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625142 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625150 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a879185-9b9a-45a5-a211-c61faf308cbb-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625158 4947 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625167 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625175 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625183 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnv2p\" (UniqueName: \"kubernetes.io/projected/899d3d67-ec63-4d5f-ad93-c40003578347-kube-api-access-jnv2p\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625191 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.625198 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4e9d6dc-e814-485d-842b-9266732c7924-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc 
kubenswrapper[4947]: I1203 07:12:37.625205 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvkdq\" (UniqueName: \"kubernetes.io/projected/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-kube-api-access-zvkdq\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.646219 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49caa6da-3c98-4c49-ab22-62121ff908cf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "49caa6da-3c98-4c49-ab22-62121ff908cf" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.650228 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-scripts" (OuterVolumeSpecName: "scripts") pod "bdb45354-43cd-41e7-a511-95357eb656e5" (UID: "bdb45354-43cd-41e7-a511-95357eb656e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.652220 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "bdb45354-43cd-41e7-a511-95357eb656e5" (UID: "bdb45354-43cd-41e7-a511-95357eb656e5"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.689203 4947 scope.go:117] "RemoveContainer" containerID="e942659fea577abdb40bc1687013b7988fc7bb6ef1815a0d789c1541253c9e6a" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.727444 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.727471 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.727480 4947 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49caa6da-3c98-4c49-ab22-62121ff908cf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.736409 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "49caa6da-3c98-4c49-ab22-62121ff908cf" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.736604 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bdb45354-43cd-41e7-a511-95357eb656e5" (UID: "bdb45354-43cd-41e7-a511-95357eb656e5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.737395 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49caa6da-3c98-4c49-ab22-62121ff908cf-kube-api-access-cvzxl" (OuterVolumeSpecName: "kube-api-access-cvzxl") pod "49caa6da-3c98-4c49-ab22-62121ff908cf" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf"). InnerVolumeSpecName "kube-api-access-cvzxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.737727 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts" (OuterVolumeSpecName: "scripts") pod "49caa6da-3c98-4c49-ab22-62121ff908cf" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.748663 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0a57ca-d063-4d27-ac54-f5431cca2971-kube-api-access-dxjvw" (OuterVolumeSpecName: "kube-api-access-dxjvw") pod "9f0a57ca-d063-4d27-ac54-f5431cca2971" (UID: "9f0a57ca-d063-4d27-ac54-f5431cca2971"). InnerVolumeSpecName "kube-api-access-dxjvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.801602 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.807402 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" (UID: "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.829091 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.829125 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.829137 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvzxl\" (UniqueName: \"kubernetes.io/projected/49caa6da-3c98-4c49-ab22-62121ff908cf-kube-api-access-cvzxl\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.829146 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.829155 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.829164 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.829172 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxjvw\" (UniqueName: \"kubernetes.io/projected/9f0a57ca-d063-4d27-ac54-f5431cca2971-kube-api-access-dxjvw\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.844314 4947 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" (UID: "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.874040 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4e9d6dc-e814-485d-842b-9266732c7924" (UID: "e4e9d6dc-e814-485d-842b-9266732c7924"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.889778 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-config-data" (OuterVolumeSpecName: "config-data") pod "3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" (UID: "3b4b931b-69ee-4ff2-b01a-85d45fc93ec4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.931269 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.931290 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.931300 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:37.931348 4947 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:37.932347 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data podName:5367165f-75ec-4633-8042-edfe91e3be60 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:45.931377931 +0000 UTC m=+1427.192332357 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data") pod "rabbitmq-cell1-server-0" (UID: "5367165f-75ec-4633-8042-edfe91e3be60") : configmap "rabbitmq-cell1-config-data" not found Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:37.983871 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90736cf6-0db1-44a8-b285-4d319f0951f8" (UID: "90736cf6-0db1-44a8-b285-4d319f0951f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.004834 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "90736cf6-0db1-44a8-b285-4d319f0951f8" (UID: "90736cf6-0db1-44a8-b285-4d319f0951f8"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.005320 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3735b7db-e9a7-4be6-9c74-cad0131f2c0b" (UID: "3735b7db-e9a7-4be6-9c74-cad0131f2c0b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.009461 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9a879185-9b9a-45a5-a211-c61faf308cbb" (UID: "9a879185-9b9a-45a5-a211-c61faf308cbb"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.028312 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" (UID: "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.032330 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.032352 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3735b7db-e9a7-4be6-9c74-cad0131f2c0b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.032361 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.032370 4947 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.032381 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.039070 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a879185-9b9a-45a5-a211-c61faf308cbb" (UID: "9a879185-9b9a-45a5-a211-c61faf308cbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.040025 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.051806 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-config-data" (OuterVolumeSpecName: "config-data") pod "9f0a57ca-d063-4d27-ac54-f5431cca2971" (UID: "9f0a57ca-d063-4d27-ac54-f5431cca2971"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.087934 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e4e9d6dc-e814-485d-842b-9266732c7924" (UID: "e4e9d6dc-e814-485d-842b-9266732c7924"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.093084 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.136870 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.136961 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.136974 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.136983 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.136992 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.146291 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" (UID: "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.156442 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49caa6da-3c98-4c49-ab22-62121ff908cf" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.157463 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-config-data" (OuterVolumeSpecName: "config-data") pod "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" (UID: "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.159289 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data" (OuterVolumeSpecName: "config-data") pod "9a879185-9b9a-45a5-a211-c61faf308cbb" (UID: "9a879185-9b9a-45a5-a211-c61faf308cbb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.160748 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f0a57ca-d063-4d27-ac54-f5431cca2971" (UID: "9f0a57ca-d063-4d27-ac54-f5431cca2971"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.167779 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data" (OuterVolumeSpecName: "config-data") pod "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" (UID: "1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.184618 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "899d3d67-ec63-4d5f-ad93-c40003578347" (UID: "899d3d67-ec63-4d5f-ad93-c40003578347"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.185843 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data" (OuterVolumeSpecName: "config-data") pod "e4e9d6dc-e814-485d-842b-9266732c7924" (UID: "e4e9d6dc-e814-485d-842b-9266732c7924"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.213925 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdb45354-43cd-41e7-a511-95357eb656e5" (UID: "bdb45354-43cd-41e7-a511-95357eb656e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.214994 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e is running failed: container process not found" containerID="9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.215566 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e is running failed: container process not found" containerID="9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.215941 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e is running failed: container process not found" containerID="9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.215987 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="ovn-northd" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.226003 4947 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-config-data" (OuterVolumeSpecName: "config-data") pod "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" (UID: "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.229888 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-config-data" (OuterVolumeSpecName: "config-data") pod "899d3d67-ec63-4d5f-ad93-c40003578347" (UID: "899d3d67-ec63-4d5f-ad93-c40003578347"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.230201 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-config-data" (OuterVolumeSpecName: "config-data") pod "bdb45354-43cd-41e7-a511-95357eb656e5" (UID: "bdb45354-43cd-41e7-a511-95357eb656e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.235609 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "90736cf6-0db1-44a8-b285-4d319f0951f8" (UID: "90736cf6-0db1-44a8-b285-4d319f0951f8"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.238541 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239070 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239089 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239098 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239106 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239116 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239125 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdb45354-43cd-41e7-a511-95357eb656e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239143 4947 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239156 4947 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/90736cf6-0db1-44a8-b285-4d319f0951f8-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239164 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239175 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239186 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f0a57ca-d063-4d27-ac54-f5431cca2971-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.239195 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.245001 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" (UID: "fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.247171 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "899d3d67-ec63-4d5f-ad93-c40003578347" (UID: "899d3d67-ec63-4d5f-ad93-c40003578347"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.299636 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" (UID: "c7b9bf09-0d94-4520-b783-7eb3fb4b79d4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.314001 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9a879185-9b9a-45a5-a211-c61faf308cbb" (UID: "9a879185-9b9a-45a5-a211-c61faf308cbb"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.314347 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4e9d6dc-e814-485d-842b-9266732c7924" (UID: "e4e9d6dc-e814-485d-842b-9266732c7924"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.319255 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.328581 4947 scope.go:117] "RemoveContainer" containerID="7ddbee771d7d031c4a9d787606308301ec00a05893e9401d6d27e942db47b236" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.333177 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" (UID: "3b4b931b-69ee-4ff2-b01a-85d45fc93ec4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.336897 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.343061 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.343100 4947 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.343112 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a879185-9b9a-45a5-a211-c61faf308cbb-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.343123 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e9d6dc-e814-485d-842b-9266732c7924-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 
crc kubenswrapper[4947]: I1203 07:12:38.343134 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.343146 4947 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/899d3d67-ec63-4d5f-ad93-c40003578347-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.357554 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.357960 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" (UID: "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.378178 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.384097 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" (UID: "1c3c47d5-30e8-4c5f-93fe-e0d944cdc998"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.389632 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_855f6436-68f0-42d8-a12a-bf25632440c1/ovn-northd/0.log" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.389674 4947 generic.go:334] "Generic (PLEG): container finished" podID="855f6436-68f0-42d8-a12a-bf25632440c1" containerID="9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e" exitCode=139 Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.389727 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"855f6436-68f0-42d8-a12a-bf25632440c1","Type":"ContainerDied","Data":"9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.389752 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"855f6436-68f0-42d8-a12a-bf25632440c1","Type":"ContainerDied","Data":"b8f6ad072d8a430d3f397408cbb15abd6578dd51c4ed60e99bd7561ae1e6ae4d"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.389761 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8f6ad072d8a430d3f397408cbb15abd6578dd51c4ed60e99bd7561ae1e6ae4d" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.389975 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_855f6436-68f0-42d8-a12a-bf25632440c1/ovn-northd/0.log" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.390048 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.392451 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron896b-account-delete-ddxm6" event={"ID":"e4b56c27-0089-46d0-8cc3-c5788833f135","Type":"ContainerStarted","Data":"48805411c29e52ae5cc60ad05d70dc1612bb234f5813c62bb0bcba4f5902f170"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.392959 4947 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutron896b-account-delete-ddxm6" secret="" err="secret \"galera-openstack-dockercfg-2prbt\" not found" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.411354 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican0363-account-delete-22cjb" event={"ID":"44c0f438-8d06-433e-af68-49ccacd9a017","Type":"ContainerDied","Data":"e9f5bb21882477cf0c9d62bd9aa094c8f2b5fe84275fe15e39ccd3e500a108e3"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.411387 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9f5bb21882477cf0c9d62bd9aa094c8f2b5fe84275fe15e39ccd3e500a108e3" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.411430 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican0363-account-delete-22cjb" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.414086 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder0e6b-account-delete-kmbkl" event={"ID":"8c259ef0-fa34-4ef1-a954-a15bd61ed120","Type":"ContainerStarted","Data":"5f5b17e47defdfb32050f77dd94b370b6ccbc70dc782d3ccfc33b54d67378629"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.415397 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementbdf2-account-delete-qmjm2" event={"ID":"1256396c-ee12-469e-864f-c87983516079","Type":"ContainerDied","Data":"f02f49131c5887d2f979b608f73395a77c9847d69c592fe2dbe726e9c70b8309"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.415421 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f02f49131c5887d2f979b608f73395a77c9847d69c592fe2dbe726e9c70b8309" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.415470 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placementbdf2-account-delete-qmjm2" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.415531 4947 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/cinder0e6b-account-delete-kmbkl" secret="" err="secret \"galera-openstack-dockercfg-2prbt\" not found" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.416796 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.416834 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data" (OuterVolumeSpecName: "config-data") pod "49caa6da-3c98-4c49-ab22-62121ff908cf" (UID: "49caa6da-3c98-4c49-ab22-62121ff908cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.417837 4947 generic.go:334] "Generic (PLEG): container finished" podID="98d05028-c68a-4438-afcd-161c4f974b08" containerID="6e0c48cd3228e4b00d96c837f40857c671662d7b07ee44989563b07cafc4c386" exitCode=0 Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.417894 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98d05028-c68a-4438-afcd-161c4f974b08","Type":"ContainerDied","Data":"6e0c48cd3228e4b00d96c837f40857c671662d7b07ee44989563b07cafc4c386"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.417921 4947 scope.go:117] "RemoveContainer" containerID="6e0c48cd3228e4b00d96c837f40857c671662d7b07ee44989563b07cafc4c386" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.428153 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapib13d-account-delete-k5vqn" event={"ID":"62cff038-cc99-44ca-ba48-a05d16a96b26","Type":"ContainerStarted","Data":"5fcded32bf81a07bf90138d60ec52c620c0dcfd0f7192a732e55d09ba69d93dc"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.429531 4947 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapib13d-account-delete-k5vqn" secret="" err="secret \"galera-openstack-dockercfg-2prbt\" not found" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.436560 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0c18d-account-delete-458q7" event={"ID":"8c562c87-9f66-447f-83ec-05165c95ca25","Type":"ContainerStarted","Data":"207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.437365 4947 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/novacell0c18d-account-delete-458q7" secret="" err="secret \"galera-openstack-dockercfg-2prbt\" not found" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.438128 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron896b-account-delete-ddxm6" podStartSLOduration=8.438117468 podStartE2EDuration="8.438117468s" podCreationTimestamp="2025-12-03 07:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:12:38.435837947 +0000 UTC m=+1419.696792373" watchObservedRunningTime="2025-12-03 07:12:38.438117468 +0000 UTC m=+1419.699071894" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.442040 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9f0a57ca-d063-4d27-ac54-f5431cca2971","Type":"ContainerDied","Data":"4d37839e18ca3ca0588ed76fb7e8068a744a42e03b230a1c507cefdfdc97801a"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.442299 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.444118 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1256396c-ee12-469e-864f-c87983516079-operator-scripts\") pod \"1256396c-ee12-469e-864f-c87983516079\" (UID: \"1256396c-ee12-469e-864f-c87983516079\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.444158 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmq74\" (UniqueName: \"kubernetes.io/projected/1256396c-ee12-469e-864f-c87983516079-kube-api-access-xmq74\") pod \"1256396c-ee12-469e-864f-c87983516079\" (UID: \"1256396c-ee12-469e-864f-c87983516079\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.444262 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a94dde-b399-408b-93f2-488d02be7f07-operator-scripts\") pod \"c6a94dde-b399-408b-93f2-488d02be7f07\" (UID: \"c6a94dde-b399-408b-93f2-488d02be7f07\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.444373 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44c0f438-8d06-433e-af68-49ccacd9a017-operator-scripts\") pod \"44c0f438-8d06-433e-af68-49ccacd9a017\" (UID: \"44c0f438-8d06-433e-af68-49ccacd9a017\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.444402 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmmgg\" (UniqueName: \"kubernetes.io/projected/44c0f438-8d06-433e-af68-49ccacd9a017-kube-api-access-jmmgg\") pod \"44c0f438-8d06-433e-af68-49ccacd9a017\" (UID: \"44c0f438-8d06-433e-af68-49ccacd9a017\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.444434 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-2n9xw\" (UniqueName: \"kubernetes.io/projected/c6a94dde-b399-408b-93f2-488d02be7f07-kube-api-access-2n9xw\") pod \"c6a94dde-b399-408b-93f2-488d02be7f07\" (UID: \"c6a94dde-b399-408b-93f2-488d02be7f07\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.444936 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.444959 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49caa6da-3c98-4c49-ab22-62121ff908cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.444971 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.449761 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6a94dde-b399-408b-93f2-488d02be7f07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6a94dde-b399-408b-93f2-488d02be7f07" (UID: "c6a94dde-b399-408b-93f2-488d02be7f07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.450152 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1256396c-ee12-469e-864f-c87983516079-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1256396c-ee12-469e-864f-c87983516079" (UID: "1256396c-ee12-469e-864f-c87983516079"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.457901 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c0f438-8d06-433e-af68-49ccacd9a017-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "44c0f438-8d06-433e-af68-49ccacd9a017" (UID: "44c0f438-8d06-433e-af68-49ccacd9a017"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.462216 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1256396c-ee12-469e-864f-c87983516079-kube-api-access-xmq74" (OuterVolumeSpecName: "kube-api-access-xmq74") pod "1256396c-ee12-469e-864f-c87983516079" (UID: "1256396c-ee12-469e-864f-c87983516079"). InnerVolumeSpecName "kube-api-access-xmq74". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.463294 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a94dde-b399-408b-93f2-488d02be7f07-kube-api-access-2n9xw" (OuterVolumeSpecName: "kube-api-access-2n9xw") pod "c6a94dde-b399-408b-93f2-488d02be7f07" (UID: "c6a94dde-b399-408b-93f2-488d02be7f07"). InnerVolumeSpecName "kube-api-access-2n9xw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.468871 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapib13d-account-delete-k5vqn" podStartSLOduration=8.468846373 podStartE2EDuration="8.468846373s" podCreationTimestamp="2025-12-03 07:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:12:38.4580533 +0000 UTC m=+1419.719007726" watchObservedRunningTime="2025-12-03 07:12:38.468846373 +0000 UTC m=+1419.729800799" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.472463 4947 scope.go:117] "RemoveContainer" containerID="1ce62f53e6a80cc9f688e02da11287f9cc62f54ad243c0af9a53ea71ef8a6f49" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.473742 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c0f438-8d06-433e-af68-49ccacd9a017-kube-api-access-jmmgg" (OuterVolumeSpecName: "kube-api-access-jmmgg") pod "44c0f438-8d06-433e-af68-49ccacd9a017" (UID: "44c0f438-8d06-433e-af68-49ccacd9a017"). InnerVolumeSpecName "kube-api-access-jmmgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.492783 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49caa6da-3c98-4c49-ab22-62121ff908cf","Type":"ContainerDied","Data":"d040ac489ceeebebbd6c8f3183918ed793779fdf3503c2652969f88e0af4c4fa"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.493032 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.501958 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder0e6b-account-delete-kmbkl" podStartSLOduration=8.501940682 podStartE2EDuration="8.501940682s" podCreationTimestamp="2025-12-03 07:12:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:12:38.497098031 +0000 UTC m=+1419.758052457" watchObservedRunningTime="2025-12-03 07:12:38.501940682 +0000 UTC m=+1419.762895118" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.508816 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance93c8-account-delete-j67wz" event={"ID":"c6a94dde-b399-408b-93f2-488d02be7f07","Type":"ContainerDied","Data":"6f3e7195f5a72482222bd69507f214cd995af856e5e4d177929552e138e91ef6"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.508849 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f3e7195f5a72482222bd69507f214cd995af856e5e4d177929552e138e91ef6" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.508899 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance93c8-account-delete-j67wz" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.514029 4947 generic.go:334] "Generic (PLEG): container finished" podID="add50932-e8ea-4e7a-ab75-6fb1e0463499" containerID="263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70" exitCode=0 Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.514154 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.514937 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.515768 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.518655 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-774d6d4878-7x6tj" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.518742 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.518786 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"add50932-e8ea-4e7a-ab75-6fb1e0463499","Type":"ContainerDied","Data":"263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.518831 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"add50932-e8ea-4e7a-ab75-6fb1e0463499","Type":"ContainerDied","Data":"74fb78f5022052166a30c09f0fbdf7c96543565acb2a6e491193815ec2a0dd60"} Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.519011 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.519020 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.519048 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.519052 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6c489f678-crqhz" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.519052 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.519069 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59d779d8d8-6jl5c" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.519104 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.534149 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell0c18d-account-delete-458q7" podStartSLOduration=7.534127757 podStartE2EDuration="7.534127757s" podCreationTimestamp="2025-12-03 07:12:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:12:38.512484019 +0000 UTC m=+1419.773438455" watchObservedRunningTime="2025-12-03 07:12:38.534127757 +0000 UTC m=+1419.795082193" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.545869 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-public-tls-certs\") pod \"98d05028-c68a-4438-afcd-161c4f974b08\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.545928 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-config-data\") pod \"add50932-e8ea-4e7a-ab75-6fb1e0463499\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.545965 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-rundir\") pod \"855f6436-68f0-42d8-a12a-bf25632440c1\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546007 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98d05028-c68a-4438-afcd-161c4f974b08-logs\") pod \"98d05028-c68a-4438-afcd-161c4f974b08\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546031 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-kolla-config\") pod \"add50932-e8ea-4e7a-ab75-6fb1e0463499\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546058 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-combined-ca-bundle\") pod \"855f6436-68f0-42d8-a12a-bf25632440c1\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546081 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-internal-tls-certs\") pod \"98d05028-c68a-4438-afcd-161c4f974b08\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546109 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-config\") pod \"855f6436-68f0-42d8-a12a-bf25632440c1\" (UID: 
\"855f6436-68f0-42d8-a12a-bf25632440c1\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546142 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-memcached-tls-certs\") pod \"add50932-e8ea-4e7a-ab75-6fb1e0463499\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546181 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwk2t\" (UniqueName: \"kubernetes.io/projected/add50932-e8ea-4e7a-ab75-6fb1e0463499-kube-api-access-nwk2t\") pod \"add50932-e8ea-4e7a-ab75-6fb1e0463499\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546216 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-combined-ca-bundle\") pod \"98d05028-c68a-4438-afcd-161c4f974b08\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546247 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-scripts\") pod \"855f6436-68f0-42d8-a12a-bf25632440c1\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546267 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzbmz\" (UniqueName: \"kubernetes.io/projected/855f6436-68f0-42d8-a12a-bf25632440c1-kube-api-access-zzbmz\") pod \"855f6436-68f0-42d8-a12a-bf25632440c1\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546293 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-config-data\") pod \"98d05028-c68a-4438-afcd-161c4f974b08\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546335 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jm4h\" (UniqueName: \"kubernetes.io/projected/98d05028-c68a-4438-afcd-161c4f974b08-kube-api-access-8jm4h\") pod \"98d05028-c68a-4438-afcd-161c4f974b08\" (UID: \"98d05028-c68a-4438-afcd-161c4f974b08\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546370 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-metrics-certs-tls-certs\") pod \"855f6436-68f0-42d8-a12a-bf25632440c1\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546396 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-northd-tls-certs\") pod \"855f6436-68f0-42d8-a12a-bf25632440c1\" (UID: \"855f6436-68f0-42d8-a12a-bf25632440c1\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546446 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-combined-ca-bundle\") pod \"add50932-e8ea-4e7a-ab75-6fb1e0463499\" (UID: \"add50932-e8ea-4e7a-ab75-6fb1e0463499\") " Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546654 4947 scope.go:117] "RemoveContainer" containerID="334c2441513d95f243b52e76544d548f66b710f9a5c7ddb3358d9df97ecd5f57" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.546992 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1256396c-ee12-469e-864f-c87983516079-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.547019 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmq74\" (UniqueName: \"kubernetes.io/projected/1256396c-ee12-469e-864f-c87983516079-kube-api-access-xmq74\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.547033 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6a94dde-b399-408b-93f2-488d02be7f07-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.547608 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/44c0f438-8d06-433e-af68-49ccacd9a017-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.547629 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmmgg\" (UniqueName: \"kubernetes.io/projected/44c0f438-8d06-433e-af68-49ccacd9a017-kube-api-access-jmmgg\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.547645 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n9xw\" (UniqueName: \"kubernetes.io/projected/c6a94dde-b399-408b-93f2-488d02be7f07-kube-api-access-2n9xw\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.547239 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.547665 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-scripts" (OuterVolumeSpecName: "scripts") pod "855f6436-68f0-42d8-a12a-bf25632440c1" (UID: 
"855f6436-68f0-42d8-a12a-bf25632440c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.547714 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts podName:8c259ef0-fa34-4ef1-a954-a15bd61ed120 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:39.047693446 +0000 UTC m=+1420.308647872 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts") pod "cinder0e6b-account-delete-kmbkl" (UID: "8c259ef0-fa34-4ef1-a954-a15bd61ed120") : configmap "openstack-scripts" not found Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.548128 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "add50932-e8ea-4e7a-ab75-6fb1e0463499" (UID: "add50932-e8ea-4e7a-ab75-6fb1e0463499"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.549097 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.549146 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts podName:62cff038-cc99-44ca-ba48-a05d16a96b26 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:39.049129795 +0000 UTC m=+1420.310084221 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts") pod "novaapib13d-account-delete-k5vqn" (UID: "62cff038-cc99-44ca-ba48-a05d16a96b26") : configmap "openstack-scripts" not found Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.549174 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.549194 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts podName:e4b56c27-0089-46d0-8cc3-c5788833f135 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:39.049188096 +0000 UTC m=+1420.310142522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts") pod "neutron896b-account-delete-ddxm6" (UID: "e4b56c27-0089-46d0-8cc3-c5788833f135") : configmap "openstack-scripts" not found Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.549241 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:38 crc kubenswrapper[4947]: E1203 07:12:38.549261 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts podName:8c562c87-9f66-447f-83ec-05165c95ca25 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:39.049255598 +0000 UTC m=+1420.310210024 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts") pod "novacell0c18d-account-delete-458q7" (UID: "8c562c87-9f66-447f-83ec-05165c95ca25") : configmap "openstack-scripts" not found Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.549707 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-config-data" (OuterVolumeSpecName: "config-data") pod "add50932-e8ea-4e7a-ab75-6fb1e0463499" (UID: "add50932-e8ea-4e7a-ab75-6fb1e0463499"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.549826 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "855f6436-68f0-42d8-a12a-bf25632440c1" (UID: "855f6436-68f0-42d8-a12a-bf25632440c1"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.550061 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98d05028-c68a-4438-afcd-161c4f974b08-logs" (OuterVolumeSpecName: "logs") pod "98d05028-c68a-4438-afcd-161c4f974b08" (UID: "98d05028-c68a-4438-afcd-161c4f974b08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.552588 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-config" (OuterVolumeSpecName: "config") pod "855f6436-68f0-42d8-a12a-bf25632440c1" (UID: "855f6436-68f0-42d8-a12a-bf25632440c1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.552741 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/855f6436-68f0-42d8-a12a-bf25632440c1-kube-api-access-zzbmz" (OuterVolumeSpecName: "kube-api-access-zzbmz") pod "855f6436-68f0-42d8-a12a-bf25632440c1" (UID: "855f6436-68f0-42d8-a12a-bf25632440c1"). InnerVolumeSpecName "kube-api-access-zzbmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.558936 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.564040 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.572826 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d05028-c68a-4438-afcd-161c4f974b08-kube-api-access-8jm4h" (OuterVolumeSpecName: "kube-api-access-8jm4h") pod "98d05028-c68a-4438-afcd-161c4f974b08" (UID: "98d05028-c68a-4438-afcd-161c4f974b08"). InnerVolumeSpecName "kube-api-access-8jm4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.578592 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add50932-e8ea-4e7a-ab75-6fb1e0463499-kube-api-access-nwk2t" (OuterVolumeSpecName: "kube-api-access-nwk2t") pod "add50932-e8ea-4e7a-ab75-6fb1e0463499" (UID: "add50932-e8ea-4e7a-ab75-6fb1e0463499"). InnerVolumeSpecName "kube-api-access-nwk2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.603221 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "855f6436-68f0-42d8-a12a-bf25632440c1" (UID: "855f6436-68f0-42d8-a12a-bf25632440c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.613974 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "add50932-e8ea-4e7a-ab75-6fb1e0463499" (UID: "add50932-e8ea-4e7a-ab75-6fb1e0463499"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.616319 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.624170 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.625281 4947 scope.go:117] "RemoveContainer" containerID="208ecb2e4a9580e581c1630c83f89daf84de2499a774647eff59bb9623f3cf65" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.644300 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "add50932-e8ea-4e7a-ab75-6fb1e0463499" (UID: "add50932-e8ea-4e7a-ab75-6fb1e0463499"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.644460 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-config-data" (OuterVolumeSpecName: "config-data") pod "98d05028-c68a-4438-afcd-161c4f974b08" (UID: "98d05028-c68a-4438-afcd-161c4f974b08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650138 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650166 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650176 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650186 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98d05028-c68a-4438-afcd-161c4f974b08-logs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650194 4947 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/add50932-e8ea-4e7a-ab75-6fb1e0463499-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650202 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650209 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650219 4947 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/add50932-e8ea-4e7a-ab75-6fb1e0463499-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650228 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwk2t\" (UniqueName: \"kubernetes.io/projected/add50932-e8ea-4e7a-ab75-6fb1e0463499-kube-api-access-nwk2t\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650237 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/855f6436-68f0-42d8-a12a-bf25632440c1-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650246 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzbmz\" (UniqueName: \"kubernetes.io/projected/855f6436-68f0-42d8-a12a-bf25632440c1-kube-api-access-zzbmz\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650254 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.650261 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jm4h\" (UniqueName: \"kubernetes.io/projected/98d05028-c68a-4438-afcd-161c4f974b08-kube-api-access-8jm4h\") on node \"crc\" DevicePath 
\"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.653541 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98d05028-c68a-4438-afcd-161c4f974b08" (UID: "98d05028-c68a-4438-afcd-161c4f974b08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.661243 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "98d05028-c68a-4438-afcd-161c4f974b08" (UID: "98d05028-c68a-4438-afcd-161c4f974b08"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.664060 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "855f6436-68f0-42d8-a12a-bf25632440c1" (UID: "855f6436-68f0-42d8-a12a-bf25632440c1"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.668820 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "98d05028-c68a-4438-afcd-161c4f974b08" (UID: "98d05028-c68a-4438-afcd-161c4f974b08"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.669792 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "855f6436-68f0-42d8-a12a-bf25632440c1" (UID: "855f6436-68f0-42d8-a12a-bf25632440c1"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.751957 4947 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.752248 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/855f6436-68f0-42d8-a12a-bf25632440c1-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.752258 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.752267 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.752275 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98d05028-c68a-4438-afcd-161c4f974b08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.817923 4947 scope.go:117] "RemoveContainer" 
containerID="23fd79e84a8a0d20a65d598b706fa955381b97ec7814c70275bf4b3eb633dcce" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.834709 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.872572 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.875332 4947 scope.go:117] "RemoveContainer" containerID="263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70" Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.887205 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.901415 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.916830 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.929671 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.936518 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.946786 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.955944 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.971169 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.988893 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-774d6d4878-7x6tj"] Dec 03 
07:12:38 crc kubenswrapper[4947]: I1203 07:12:38.998188 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-774d6d4878-7x6tj"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.005456 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6c489f678-crqhz"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.007132 4947 scope.go:117] "RemoveContainer" containerID="263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70" Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.009356 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70\": container with ID starting with 263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70 not found: ID does not exist" containerID="263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.009387 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70"} err="failed to get container status \"263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70\": rpc error: code = NotFound desc = could not find container \"263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70\": container with ID starting with 263bf6a9b3ef4a98946d41c42d9dbc1a2931efc6079ce15d3713906e986c6d70 not found: ID does not exist" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.019675 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6c489f678-crqhz"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.028036 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.034867 4947 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.040632 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.047939 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.054499 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.060020 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.060166 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts podName:8c259ef0-fa34-4ef1-a954-a15bd61ed120 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:40.060150919 +0000 UTC m=+1421.321105345 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts") pod "cinder0e6b-account-delete-kmbkl" (UID: "8c259ef0-fa34-4ef1-a954-a15bd61ed120") : configmap "openstack-scripts" not found Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.060641 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.060753 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts podName:62cff038-cc99-44ca-ba48-a05d16a96b26 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:40.060744814 +0000 UTC m=+1421.321699240 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts") pod "novaapib13d-account-delete-k5vqn" (UID: "62cff038-cc99-44ca-ba48-a05d16a96b26") : configmap "openstack-scripts" not found Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.060871 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.060982 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts podName:e4b56c27-0089-46d0-8cc3-c5788833f135 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:40.06097498 +0000 UTC m=+1421.321929406 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts") pod "neutron896b-account-delete-ddxm6" (UID: "e4b56c27-0089-46d0-8cc3-c5788833f135") : configmap "openstack-scripts" not found Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.061066 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.061142 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts podName:8c562c87-9f66-447f-83ec-05165c95ca25 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:40.061134485 +0000 UTC m=+1421.322088911 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts") pod "novacell0c18d-account-delete-458q7" (UID: "8c562c87-9f66-447f-83ec-05165c95ca25") : configmap "openstack-scripts" not found Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.061263 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.071045 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-59d779d8d8-6jl5c"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.079481 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-59d779d8d8-6jl5c"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.104027 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" path="/var/lib/kubelet/pods/1c3c47d5-30e8-4c5f-93fe-e0d944cdc998/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.104804 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" path="/var/lib/kubelet/pods/1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.105341 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f87802c-3846-486e-a131-39a7fe336c96" path="/var/lib/kubelet/pods/1f87802c-3846-486e-a131-39a7fe336c96/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.106626 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3496bac7-6b31-4ba8-a490-14bff1522b8c" path="/var/lib/kubelet/pods/3496bac7-6b31-4ba8-a490-14bff1522b8c/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.107177 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3735b7db-e9a7-4be6-9c74-cad0131f2c0b" 
path="/var/lib/kubelet/pods/3735b7db-e9a7-4be6-9c74-cad0131f2c0b/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.107654 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" path="/var/lib/kubelet/pods/3b4b931b-69ee-4ff2-b01a-85d45fc93ec4/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.108735 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49caa6da-3c98-4c49-ab22-62121ff908cf" path="/var/lib/kubelet/pods/49caa6da-3c98-4c49-ab22-62121ff908cf/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.109388 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" path="/var/lib/kubelet/pods/899d3d67-ec63-4d5f-ad93-c40003578347/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.109916 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90736cf6-0db1-44a8-b285-4d319f0951f8" path="/var/lib/kubelet/pods/90736cf6-0db1-44a8-b285-4d319f0951f8/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.111170 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" path="/var/lib/kubelet/pods/90b682d1-68e6-49a4-83a6-51b1b40b7e99/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.111746 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" path="/var/lib/kubelet/pods/9a879185-9b9a-45a5-a211-c61faf308cbb/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.112913 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0a57ca-d063-4d27-ac54-f5431cca2971" path="/var/lib/kubelet/pods/9f0a57ca-d063-4d27-ac54-f5431cca2971/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.113480 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a227e4-8c2a-4880-9944-877640627cd0" 
path="/var/lib/kubelet/pods/b4a227e4-8c2a-4880-9944-877640627cd0/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.114270 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb45354-43cd-41e7-a511-95357eb656e5" path="/var/lib/kubelet/pods/bdb45354-43cd-41e7-a511-95357eb656e5/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.115815 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" path="/var/lib/kubelet/pods/c7b9bf09-0d94-4520-b783-7eb3fb4b79d4/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.116579 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" path="/var/lib/kubelet/pods/e4e9d6dc-e814-485d-842b-9266732c7924/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.117112 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e80985eb-c6e0-4ffc-9b98-b1c92be266eb" path="/var/lib/kubelet/pods/e80985eb-c6e0-4ffc-9b98-b1c92be266eb/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.118087 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" path="/var/lib/kubelet/pods/ed654f44-78c3-4118-83d5-e2a5d917c4f4/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.120102 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" path="/var/lib/kubelet/pods/fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8/volumes" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.121306 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.121336 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.263832 4947 configmap.go:193] Couldn't get configMap 
openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.263910 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data podName:51d7ef1d-a0bf-465f-baad-1bc3a71618ff nodeName:}" failed. No retries permitted until 2025-12-03 07:12:47.263895144 +0000 UTC m=+1428.524849570 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data") pod "rabbitmq-server-0" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff") : configmap "rabbitmq-config-data" not found Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.398738 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.399341 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.399523 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.400606 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.400658 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.400618 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.401934 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:39 crc kubenswrapper[4947]: E1203 07:12:39.402030 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.577863 4947 generic.go:334] "Generic (PLEG): container finished" podID="51d7ef1d-a0bf-465f-baad-1bc3a71618ff" containerID="23a38b8a1f4cbbe4e0977fa37f88e153dc49ca4d190b8ca3be90add4e540bbe0" exitCode=0 Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.578509 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51d7ef1d-a0bf-465f-baad-1bc3a71618ff","Type":"ContainerDied","Data":"23a38b8a1f4cbbe4e0977fa37f88e153dc49ca4d190b8ca3be90add4e540bbe0"} Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.613780 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"98d05028-c68a-4438-afcd-161c4f974b08","Type":"ContainerDied","Data":"46a0c85d673476dcf28c18dc595980d0048f2463589a11107160fb8331c9aeab"} Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.613895 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.628110 4947 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell0c18d-account-delete-458q7" secret="" err="secret \"galera-openstack-dockercfg-2prbt\" not found" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.628468 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.628927 4947 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/cinder0e6b-account-delete-kmbkl" secret="" err="secret \"galera-openstack-dockercfg-2prbt\" not found" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.629351 4947 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novaapib13d-account-delete-k5vqn" secret="" err="secret \"galera-openstack-dockercfg-2prbt\" not found" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.629459 4947 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutron896b-account-delete-ddxm6" secret="" err="secret \"galera-openstack-dockercfg-2prbt\" not found" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.709453 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.726639 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.745735 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.748673 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84gbj" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" probeResult="failure" output="command timed out" Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.762903 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 07:12:39 crc kubenswrapper[4947]: I1203 07:12:39.783829 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-84gbj" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" probeResult="failure" output=< Dec 03 07:12:39 crc kubenswrapper[4947]: ERROR - Failed to get connection status from ovn-controller, ovn-appctl exit status: 0 Dec 03 07:12:39 crc kubenswrapper[4947]: > Dec 03 07:12:40 crc 
kubenswrapper[4947]: E1203 07:12:40.091125 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.091182 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts podName:e4b56c27-0089-46d0-8cc3-c5788833f135 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:42.09116609 +0000 UTC m=+1423.352120516 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts") pod "neutron896b-account-delete-ddxm6" (UID: "e4b56c27-0089-46d0-8cc3-c5788833f135") : configmap "openstack-scripts" not found Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.091223 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.091249 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts podName:8c562c87-9f66-447f-83ec-05165c95ca25 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:42.091241392 +0000 UTC m=+1423.352195818 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts") pod "novacell0c18d-account-delete-458q7" (UID: "8c562c87-9f66-447f-83ec-05165c95ca25") : configmap "openstack-scripts" not found Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.091278 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.091302 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts podName:8c259ef0-fa34-4ef1-a954-a15bd61ed120 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:42.091290843 +0000 UTC m=+1423.352245269 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts") pod "cinder0e6b-account-delete-kmbkl" (UID: "8c259ef0-fa34-4ef1-a954-a15bd61ed120") : configmap "openstack-scripts" not found Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.091330 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.091350 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts podName:62cff038-cc99-44ca-ba48-a05d16a96b26 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:42.091343595 +0000 UTC m=+1423.352298021 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts") pod "novaapib13d-account-delete-k5vqn" (UID: "62cff038-cc99-44ca-ba48-a05d16a96b26") : configmap "openstack-scripts" not found Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.128675 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.188876 4947 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 03 07:12:40 crc kubenswrapper[4947]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-03T07:12:32Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 03 07:12:40 crc kubenswrapper[4947]: /etc/init.d/functions: line 589: 393 Alarm clock "$@" Dec 03 07:12:40 crc kubenswrapper[4947]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-84gbj" message=< Dec 03 07:12:40 crc kubenswrapper[4947]: Exiting ovn-controller (1) [FAILED] Dec 03 07:12:40 crc kubenswrapper[4947]: Killing ovn-controller (1) [ OK ] Dec 03 07:12:40 crc kubenswrapper[4947]: Killing ovn-controller (1) with SIGKILL [ OK ] Dec 03 07:12:40 crc kubenswrapper[4947]: 2025-12-03T07:12:32Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 03 07:12:40 crc kubenswrapper[4947]: /etc/init.d/functions: line 589: 393 Alarm clock "$@" Dec 03 07:12:40 crc kubenswrapper[4947]: > Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.188918 4947 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 03 07:12:40 crc kubenswrapper[4947]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-03T07:12:32Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 03 07:12:40 crc kubenswrapper[4947]: /etc/init.d/functions: line 589: 393 
Alarm clock "$@" Dec 03 07:12:40 crc kubenswrapper[4947]: > pod="openstack/ovn-controller-84gbj" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" containerID="cri-o://8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.189743 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-84gbj" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" containerID="cri-o://8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714" gracePeriod=22 Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.193580 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dss2r\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-kube-api-access-dss2r\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.204168 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-kube-api-access-dss2r" (OuterVolumeSpecName: "kube-api-access-dss2r") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "kube-api-access-dss2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.274674 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-2qbdx"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.286014 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-2qbdx"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.293058 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-0e6b-account-create-update-bfb9m"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.298737 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.298782 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-plugins-conf\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.298840 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-server-conf\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.298869 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-tls\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 
07:12:40.298894 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-erlang-cookie\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.298920 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5367165f-75ec-4633-8042-edfe91e3be60-pod-info\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.298950 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-plugins\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.298984 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.299015 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5367165f-75ec-4633-8042-edfe91e3be60-erlang-cookie-secret\") pod \"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.299040 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-confd\") pod 
\"5367165f-75ec-4633-8042-edfe91e3be60\" (UID: \"5367165f-75ec-4633-8042-edfe91e3be60\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.299344 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dss2r\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-kube-api-access-dss2r\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.302888 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder0e6b-account-delete-kmbkl"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.303384 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.304239 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.304659 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.306191 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0e6b-account-create-update-bfb9m"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.311394 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5367165f-75ec-4633-8042-edfe91e3be60-pod-info" (OuterVolumeSpecName: "pod-info") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.313169 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5367165f-75ec-4633-8042-edfe91e3be60-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.313315 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.316428 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.334669 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data" (OuterVolumeSpecName: "config-data") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.369584 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.370079 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-server-conf" (OuterVolumeSpecName: "server-conf") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.370090 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.382593 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-mxrh4"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.389988 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-mxrh4"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.398365 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0363-account-create-update-hgv5w"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.399955 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-scripts\") pod \"acfeae50-8757-4baf-a16a-c33ae100fdf2\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400005 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-combined-ca-bundle\") pod \"acfeae50-8757-4baf-a16a-c33ae100fdf2\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400037 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-erlang-cookie\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400051 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 
07:12:40.400077 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpc2q\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-kube-api-access-lpc2q\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400094 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-internal-tls-certs\") pod \"acfeae50-8757-4baf-a16a-c33ae100fdf2\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400110 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwrkw\" (UniqueName: \"kubernetes.io/projected/acfeae50-8757-4baf-a16a-c33ae100fdf2-kube-api-access-gwrkw\") pod \"acfeae50-8757-4baf-a16a-c33ae100fdf2\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400131 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-public-tls-certs\") pod \"acfeae50-8757-4baf-a16a-c33ae100fdf2\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400516 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-plugins-conf\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400555 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400573 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-pod-info\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400591 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-confd\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400613 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-credential-keys\") pod \"acfeae50-8757-4baf-a16a-c33ae100fdf2\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400628 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-server-conf\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400645 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-erlang-cookie-secret\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400676 
4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-fernet-keys\") pod \"acfeae50-8757-4baf-a16a-c33ae100fdf2\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400691 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-plugins\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400710 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-tls\") pod \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\" (UID: \"51d7ef1d-a0bf-465f-baad-1bc3a71618ff\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400746 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-config-data\") pod \"acfeae50-8757-4baf-a16a-c33ae100fdf2\" (UID: \"acfeae50-8757-4baf-a16a-c33ae100fdf2\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400955 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400966 4947 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400975 4947 reconciler_common.go:293] "Volume detached for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/5367165f-75ec-4633-8042-edfe91e3be60-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400983 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400991 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.400999 4947 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5367165f-75ec-4633-8042-edfe91e3be60-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.401007 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.401023 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.401032 4947 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5367165f-75ec-4633-8042-edfe91e3be60-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.402462 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod 
"51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.406297 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.406789 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.406912 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "acfeae50-8757-4baf-a16a-c33ae100fdf2" (UID: "acfeae50-8757-4baf-a16a-c33ae100fdf2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.407049 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-kube-api-access-lpc2q" (OuterVolumeSpecName: "kube-api-access-lpc2q") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "kube-api-access-lpc2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.409969 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.411679 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican0363-account-delete-22cjb"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.411855 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.413741 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-pod-info" (OuterVolumeSpecName: "pod-info") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.418619 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acfeae50-8757-4baf-a16a-c33ae100fdf2-kube-api-access-gwrkw" (OuterVolumeSpecName: "kube-api-access-gwrkw") pod "acfeae50-8757-4baf-a16a-c33ae100fdf2" (UID: "acfeae50-8757-4baf-a16a-c33ae100fdf2"). InnerVolumeSpecName "kube-api-access-gwrkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.420509 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-scripts" (OuterVolumeSpecName: "scripts") pod "acfeae50-8757-4baf-a16a-c33ae100fdf2" (UID: "acfeae50-8757-4baf-a16a-c33ae100fdf2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.420774 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican0363-account-delete-22cjb"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.428937 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.431154 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "acfeae50-8757-4baf-a16a-c33ae100fdf2" (UID: "acfeae50-8757-4baf-a16a-c33ae100fdf2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.433943 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.434968 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0363-account-create-update-hgv5w"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.440089 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data" (OuterVolumeSpecName: "config-data") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.485851 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-config-data" (OuterVolumeSpecName: "config-data") pod "acfeae50-8757-4baf-a16a-c33ae100fdf2" (UID: "acfeae50-8757-4baf-a16a-c33ae100fdf2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502214 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502254 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502263 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpc2q\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-kube-api-access-lpc2q\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502274 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwrkw\" (UniqueName: \"kubernetes.io/projected/acfeae50-8757-4baf-a16a-c33ae100fdf2-kube-api-access-gwrkw\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502282 4947 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502290 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502300 4947 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 
07:12:40.502307 4947 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502315 4947 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502323 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502331 4947 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502338 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502346 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502354 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.502361 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.530601 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-cqwr9"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.540858 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-cqwr9"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.550024 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-bdf2-account-create-update-ckdwp"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.552729 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acfeae50-8757-4baf-a16a-c33ae100fdf2" (UID: "acfeae50-8757-4baf-a16a-c33ae100fdf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.564394 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementbdf2-account-delete-qmjm2"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.569529 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.572233 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-bdf2-account-create-update-ckdwp"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.576065 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-server-conf" (OuterVolumeSpecName: "server-conf") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). 
InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.576159 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "acfeae50-8757-4baf-a16a-c33ae100fdf2" (UID: "acfeae50-8757-4baf-a16a-c33ae100fdf2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.577008 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementbdf2-account-delete-qmjm2"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.579866 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-84gbj_98211c56-fd23-46c2-9710-31fc562e2182/ovn-controller/0.log" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.579935 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84gbj" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.580160 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5367165f-75ec-4633-8042-edfe91e3be60" (UID: "5367165f-75ec-4633-8042-edfe91e3be60"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.599932 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "acfeae50-8757-4baf-a16a-c33ae100fdf2" (UID: "acfeae50-8757-4baf-a16a-c33ae100fdf2"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.608246 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5367165f-75ec-4633-8042-edfe91e3be60-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.608270 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.608281 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.608289 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.608298 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/acfeae50-8757-4baf-a16a-c33ae100fdf2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.608306 4947 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.636964 4947 generic.go:334] "Generic (PLEG): container finished" podID="acfeae50-8757-4baf-a16a-c33ae100fdf2" containerID="36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668" exitCode=0 Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.637017 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-c779fbf97-r9gnn" event={"ID":"acfeae50-8757-4baf-a16a-c33ae100fdf2","Type":"ContainerDied","Data":"36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668"} Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.637041 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c779fbf97-r9gnn" event={"ID":"acfeae50-8757-4baf-a16a-c33ae100fdf2","Type":"ContainerDied","Data":"3fe9364c45274343b67fd57c7b371f66c669a4dc9cf7d36ea06a92221b7427f3"} Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.637057 4947 scope.go:117] "RemoveContainer" containerID="36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.637152 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c779fbf97-r9gnn" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.647213 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-m8d6g"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.651459 4947 generic.go:334] "Generic (PLEG): container finished" podID="dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" containerID="58133114584f0b390c627b1710fda7f019565ce9036c5e41a7ab89f063b5168d" exitCode=0 Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.651539 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec","Type":"ContainerDied","Data":"58133114584f0b390c627b1710fda7f019565ce9036c5e41a7ab89f063b5168d"} Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.655008 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-m8d6g"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.660569 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-84gbj_98211c56-fd23-46c2-9710-31fc562e2182/ovn-controller/0.log" Dec 03 07:12:40 crc 
kubenswrapper[4947]: I1203 07:12:40.660623 4947 generic.go:334] "Generic (PLEG): container finished" podID="98211c56-fd23-46c2-9710-31fc562e2182" containerID="8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714" exitCode=137 Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.660690 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj" event={"ID":"98211c56-fd23-46c2-9710-31fc562e2182","Type":"ContainerDied","Data":"8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714"} Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.660714 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-84gbj" event={"ID":"98211c56-fd23-46c2-9710-31fc562e2182","Type":"ContainerDied","Data":"5ae027dcf3be49781729b775d310cf2e66b9997bd637936180522f9cc453e25a"} Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.660774 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-84gbj" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.662100 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance93c8-account-delete-j67wz"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.670433 4947 generic.go:334] "Generic (PLEG): container finished" podID="5367165f-75ec-4633-8042-edfe91e3be60" containerID="de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26" exitCode=0 Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.670711 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5367165f-75ec-4633-8042-edfe91e3be60","Type":"ContainerDied","Data":"de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26"} Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.670747 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"5367165f-75ec-4633-8042-edfe91e3be60","Type":"ContainerDied","Data":"b18b2126e63d321b34ad6f06ae5e5d1eba12bd9f355cc32ce87928b9eee7b969"} Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.670802 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.677041 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder0e6b-account-delete-kmbkl" podUID="8c259ef0-fa34-4ef1-a954-a15bd61ed120" containerName="mariadb-account-delete" containerID="cri-o://5f5b17e47defdfb32050f77dd94b370b6ccbc70dc782d3ccfc33b54d67378629" gracePeriod=30 Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.677140 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.677549 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"51d7ef1d-a0bf-465f-baad-1bc3a71618ff","Type":"ContainerDied","Data":"f7b505747bd91cb699b7598e9ae32eebcc8cebfc9aebbb06ea83a68fdb33110c"} Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.682855 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance93c8-account-delete-j67wz"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.685793 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "51d7ef1d-a0bf-465f-baad-1bc3a71618ff" (UID: "51d7ef1d-a0bf-465f-baad-1bc3a71618ff"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.691774 4947 scope.go:117] "RemoveContainer" containerID="36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.691849 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-93c8-account-create-update-7nq7f"] Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.692213 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668\": container with ID starting with 36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668 not found: ID does not exist" containerID="36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.692232 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668"} err="failed to get container status \"36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668\": rpc error: code = NotFound desc = could not find container \"36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668\": container with ID starting with 36b149648045b9cb069b114c1657d8c9f17d3c894192129d25f4f7e5044b7668 not found: ID does not exist" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.692247 4947 scope.go:117] "RemoveContainer" containerID="8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.701429 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-93c8-account-create-update-7nq7f"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.707760 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c779fbf97-r9gnn"] Dec 03 07:12:40 
crc kubenswrapper[4947]: I1203 07:12:40.708910 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-log-ovn\") pod \"98211c56-fd23-46c2-9710-31fc562e2182\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.708943 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-ovn-controller-tls-certs\") pod \"98211c56-fd23-46c2-9710-31fc562e2182\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.708981 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7hjn\" (UniqueName: \"kubernetes.io/projected/98211c56-fd23-46c2-9710-31fc562e2182-kube-api-access-w7hjn\") pod \"98211c56-fd23-46c2-9710-31fc562e2182\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.709023 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-combined-ca-bundle\") pod \"98211c56-fd23-46c2-9710-31fc562e2182\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.709071 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run\") pod \"98211c56-fd23-46c2-9710-31fc562e2182\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.709180 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run-ovn\") pod \"98211c56-fd23-46c2-9710-31fc562e2182\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.709198 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98211c56-fd23-46c2-9710-31fc562e2182-scripts\") pod \"98211c56-fd23-46c2-9710-31fc562e2182\" (UID: \"98211c56-fd23-46c2-9710-31fc562e2182\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.709475 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/51d7ef1d-a0bf-465f-baad-1bc3a71618ff-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.710318 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98211c56-fd23-46c2-9710-31fc562e2182-scripts" (OuterVolumeSpecName: "scripts") pod "98211c56-fd23-46c2-9710-31fc562e2182" (UID: "98211c56-fd23-46c2-9710-31fc562e2182"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.710351 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "98211c56-fd23-46c2-9710-31fc562e2182" (UID: "98211c56-fd23-46c2-9710-31fc562e2182"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.710658 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run" (OuterVolumeSpecName: "var-run") pod "98211c56-fd23-46c2-9710-31fc562e2182" (UID: "98211c56-fd23-46c2-9710-31fc562e2182"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.710708 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "98211c56-fd23-46c2-9710-31fc562e2182" (UID: "98211c56-fd23-46c2-9710-31fc562e2182"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.714543 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c779fbf97-r9gnn"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.722438 4947 scope.go:117] "RemoveContainer" containerID="8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714" Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.722993 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714\": container with ID starting with 8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714 not found: ID does not exist" containerID="8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.723020 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714"} err="failed to get container status \"8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714\": rpc error: code = NotFound desc = could not find container \"8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714\": container with ID starting with 8338c791b953b84b6e5c48f3b02c0d6abbb444adb3782f88dea28209a3c3d714 not found: ID does not exist" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.723038 4947 scope.go:117] "RemoveContainer" 
containerID="de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.726028 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98211c56-fd23-46c2-9710-31fc562e2182-kube-api-access-w7hjn" (OuterVolumeSpecName: "kube-api-access-w7hjn") pod "98211c56-fd23-46c2-9710-31fc562e2182" (UID: "98211c56-fd23-46c2-9710-31fc562e2182"). InnerVolumeSpecName "kube-api-access-w7hjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.732631 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.758629 4947 scope.go:117] "RemoveContainer" containerID="3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.768519 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.773959 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.775775 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98211c56-fd23-46c2-9710-31fc562e2182" (UID: "98211c56-fd23-46c2-9710-31fc562e2182"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.783373 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qj625"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.791380 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "98211c56-fd23-46c2-9710-31fc562e2182" (UID: "98211c56-fd23-46c2-9710-31fc562e2182"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.792046 4947 scope.go:117] "RemoveContainer" containerID="de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26" Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.792444 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26\": container with ID starting with de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26 not found: ID does not exist" containerID="de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.792528 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26"} err="failed to get container status \"de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26\": rpc error: code = NotFound desc = could not find container \"de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26\": container with ID starting with de4fb9c7248f451bd80d99c66509fa611e9f2312d6bc7b3b3051166fe6d71e26 not found: ID does not exist" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.792553 4947 scope.go:117] 
"RemoveContainer" containerID="3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0" Dec 03 07:12:40 crc kubenswrapper[4947]: E1203 07:12:40.794016 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0\": container with ID starting with 3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0 not found: ID does not exist" containerID="3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.794151 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0"} err="failed to get container status \"3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0\": rpc error: code = NotFound desc = could not find container \"3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0\": container with ID starting with 3a110d90b266fd5a43f006a937f64aeb99494243e24e04ebca6558fe2553d7e0 not found: ID does not exist" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.794259 4947 scope.go:117] "RemoveContainer" containerID="23a38b8a1f4cbbe4e0977fa37f88e153dc49ca4d190b8ca3be90add4e540bbe0" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.810357 4947 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.810393 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98211c56-fd23-46c2-9710-31fc562e2182-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.810402 4947 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.810411 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.810420 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7hjn\" (UniqueName: \"kubernetes.io/projected/98211c56-fd23-46c2-9710-31fc562e2182-kube-api-access-w7hjn\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.810428 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98211c56-fd23-46c2-9710-31fc562e2182-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.810436 4947 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/98211c56-fd23-46c2-9710-31fc562e2182-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.818817 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qj625"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.819566 4947 scope.go:117] "RemoveContainer" containerID="f8e964bb562c5817c758d9180e2467c30d1c58be4f6c8c81c568b4e200107ce9" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.842566 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron896b-account-delete-ddxm6"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.843065 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron896b-account-delete-ddxm6" podUID="e4b56c27-0089-46d0-8cc3-c5788833f135" containerName="mariadb-account-delete" 
containerID="cri-o://48805411c29e52ae5cc60ad05d70dc1612bb234f5813c62bb0bcba4f5902f170" gracePeriod=30 Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.864150 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-896b-account-create-update-vt5xk"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.874131 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-896b-account-create-update-vt5xk"] Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.911041 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-generated\") pod \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.911084 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs69t\" (UniqueName: \"kubernetes.io/projected/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kube-api-access-rs69t\") pod \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.911461 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" (UID: "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.911472 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-combined-ca-bundle\") pod \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.911557 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kolla-config\") pod \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.911586 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-galera-tls-certs\") pod \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.911633 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-default\") pod \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.911658 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.911685 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-operator-scripts\") pod \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\" (UID: \"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec\") " Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.912232 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.912875 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" (UID: "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.913407 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" (UID: "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.915271 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" (UID: "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.916509 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kube-api-access-rs69t" (OuterVolumeSpecName: "kube-api-access-rs69t") pod "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" (UID: "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec"). InnerVolumeSpecName "kube-api-access-rs69t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.927937 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" (UID: "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.945193 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" (UID: "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:40 crc kubenswrapper[4947]: I1203 07:12:40.969455 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" (UID: "dbc40b18-511b-4bd7-bb2c-3dc868c6dcec"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.013421 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.013457 4947 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.013469 4947 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.013483 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.013531 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.013544 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.013556 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs69t\" (UniqueName: \"kubernetes.io/projected/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec-kube-api-access-rs69t\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 
07:12:41.033712 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9xgkg"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.042458 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.048335 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9xgkg"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.057121 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-b13d-account-create-update-k6qnx"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.063115 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapib13d-account-delete-k5vqn"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.063391 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapib13d-account-delete-k5vqn" podUID="62cff038-cc99-44ca-ba48-a05d16a96b26" containerName="mariadb-account-delete" containerID="cri-o://5fcded32bf81a07bf90138d60ec52c620c0dcfd0f7192a732e55d09ba69d93dc" gracePeriod=30 Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.068892 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b13d-account-create-update-k6qnx"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.078608 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-84gbj"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.102035 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aca2603-7c75-448c-b019-c9403906ac3b" path="/var/lib/kubelet/pods/0aca2603-7c75-448c-b019-c9403906ac3b/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.102933 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f5b09f9-d943-4f40-ab4a-567f71d6b13f" 
path="/var/lib/kubelet/pods/0f5b09f9-d943-4f40-ab4a-567f71d6b13f/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.103645 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1256396c-ee12-469e-864f-c87983516079" path="/var/lib/kubelet/pods/1256396c-ee12-469e-864f-c87983516079/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.104324 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5" path="/var/lib/kubelet/pods/1b89c1c2-cea2-4ad2-8594-9d3d1ab240f5/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.106154 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30c40412-9287-4f1a-a221-6f3dc1c2f33b" path="/var/lib/kubelet/pods/30c40412-9287-4f1a-a221-6f3dc1c2f33b/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.106906 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411f9305-440e-4f16-9f83-004561707000" path="/var/lib/kubelet/pods/411f9305-440e-4f16-9f83-004561707000/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.107571 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c0f438-8d06-433e-af68-49ccacd9a017" path="/var/lib/kubelet/pods/44c0f438-8d06-433e-af68-49ccacd9a017/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.109142 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5367165f-75ec-4633-8042-edfe91e3be60" path="/var/lib/kubelet/pods/5367165f-75ec-4633-8042-edfe91e3be60/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.110045 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6541ef13-a7b0-45f5-886e-ebf6ee0550bb" path="/var/lib/kubelet/pods/6541ef13-a7b0-45f5-886e-ebf6ee0550bb/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.110824 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" 
path="/var/lib/kubelet/pods/855f6436-68f0-42d8-a12a-bf25632440c1/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.113242 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92ba5ee1-269e-4971-895f-2393110c2bcd" path="/var/lib/kubelet/pods/92ba5ee1-269e-4971-895f-2393110c2bcd/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.116804 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="941b0633-5ae9-4203-ad2e-8f4644056d2b" path="/var/lib/kubelet/pods/941b0633-5ae9-4203-ad2e-8f4644056d2b/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.117207 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.117687 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d05028-c68a-4438-afcd-161c4f974b08" path="/var/lib/kubelet/pods/98d05028-c68a-4438-afcd-161c4f974b08/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.118424 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acfeae50-8757-4baf-a16a-c33ae100fdf2" path="/var/lib/kubelet/pods/acfeae50-8757-4baf-a16a-c33ae100fdf2/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.119793 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add50932-e8ea-4e7a-ab75-6fb1e0463499" path="/var/lib/kubelet/pods/add50932-e8ea-4e7a-ab75-6fb1e0463499/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.120472 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52" path="/var/lib/kubelet/pods/b6a5aa22-bc3a-4dc6-ab6b-baa2497bdc52/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.121168 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b4e915-36d1-4586-96af-0abb8c1b9246" 
path="/var/lib/kubelet/pods/b6b4e915-36d1-4586-96af-0abb8c1b9246/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.121866 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6a94dde-b399-408b-93f2-488d02be7f07" path="/var/lib/kubelet/pods/c6a94dde-b399-408b-93f2-488d02be7f07/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.123127 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182" path="/var/lib/kubelet/pods/c8b5780a-d5b2-4a19-b7a4-6e14a5d3a182/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.123851 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eba16a83-33ec-44b8-9494-8a5efa7e59e7" path="/var/lib/kubelet/pods/eba16a83-33ec-44b8-9494-8a5efa7e59e7/volumes" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.124654 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-84gbj"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.124683 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.124700 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.124713 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-wnv6s"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.126754 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-wnv6s"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.179244 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c18d-account-create-update-ss77z"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.194080 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0c18d-account-delete-458q7"] Dec 03 07:12:41 crc kubenswrapper[4947]: 
I1203 07:12:41.194265 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell0c18d-account-delete-458q7" podUID="8c562c87-9f66-447f-83ec-05165c95ca25" containerName="mariadb-account-delete" containerID="cri-o://207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08" gracePeriod=30 Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.200525 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c18d-account-create-update-ss77z"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.688965 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"dbc40b18-511b-4bd7-bb2c-3dc868c6dcec","Type":"ContainerDied","Data":"3e74b3ace2547285a3aaed7a716aed4a64a2cb2d00748ce51351b96e4c728b84"} Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.689012 4947 scope.go:117] "RemoveContainer" containerID="58133114584f0b390c627b1710fda7f019565ce9036c5e41a7ab89f063b5168d" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.689032 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.708208 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.713573 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.716356 4947 scope.go:117] "RemoveContainer" containerID="4a757f605dd0032d26474e5a8df58f33e3b89eab2725fe96b4bea59fa90b65d0" Dec 03 07:12:41 crc kubenswrapper[4947]: I1203 07:12:41.948232 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.165:8776/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 07:12:42 crc kubenswrapper[4947]: E1203 07:12:42.129781 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:42 crc kubenswrapper[4947]: E1203 07:12:42.130143 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts podName:62cff038-cc99-44ca-ba48-a05d16a96b26 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:46.130124736 +0000 UTC m=+1427.391079162 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts") pod "novaapib13d-account-delete-k5vqn" (UID: "62cff038-cc99-44ca-ba48-a05d16a96b26") : configmap "openstack-scripts" not found Dec 03 07:12:42 crc kubenswrapper[4947]: E1203 07:12:42.130558 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:42 crc kubenswrapper[4947]: E1203 07:12:42.130598 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts podName:e4b56c27-0089-46d0-8cc3-c5788833f135 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:46.130587999 +0000 UTC m=+1427.391542425 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts") pod "neutron896b-account-delete-ddxm6" (UID: "e4b56c27-0089-46d0-8cc3-c5788833f135") : configmap "openstack-scripts" not found Dec 03 07:12:42 crc kubenswrapper[4947]: E1203 07:12:42.130632 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:42 crc kubenswrapper[4947]: E1203 07:12:42.130656 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts podName:8c562c87-9f66-447f-83ec-05165c95ca25 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:46.130648311 +0000 UTC m=+1427.391602737 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts") pod "novacell0c18d-account-delete-458q7" (UID: "8c562c87-9f66-447f-83ec-05165c95ca25") : configmap "openstack-scripts" not found Dec 03 07:12:42 crc kubenswrapper[4947]: E1203 07:12:42.130685 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:42 crc kubenswrapper[4947]: E1203 07:12:42.130706 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts podName:8c259ef0-fa34-4ef1-a954-a15bd61ed120 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:46.130699302 +0000 UTC m=+1427.391653728 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts") pod "cinder0e6b-account-delete-kmbkl" (UID: "8c259ef0-fa34-4ef1-a954-a15bd61ed120") : configmap "openstack-scripts" not found Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.095818 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c45c77c-1306-4434-9b1f-9f6d45522d6a" path="/var/lib/kubelet/pods/2c45c77c-1306-4434-9b1f-9f6d45522d6a/volumes" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.097461 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51d7ef1d-a0bf-465f-baad-1bc3a71618ff" path="/var/lib/kubelet/pods/51d7ef1d-a0bf-465f-baad-1bc3a71618ff/volumes" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.098861 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98211c56-fd23-46c2-9710-31fc562e2182" path="/var/lib/kubelet/pods/98211c56-fd23-46c2-9710-31fc562e2182/volumes" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.101704 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" path="/var/lib/kubelet/pods/dbc40b18-511b-4bd7-bb2c-3dc868c6dcec/volumes" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.102989 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07d2dc5-7d24-4294-8623-7b50b59c2135" path="/var/lib/kubelet/pods/e07d2dc5-7d24-4294-8623-7b50b59c2135/volumes" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.723020 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.732260 4947 generic.go:334] "Generic (PLEG): container finished" podID="33bdabb7-a612-499f-855c-74da636d845a" containerID="f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014" exitCode=0 Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.732574 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerDied","Data":"f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014"} Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.732706 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33bdabb7-a612-499f-855c-74da636d845a","Type":"ContainerDied","Data":"5714ddceec7271190cc5133b35c79321785ae793eb33df5212a52ff5283aad95"} Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.732817 4947 scope.go:117] "RemoveContainer" containerID="f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.778205 4947 scope.go:117] "RemoveContainer" containerID="1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.801658 4947 scope.go:117] "RemoveContainer" containerID="f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 
07:12:43.836242 4947 scope.go:117] "RemoveContainer" containerID="f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.854447 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-scripts\") pod \"33bdabb7-a612-499f-855c-74da636d845a\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.854533 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-ceilometer-tls-certs\") pod \"33bdabb7-a612-499f-855c-74da636d845a\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.854596 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-sg-core-conf-yaml\") pod \"33bdabb7-a612-499f-855c-74da636d845a\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.854632 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-log-httpd\") pod \"33bdabb7-a612-499f-855c-74da636d845a\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.854686 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-combined-ca-bundle\") pod \"33bdabb7-a612-499f-855c-74da636d845a\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.854748 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-config-data\") pod \"33bdabb7-a612-499f-855c-74da636d845a\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.854812 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-run-httpd\") pod \"33bdabb7-a612-499f-855c-74da636d845a\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.854860 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nhw5\" (UniqueName: \"kubernetes.io/projected/33bdabb7-a612-499f-855c-74da636d845a-kube-api-access-4nhw5\") pod \"33bdabb7-a612-499f-855c-74da636d845a\" (UID: \"33bdabb7-a612-499f-855c-74da636d845a\") " Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.859177 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "33bdabb7-a612-499f-855c-74da636d845a" (UID: "33bdabb7-a612-499f-855c-74da636d845a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.859229 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "33bdabb7-a612-499f-855c-74da636d845a" (UID: "33bdabb7-a612-499f-855c-74da636d845a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.860801 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33bdabb7-a612-499f-855c-74da636d845a-kube-api-access-4nhw5" (OuterVolumeSpecName: "kube-api-access-4nhw5") pod "33bdabb7-a612-499f-855c-74da636d845a" (UID: "33bdabb7-a612-499f-855c-74da636d845a"). InnerVolumeSpecName "kube-api-access-4nhw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.861530 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-scripts" (OuterVolumeSpecName: "scripts") pod "33bdabb7-a612-499f-855c-74da636d845a" (UID: "33bdabb7-a612-499f-855c-74da636d845a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.868630 4947 scope.go:117] "RemoveContainer" containerID="f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d" Dec 03 07:12:43 crc kubenswrapper[4947]: E1203 07:12:43.869335 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d\": container with ID starting with f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d not found: ID does not exist" containerID="f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.869394 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d"} err="failed to get container status \"f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d\": rpc error: code = NotFound desc = could not find container 
\"f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d\": container with ID starting with f053572725c98712d4c872ef77a6d07d92fe84368f69a4183abee2c484393e0d not found: ID does not exist" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.869517 4947 scope.go:117] "RemoveContainer" containerID="1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032" Dec 03 07:12:43 crc kubenswrapper[4947]: E1203 07:12:43.869945 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032\": container with ID starting with 1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032 not found: ID does not exist" containerID="1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.869986 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032"} err="failed to get container status \"1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032\": rpc error: code = NotFound desc = could not find container \"1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032\": container with ID starting with 1788caba560e975af4980a6b4d8f9d5dc9b8edfaba731e0826ca1c2cec99f032 not found: ID does not exist" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.870032 4947 scope.go:117] "RemoveContainer" containerID="f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014" Dec 03 07:12:43 crc kubenswrapper[4947]: E1203 07:12:43.870374 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014\": container with ID starting with f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014 not found: ID does not exist" 
containerID="f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.870397 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014"} err="failed to get container status \"f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014\": rpc error: code = NotFound desc = could not find container \"f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014\": container with ID starting with f62233bb632eafcaea6d217ab7dcf4824646a80db89f733c7f2c4e7b72497014 not found: ID does not exist" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.870414 4947 scope.go:117] "RemoveContainer" containerID="f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01" Dec 03 07:12:43 crc kubenswrapper[4947]: E1203 07:12:43.870784 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01\": container with ID starting with f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01 not found: ID does not exist" containerID="f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.870825 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01"} err="failed to get container status \"f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01\": rpc error: code = NotFound desc = could not find container \"f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01\": container with ID starting with f90261f5afceebdaeb77f3c54024be52a62232f173b9c1fd29905e0065ebea01 not found: ID does not exist" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.891525 4947 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "33bdabb7-a612-499f-855c-74da636d845a" (UID: "33bdabb7-a612-499f-855c-74da636d845a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.920673 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "33bdabb7-a612-499f-855c-74da636d845a" (UID: "33bdabb7-a612-499f-855c-74da636d845a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.942571 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33bdabb7-a612-499f-855c-74da636d845a" (UID: "33bdabb7-a612-499f-855c-74da636d845a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.955981 4947 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.956009 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nhw5\" (UniqueName: \"kubernetes.io/projected/33bdabb7-a612-499f-855c-74da636d845a-kube-api-access-4nhw5\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.956019 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.956050 4947 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.956058 4947 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.956066 4947 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33bdabb7-a612-499f-855c-74da636d845a-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.956074 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:43 crc kubenswrapper[4947]: I1203 07:12:43.969909 4947 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-config-data" (OuterVolumeSpecName: "config-data") pod "33bdabb7-a612-499f-855c-74da636d845a" (UID: "33bdabb7-a612-499f-855c-74da636d845a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:44 crc kubenswrapper[4947]: I1203 07:12:44.057458 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33bdabb7-a612-499f-855c-74da636d845a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:44 crc kubenswrapper[4947]: E1203 07:12:44.397988 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:44 crc kubenswrapper[4947]: E1203 07:12:44.398453 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:44 crc kubenswrapper[4947]: E1203 07:12:44.398916 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:44 crc kubenswrapper[4947]: E1203 07:12:44.398981 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" Dec 03 07:12:44 crc kubenswrapper[4947]: E1203 07:12:44.399792 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:44 crc kubenswrapper[4947]: E1203 07:12:44.401006 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:44 crc kubenswrapper[4947]: E1203 07:12:44.405677 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:44 crc kubenswrapper[4947]: E1203 07:12:44.405736 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" Dec 03 07:12:44 crc kubenswrapper[4947]: I1203 07:12:44.754947 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 07:12:44 crc kubenswrapper[4947]: I1203 07:12:44.801979 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:12:44 crc kubenswrapper[4947]: I1203 07:12:44.809158 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 07:12:45 crc kubenswrapper[4947]: I1203 07:12:45.098123 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33bdabb7-a612-499f-855c-74da636d845a" path="/var/lib/kubelet/pods/33bdabb7-a612-499f-855c-74da636d845a/volumes" Dec 03 07:12:46 crc kubenswrapper[4947]: E1203 07:12:46.186544 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:46 crc kubenswrapper[4947]: E1203 07:12:46.186571 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:46 crc kubenswrapper[4947]: E1203 07:12:46.186630 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:46 crc kubenswrapper[4947]: E1203 07:12:46.186653 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:46 crc kubenswrapper[4947]: E1203 07:12:46.186775 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts podName:62cff038-cc99-44ca-ba48-a05d16a96b26 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:54.186760331 +0000 UTC m=+1435.447714757 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts") pod "novaapib13d-account-delete-k5vqn" (UID: "62cff038-cc99-44ca-ba48-a05d16a96b26") : configmap "openstack-scripts" not found Dec 03 07:12:46 crc kubenswrapper[4947]: E1203 07:12:46.186961 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts podName:8c562c87-9f66-447f-83ec-05165c95ca25 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:54.186926816 +0000 UTC m=+1435.447881282 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts") pod "novacell0c18d-account-delete-458q7" (UID: "8c562c87-9f66-447f-83ec-05165c95ca25") : configmap "openstack-scripts" not found Dec 03 07:12:46 crc kubenswrapper[4947]: E1203 07:12:46.186996 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts podName:e4b56c27-0089-46d0-8cc3-c5788833f135 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:54.186982997 +0000 UTC m=+1435.447937463 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts") pod "neutron896b-account-delete-ddxm6" (UID: "e4b56c27-0089-46d0-8cc3-c5788833f135") : configmap "openstack-scripts" not found Dec 03 07:12:46 crc kubenswrapper[4947]: E1203 07:12:46.187017 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts podName:8c259ef0-fa34-4ef1-a954-a15bd61ed120 nodeName:}" failed. No retries permitted until 2025-12-03 07:12:54.187007078 +0000 UTC m=+1435.447961554 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts") pod "cinder0e6b-account-delete-kmbkl" (UID: "8c259ef0-fa34-4ef1-a954-a15bd61ed120") : configmap "openstack-scripts" not found Dec 03 07:12:49 crc kubenswrapper[4947]: E1203 07:12:49.398186 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:49 crc kubenswrapper[4947]: E1203 07:12:49.398770 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:49 crc kubenswrapper[4947]: E1203 07:12:49.399040 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:49 crc kubenswrapper[4947]: E1203 07:12:49.399075 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container 
process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" Dec 03 07:12:49 crc kubenswrapper[4947]: E1203 07:12:49.400923 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:49 crc kubenswrapper[4947]: E1203 07:12:49.402761 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:49 crc kubenswrapper[4947]: E1203 07:12:49.404942 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:49 crc kubenswrapper[4947]: E1203 07:12:49.404986 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.240292 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.240405 4947 
configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.240413 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts podName:8c562c87-9f66-447f-83ec-05165c95ca25 nodeName:}" failed. No retries permitted until 2025-12-03 07:13:10.2403863 +0000 UTC m=+1451.501340726 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts") pod "novacell0c18d-account-delete-458q7" (UID: "8c562c87-9f66-447f-83ec-05165c95ca25") : configmap "openstack-scripts" not found Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.240523 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.240617 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts podName:8c259ef0-fa34-4ef1-a954-a15bd61ed120 nodeName:}" failed. No retries permitted until 2025-12-03 07:13:10.240555204 +0000 UTC m=+1451.501509670 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts") pod "cinder0e6b-account-delete-kmbkl" (UID: "8c259ef0-fa34-4ef1-a954-a15bd61ed120") : configmap "openstack-scripts" not found Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.240650 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts podName:e4b56c27-0089-46d0-8cc3-c5788833f135 nodeName:}" failed. No retries permitted until 2025-12-03 07:13:10.240635976 +0000 UTC m=+1451.501590432 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts") pod "neutron896b-account-delete-ddxm6" (UID: "e4b56c27-0089-46d0-8cc3-c5788833f135") : configmap "openstack-scripts" not found Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.240758 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.240896 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts podName:62cff038-cc99-44ca-ba48-a05d16a96b26 nodeName:}" failed. No retries permitted until 2025-12-03 07:13:10.240864592 +0000 UTC m=+1451.501819068 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts") pod "novaapib13d-account-delete-k5vqn" (UID: "62cff038-cc99-44ca-ba48-a05d16a96b26") : configmap "openstack-scripts" not found Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.399523 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.400120 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" 
containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.401260 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.401374 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.401470 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.403005 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.404899 4947 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:54 crc kubenswrapper[4947]: E1203 07:12:54.404966 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" Dec 03 07:12:54 crc kubenswrapper[4947]: I1203 07:12:54.877463 4947 generic.go:334] "Generic (PLEG): container finished" podID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerID="4f26987218545ce4b3ac012485238ac52ce1b620c3025dd4686d6e203a028593" exitCode=0 Dec 03 07:12:54 crc kubenswrapper[4947]: I1203 07:12:54.877864 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dd7c69-hhnlg" event={"ID":"ccb28a2e-a946-4407-be07-6ac8eaad8ab1","Type":"ContainerDied","Data":"4f26987218545ce4b3ac012485238ac52ce1b620c3025dd4686d6e203a028593"} Dec 03 07:12:54 crc kubenswrapper[4947]: I1203 07:12:54.979364 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.051565 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-httpd-config\") pod \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.051890 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfnq2\" (UniqueName: \"kubernetes.io/projected/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-kube-api-access-xfnq2\") pod \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.051932 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-config\") pod \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.052009 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-ovndb-tls-certs\") pod \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.052049 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-public-tls-certs\") pod \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.052075 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-internal-tls-certs\") pod \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.052099 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-combined-ca-bundle\") pod \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\" (UID: \"ccb28a2e-a946-4407-be07-6ac8eaad8ab1\") " Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.057579 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ccb28a2e-a946-4407-be07-6ac8eaad8ab1" (UID: "ccb28a2e-a946-4407-be07-6ac8eaad8ab1"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.057856 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-kube-api-access-xfnq2" (OuterVolumeSpecName: "kube-api-access-xfnq2") pod "ccb28a2e-a946-4407-be07-6ac8eaad8ab1" (UID: "ccb28a2e-a946-4407-be07-6ac8eaad8ab1"). InnerVolumeSpecName "kube-api-access-xfnq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.092770 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ccb28a2e-a946-4407-be07-6ac8eaad8ab1" (UID: "ccb28a2e-a946-4407-be07-6ac8eaad8ab1"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.093737 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccb28a2e-a946-4407-be07-6ac8eaad8ab1" (UID: "ccb28a2e-a946-4407-be07-6ac8eaad8ab1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.095742 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-config" (OuterVolumeSpecName: "config") pod "ccb28a2e-a946-4407-be07-6ac8eaad8ab1" (UID: "ccb28a2e-a946-4407-be07-6ac8eaad8ab1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.096807 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ccb28a2e-a946-4407-be07-6ac8eaad8ab1" (UID: "ccb28a2e-a946-4407-be07-6ac8eaad8ab1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.132557 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ccb28a2e-a946-4407-be07-6ac8eaad8ab1" (UID: "ccb28a2e-a946-4407-be07-6ac8eaad8ab1"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.153748 4947 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.153791 4947 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.153804 4947 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.153816 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.153829 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.153840 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfnq2\" (UniqueName: \"kubernetes.io/projected/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-kube-api-access-xfnq2\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.153856 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ccb28a2e-a946-4407-be07-6ac8eaad8ab1-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.892713 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7dd7c69-hhnlg" event={"ID":"ccb28a2e-a946-4407-be07-6ac8eaad8ab1","Type":"ContainerDied","Data":"1f84d80c8ff24b901e58b30eec441c6cdb8f1da326d393aabcc8fd0cdc22457e"} Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.892779 4947 scope.go:117] "RemoveContainer" containerID="d1c453acbdb699e13cb35f456f04fa20cb22c76c9d9304778639baf320f5cf98" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.892788 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7dd7c69-hhnlg" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.929988 4947 scope.go:117] "RemoveContainer" containerID="4f26987218545ce4b3ac012485238ac52ce1b620c3025dd4686d6e203a028593" Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.950220 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7dd7c69-hhnlg"] Dec 03 07:12:55 crc kubenswrapper[4947]: I1203 07:12:55.958106 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7dd7c69-hhnlg"] Dec 03 07:12:57 crc kubenswrapper[4947]: I1203 07:12:57.094447 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" path="/var/lib/kubelet/pods/ccb28a2e-a946-4407-be07-6ac8eaad8ab1/volumes" Dec 03 07:12:59 crc kubenswrapper[4947]: E1203 07:12:59.399015 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:59 crc kubenswrapper[4947]: E1203 07:12:59.400315 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not 
created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:59 crc kubenswrapper[4947]: E1203 07:12:59.400891 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 07:12:59 crc kubenswrapper[4947]: E1203 07:12:59.400983 4947 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" Dec 03 07:12:59 crc kubenswrapper[4947]: E1203 07:12:59.401355 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:59 crc kubenswrapper[4947]: E1203 07:12:59.403358 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:59 crc kubenswrapper[4947]: E1203 07:12:59.405418 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 07:12:59 crc kubenswrapper[4947]: E1203 07:12:59.405540 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-lb94d" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" Dec 03 07:13:00 crc kubenswrapper[4947]: I1203 07:13:00.087072 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:13:00 crc kubenswrapper[4947]: I1203 07:13:00.087132 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:13:01 crc kubenswrapper[4947]: I1203 07:13:01.996790 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lb94d_7b86a5d2-1933-4a2f-97de-f3b49985fbf8/ovs-vswitchd/0.log" Dec 03 07:13:01 crc kubenswrapper[4947]: I1203 07:13:01.997818 4947 generic.go:334] "Generic (PLEG): container finished" podID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" 
containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" exitCode=137 Dec 03 07:13:01 crc kubenswrapper[4947]: I1203 07:13:01.997950 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lb94d" event={"ID":"7b86a5d2-1933-4a2f-97de-f3b49985fbf8","Type":"ContainerDied","Data":"3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f"} Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.011582 4947 generic.go:334] "Generic (PLEG): container finished" podID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerID="52672ead9b6ba2c835fb5ca4f95054db2a9ce1fa6df424a727875fabb4ce0dbc" exitCode=137 Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.011626 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"52672ead9b6ba2c835fb5ca4f95054db2a9ce1fa6df424a727875fabb4ce0dbc"} Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.231743 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.386237 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-lock\") pod \"5dc8a280-5a18-41fd-8e61-f51afa973d20\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.386325 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-cache\") pod \"5dc8a280-5a18-41fd-8e61-f51afa973d20\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.386440 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") pod \"5dc8a280-5a18-41fd-8e61-f51afa973d20\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.386562 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5kk8\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-kube-api-access-b5kk8\") pod \"5dc8a280-5a18-41fd-8e61-f51afa973d20\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.386603 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"5dc8a280-5a18-41fd-8e61-f51afa973d20\" (UID: \"5dc8a280-5a18-41fd-8e61-f51afa973d20\") " Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.388631 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-cache" (OuterVolumeSpecName: "cache") pod 
"5dc8a280-5a18-41fd-8e61-f51afa973d20" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.388894 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-lock" (OuterVolumeSpecName: "lock") pod "5dc8a280-5a18-41fd-8e61-f51afa973d20" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.394280 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5dc8a280-5a18-41fd-8e61-f51afa973d20" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.395052 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-kube-api-access-b5kk8" (OuterVolumeSpecName: "kube-api-access-b5kk8") pod "5dc8a280-5a18-41fd-8e61-f51afa973d20" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20"). InnerVolumeSpecName "kube-api-access-b5kk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.395223 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "swift") pod "5dc8a280-5a18-41fd-8e61-f51afa973d20" (UID: "5dc8a280-5a18-41fd-8e61-f51afa973d20"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.488625 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5kk8\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-kube-api-access-b5kk8\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.488675 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.488689 4947 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-lock\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.488699 4947 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5dc8a280-5a18-41fd-8e61-f51afa973d20-cache\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.488711 4947 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5dc8a280-5a18-41fd-8e61-f51afa973d20-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.510763 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 03 07:13:02 crc kubenswrapper[4947]: I1203 07:13:02.590245 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.030955 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5dc8a280-5a18-41fd-8e61-f51afa973d20","Type":"ContainerDied","Data":"d45e15ccf604f6a3fb83d5c85740fc19f409cd96d5fbd32aa1553f993c35e019"} Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.031003 4947 scope.go:117] "RemoveContainer" containerID="52672ead9b6ba2c835fb5ca4f95054db2a9ce1fa6df424a727875fabb4ce0dbc" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.031368 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.033543 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lb94d_7b86a5d2-1933-4a2f-97de-f3b49985fbf8/ovs-vswitchd/0.log" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.035503 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-lb94d" event={"ID":"7b86a5d2-1933-4a2f-97de-f3b49985fbf8","Type":"ContainerDied","Data":"0c1cc19621bbdad40595aed4161c6d5e21175857117992d4456c69b529bf91c9"} Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.035548 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c1cc19621bbdad40595aed4161c6d5e21175857117992d4456c69b529bf91c9" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.086203 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-lb94d_7b86a5d2-1933-4a2f-97de-f3b49985fbf8/ovs-vswitchd/0.log" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.086850 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.090218 4947 scope.go:117] "RemoveContainer" containerID="f69a782b89a00f114b21b586623c5d8f0e73109af52bd1c6676ae4209fab1573" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.112404 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.121366 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.127963 4947 scope.go:117] "RemoveContainer" containerID="d87586511f96368d02332320b8070531b8e4c9823a90eea78876b046f2488da5" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.154057 4947 scope.go:117] "RemoveContainer" containerID="e4a196d70b59a52f0de0490ca863f45fe4efbfa9a908b2cfde82cec941e4b30f" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.173410 4947 scope.go:117] "RemoveContainer" containerID="bea9b5a2830e45fb27873f63fdf3c2659562adb3f096b0f77513dea99befbef1" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.192055 4947 scope.go:117] "RemoveContainer" containerID="7579cc3029f2cccd754569c3be548ada24f443eaf1602cc6cdbee7d860630040" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.199476 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc7vg\" (UniqueName: \"kubernetes.io/projected/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-kube-api-access-pc7vg\") pod \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.199526 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-run\") pod \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " Dec 03 07:13:03 crc 
kubenswrapper[4947]: I1203 07:13:03.199641 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-scripts\") pod \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.199687 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-lib\") pod \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.199665 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-run" (OuterVolumeSpecName: "var-run") pod "7b86a5d2-1933-4a2f-97de-f3b49985fbf8" (UID: "7b86a5d2-1933-4a2f-97de-f3b49985fbf8"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.199713 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-log\") pod \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.199735 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-log" (OuterVolumeSpecName: "var-log") pod "7b86a5d2-1933-4a2f-97de-f3b49985fbf8" (UID: "7b86a5d2-1933-4a2f-97de-f3b49985fbf8"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.199809 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-etc-ovs\") pod \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\" (UID: \"7b86a5d2-1933-4a2f-97de-f3b49985fbf8\") " Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.199925 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-lib" (OuterVolumeSpecName: "var-lib") pod "7b86a5d2-1933-4a2f-97de-f3b49985fbf8" (UID: "7b86a5d2-1933-4a2f-97de-f3b49985fbf8"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.200163 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "7b86a5d2-1933-4a2f-97de-f3b49985fbf8" (UID: "7b86a5d2-1933-4a2f-97de-f3b49985fbf8"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.200477 4947 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-lib\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.200524 4947 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.200537 4947 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.200551 4947 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.200795 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-scripts" (OuterVolumeSpecName: "scripts") pod "7b86a5d2-1933-4a2f-97de-f3b49985fbf8" (UID: "7b86a5d2-1933-4a2f-97de-f3b49985fbf8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.202760 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-kube-api-access-pc7vg" (OuterVolumeSpecName: "kube-api-access-pc7vg") pod "7b86a5d2-1933-4a2f-97de-f3b49985fbf8" (UID: "7b86a5d2-1933-4a2f-97de-f3b49985fbf8"). InnerVolumeSpecName "kube-api-access-pc7vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.209197 4947 scope.go:117] "RemoveContainer" containerID="66517c20f9bd5a79033d770b7fe6acd20f04d2ccb83413adddf9ce2d92f48b06" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.250406 4947 scope.go:117] "RemoveContainer" containerID="d934198eccfc8bc4a5f9d891474951c5c3b59ba02e3e53ad44a55f4895165461" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.283634 4947 scope.go:117] "RemoveContainer" containerID="9f55040331ca3d502b5c2088f0718ed298df96d80034f2b8e129b91f1d02388f" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.301299 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc7vg\" (UniqueName: \"kubernetes.io/projected/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-kube-api-access-pc7vg\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.301332 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b86a5d2-1933-4a2f-97de-f3b49985fbf8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.309446 4947 scope.go:117] "RemoveContainer" containerID="e8d2991a67aac964ce17db312462ba4a29fb541e49ec2b41398e5be186932d4d" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.334086 4947 scope.go:117] "RemoveContainer" containerID="d09ae4508992ad6594ab0cf49b18c9dde1d758bb9614d6152f7610d5543b8ba4" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.355870 4947 scope.go:117] "RemoveContainer" containerID="d8b3c8f44f2e233d211d7dc57ffed40c7ab6c7b15d021bbb35a7dbf134eff941" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.377696 4947 scope.go:117] "RemoveContainer" containerID="d3eb07996d43bedac1f1d1fa489d736d81b2c8f48876740c85c31fdcd4f49d77" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.397315 4947 scope.go:117] "RemoveContainer" 
containerID="9c72d4dda83bcdd5aa991a2138d85cb2117515fc20f7262bda9d4cf7cbb24de8" Dec 03 07:13:03 crc kubenswrapper[4947]: I1203 07:13:03.418842 4947 scope.go:117] "RemoveContainer" containerID="f49bb955ad69cca770cf0f549937dcdd2e2098f40b0afb681932a0b678968068" Dec 03 07:13:04 crc kubenswrapper[4947]: I1203 07:13:04.045484 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-lb94d" Dec 03 07:13:04 crc kubenswrapper[4947]: I1203 07:13:04.085136 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-lb94d"] Dec 03 07:13:04 crc kubenswrapper[4947]: I1203 07:13:04.089590 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-lb94d"] Dec 03 07:13:05 crc kubenswrapper[4947]: I1203 07:13:05.099125 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" path="/var/lib/kubelet/pods/5dc8a280-5a18-41fd-8e61-f51afa973d20/volumes" Dec 03 07:13:05 crc kubenswrapper[4947]: I1203 07:13:05.104069 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" path="/var/lib/kubelet/pods/7b86a5d2-1933-4a2f-97de-f3b49985fbf8/volumes" Dec 03 07:13:10 crc kubenswrapper[4947]: E1203 07:13:10.323428 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:13:10 crc kubenswrapper[4947]: E1203 07:13:10.324120 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts podName:62cff038-cc99-44ca-ba48-a05d16a96b26 nodeName:}" failed. No retries permitted until 2025-12-03 07:13:42.324093567 +0000 UTC m=+1483.585048023 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts") pod "novaapib13d-account-delete-k5vqn" (UID: "62cff038-cc99-44ca-ba48-a05d16a96b26") : configmap "openstack-scripts" not found Dec 03 07:13:10 crc kubenswrapper[4947]: E1203 07:13:10.323556 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:13:10 crc kubenswrapper[4947]: E1203 07:13:10.324764 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts podName:8c562c87-9f66-447f-83ec-05165c95ca25 nodeName:}" failed. No retries permitted until 2025-12-03 07:13:42.324730824 +0000 UTC m=+1483.585685290 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts") pod "novacell0c18d-account-delete-458q7" (UID: "8c562c87-9f66-447f-83ec-05165c95ca25") : configmap "openstack-scripts" not found Dec 03 07:13:10 crc kubenswrapper[4947]: E1203 07:13:10.323530 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:13:10 crc kubenswrapper[4947]: E1203 07:13:10.323660 4947 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 07:13:10 crc kubenswrapper[4947]: E1203 07:13:10.324853 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts podName:e4b56c27-0089-46d0-8cc3-c5788833f135 nodeName:}" failed. No retries permitted until 2025-12-03 07:13:42.324833226 +0000 UTC m=+1483.585787692 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts") pod "neutron896b-account-delete-ddxm6" (UID: "e4b56c27-0089-46d0-8cc3-c5788833f135") : configmap "openstack-scripts" not found Dec 03 07:13:10 crc kubenswrapper[4947]: E1203 07:13:10.325013 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts podName:8c259ef0-fa34-4ef1-a954-a15bd61ed120 nodeName:}" failed. No retries permitted until 2025-12-03 07:13:42.32498714 +0000 UTC m=+1483.585941616 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts") pod "cinder0e6b-account-delete-kmbkl" (UID: "8c259ef0-fa34-4ef1-a954-a15bd61ed120") : configmap "openstack-scripts" not found Dec 03 07:13:10 crc kubenswrapper[4947]: E1203 07:13:10.996235 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4b56c27_0089_46d0_8cc3_c5788833f135.slice/crio-48805411c29e52ae5cc60ad05d70dc1612bb234f5813c62bb0bcba4f5902f170.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4b56c27_0089_46d0_8cc3_c5788833f135.slice/crio-conmon-48805411c29e52ae5cc60ad05d70dc1612bb234f5813c62bb0bcba4f5902f170.scope\": RecentStats: unable to find data in memory cache]" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.140964 4947 generic.go:334] "Generic (PLEG): container finished" podID="e4b56c27-0089-46d0-8cc3-c5788833f135" containerID="48805411c29e52ae5cc60ad05d70dc1612bb234f5813c62bb0bcba4f5902f170" exitCode=137 Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.141005 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron896b-account-delete-ddxm6" event={"ID":"e4b56c27-0089-46d0-8cc3-c5788833f135","Type":"ContainerDied","Data":"48805411c29e52ae5cc60ad05d70dc1612bb234f5813c62bb0bcba4f5902f170"} Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.142809 4947 generic.go:334] "Generic (PLEG): container finished" podID="8c259ef0-fa34-4ef1-a954-a15bd61ed120" containerID="5f5b17e47defdfb32050f77dd94b370b6ccbc70dc782d3ccfc33b54d67378629" exitCode=137 Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.142854 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder0e6b-account-delete-kmbkl" event={"ID":"8c259ef0-fa34-4ef1-a954-a15bd61ed120","Type":"ContainerDied","Data":"5f5b17e47defdfb32050f77dd94b370b6ccbc70dc782d3ccfc33b54d67378629"} Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.142874 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder0e6b-account-delete-kmbkl" event={"ID":"8c259ef0-fa34-4ef1-a954-a15bd61ed120","Type":"ContainerDied","Data":"9b33eb1e0fff543ba65dcf88a2708847c122e0faf1fea3f4daf790529fab53f9"} Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.142884 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b33eb1e0fff543ba65dcf88a2708847c122e0faf1fea3f4daf790529fab53f9" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.143957 4947 generic.go:334] "Generic (PLEG): container finished" podID="62cff038-cc99-44ca-ba48-a05d16a96b26" containerID="5fcded32bf81a07bf90138d60ec52c620c0dcfd0f7192a732e55d09ba69d93dc" exitCode=137 Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.143992 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapib13d-account-delete-k5vqn" event={"ID":"62cff038-cc99-44ca-ba48-a05d16a96b26","Type":"ContainerDied","Data":"5fcded32bf81a07bf90138d60ec52c620c0dcfd0f7192a732e55d09ba69d93dc"} Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.201131 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.209473 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.313597 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.340120 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts\") pod \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\" (UID: \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\") " Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.340960 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c259ef0-fa34-4ef1-a954-a15bd61ed120" (UID: "8c259ef0-fa34-4ef1-a954-a15bd61ed120"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.340991 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e4b56c27-0089-46d0-8cc3-c5788833f135" (UID: "e4b56c27-0089-46d0-8cc3-c5788833f135"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.341030 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts\") pod \"e4b56c27-0089-46d0-8cc3-c5788833f135\" (UID: \"e4b56c27-0089-46d0-8cc3-c5788833f135\") " Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.341080 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9smc\" (UniqueName: \"kubernetes.io/projected/8c259ef0-fa34-4ef1-a954-a15bd61ed120-kube-api-access-z9smc\") pod \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\" (UID: \"8c259ef0-fa34-4ef1-a954-a15bd61ed120\") " Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.341760 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rkcc\" (UniqueName: \"kubernetes.io/projected/e4b56c27-0089-46d0-8cc3-c5788833f135-kube-api-access-6rkcc\") pod \"e4b56c27-0089-46d0-8cc3-c5788833f135\" (UID: \"e4b56c27-0089-46d0-8cc3-c5788833f135\") " Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.342102 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c259ef0-fa34-4ef1-a954-a15bd61ed120-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.342125 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4b56c27-0089-46d0-8cc3-c5788833f135-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.346912 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4b56c27-0089-46d0-8cc3-c5788833f135-kube-api-access-6rkcc" (OuterVolumeSpecName: "kube-api-access-6rkcc") pod 
"e4b56c27-0089-46d0-8cc3-c5788833f135" (UID: "e4b56c27-0089-46d0-8cc3-c5788833f135"). InnerVolumeSpecName "kube-api-access-6rkcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.347735 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c259ef0-fa34-4ef1-a954-a15bd61ed120-kube-api-access-z9smc" (OuterVolumeSpecName: "kube-api-access-z9smc") pod "8c259ef0-fa34-4ef1-a954-a15bd61ed120" (UID: "8c259ef0-fa34-4ef1-a954-a15bd61ed120"). InnerVolumeSpecName "kube-api-access-z9smc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.443813 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts\") pod \"62cff038-cc99-44ca-ba48-a05d16a96b26\" (UID: \"62cff038-cc99-44ca-ba48-a05d16a96b26\") " Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.443880 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h9rj\" (UniqueName: \"kubernetes.io/projected/62cff038-cc99-44ca-ba48-a05d16a96b26-kube-api-access-6h9rj\") pod \"62cff038-cc99-44ca-ba48-a05d16a96b26\" (UID: \"62cff038-cc99-44ca-ba48-a05d16a96b26\") " Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.444143 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9smc\" (UniqueName: \"kubernetes.io/projected/8c259ef0-fa34-4ef1-a954-a15bd61ed120-kube-api-access-z9smc\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.444161 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rkcc\" (UniqueName: \"kubernetes.io/projected/e4b56c27-0089-46d0-8cc3-c5788833f135-kube-api-access-6rkcc\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 
07:13:11.444226 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62cff038-cc99-44ca-ba48-a05d16a96b26" (UID: "62cff038-cc99-44ca-ba48-a05d16a96b26"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.449511 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62cff038-cc99-44ca-ba48-a05d16a96b26-kube-api-access-6h9rj" (OuterVolumeSpecName: "kube-api-access-6h9rj") pod "62cff038-cc99-44ca-ba48-a05d16a96b26" (UID: "62cff038-cc99-44ca-ba48-a05d16a96b26"). InnerVolumeSpecName "kube-api-access-6h9rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.480261 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.544899 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts\") pod \"8c562c87-9f66-447f-83ec-05165c95ca25\" (UID: \"8c562c87-9f66-447f-83ec-05165c95ca25\") " Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.545068 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g85vw\" (UniqueName: \"kubernetes.io/projected/8c562c87-9f66-447f-83ec-05165c95ca25-kube-api-access-g85vw\") pod \"8c562c87-9f66-447f-83ec-05165c95ca25\" (UID: \"8c562c87-9f66-447f-83ec-05165c95ca25\") " Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.545293 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/62cff038-cc99-44ca-ba48-a05d16a96b26-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.545310 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h9rj\" (UniqueName: \"kubernetes.io/projected/62cff038-cc99-44ca-ba48-a05d16a96b26-kube-api-access-6h9rj\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.545378 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c562c87-9f66-447f-83ec-05165c95ca25" (UID: "8c562c87-9f66-447f-83ec-05165c95ca25"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.547434 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c562c87-9f66-447f-83ec-05165c95ca25-kube-api-access-g85vw" (OuterVolumeSpecName: "kube-api-access-g85vw") pod "8c562c87-9f66-447f-83ec-05165c95ca25" (UID: "8c562c87-9f66-447f-83ec-05165c95ca25"). InnerVolumeSpecName "kube-api-access-g85vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.646008 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g85vw\" (UniqueName: \"kubernetes.io/projected/8c562c87-9f66-447f-83ec-05165c95ca25-kube-api-access-g85vw\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:11 crc kubenswrapper[4947]: I1203 07:13:11.646034 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c562c87-9f66-447f-83ec-05165c95ca25-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.153041 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron896b-account-delete-ddxm6" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.153041 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron896b-account-delete-ddxm6" event={"ID":"e4b56c27-0089-46d0-8cc3-c5788833f135","Type":"ContainerDied","Data":"362768caa5c7d95583b0b35cd93dccddd58e68fbc728c4967b010b78a7193957"} Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.153164 4947 scope.go:117] "RemoveContainer" containerID="48805411c29e52ae5cc60ad05d70dc1612bb234f5813c62bb0bcba4f5902f170" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.154578 4947 generic.go:334] "Generic (PLEG): container finished" podID="8c562c87-9f66-447f-83ec-05165c95ca25" containerID="207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08" exitCode=137 Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.154640 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0c18d-account-delete-458q7" event={"ID":"8c562c87-9f66-447f-83ec-05165c95ca25","Type":"ContainerDied","Data":"207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08"} Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.154665 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0c18d-account-delete-458q7" event={"ID":"8c562c87-9f66-447f-83ec-05165c95ca25","Type":"ContainerDied","Data":"ecba32580078a52bc491b554de34924f758e633ada59f370f7dda93c895da862"} Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.154661 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0c18d-account-delete-458q7" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.156690 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapib13d-account-delete-k5vqn" event={"ID":"62cff038-cc99-44ca-ba48-a05d16a96b26","Type":"ContainerDied","Data":"15df791074ddfa5b8c46cf981fc2125134f8625a76d9d6cb4e5cc3fc4f7549e7"} Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.156705 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapib13d-account-delete-k5vqn" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.156697 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder0e6b-account-delete-kmbkl" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.173791 4947 scope.go:117] "RemoveContainer" containerID="207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.194526 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron896b-account-delete-ddxm6"] Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.201068 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron896b-account-delete-ddxm6"] Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.201622 4947 scope.go:117] "RemoveContainer" containerID="207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08" Dec 03 07:13:12 crc kubenswrapper[4947]: E1203 07:13:12.202119 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08\": container with ID starting with 207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08 not found: ID does not exist" containerID="207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 
07:13:12.202202 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08"} err="failed to get container status \"207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08\": rpc error: code = NotFound desc = could not find container \"207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08\": container with ID starting with 207f7321a401ba4b7cf0fa6297e5b92b9beae00b568d12bec3f4e5034144bc08 not found: ID does not exist" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.202230 4947 scope.go:117] "RemoveContainer" containerID="5fcded32bf81a07bf90138d60ec52c620c0dcfd0f7192a732e55d09ba69d93dc" Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.212630 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapib13d-account-delete-k5vqn"] Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.217967 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapib13d-account-delete-k5vqn"] Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.222599 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder0e6b-account-delete-kmbkl"] Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.231111 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder0e6b-account-delete-kmbkl"] Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.236400 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0c18d-account-delete-458q7"] Dec 03 07:13:12 crc kubenswrapper[4947]: I1203 07:13:12.240605 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0c18d-account-delete-458q7"] Dec 03 07:13:13 crc kubenswrapper[4947]: I1203 07:13:13.099237 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62cff038-cc99-44ca-ba48-a05d16a96b26" path="/var/lib/kubelet/pods/62cff038-cc99-44ca-ba48-a05d16a96b26/volumes" Dec 03 
07:13:13 crc kubenswrapper[4947]: I1203 07:13:13.100543 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c259ef0-fa34-4ef1-a954-a15bd61ed120" path="/var/lib/kubelet/pods/8c259ef0-fa34-4ef1-a954-a15bd61ed120/volumes" Dec 03 07:13:13 crc kubenswrapper[4947]: I1203 07:13:13.101821 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c562c87-9f66-447f-83ec-05165c95ca25" path="/var/lib/kubelet/pods/8c562c87-9f66-447f-83ec-05165c95ca25/volumes" Dec 03 07:13:13 crc kubenswrapper[4947]: I1203 07:13:13.102991 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4b56c27-0089-46d0-8cc3-c5788833f135" path="/var/lib/kubelet/pods/e4b56c27-0089-46d0-8cc3-c5788833f135/volumes" Dec 03 07:13:30 crc kubenswrapper[4947]: I1203 07:13:30.086344 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:13:30 crc kubenswrapper[4947]: I1203 07:13:30.087568 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.693134 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fc8q2"] Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.694680 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.694774 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-server" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.694791 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerName="barbican-keystone-listener" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.694803 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerName="barbican-keystone-listener" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.694826 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-metadata" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.694836 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-metadata" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.694852 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add50932-e8ea-4e7a-ab75-6fb1e0463499" containerName="memcached" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.694863 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="add50932-e8ea-4e7a-ab75-6fb1e0463499" containerName="memcached" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.694881 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" containerName="galera" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.694892 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" containerName="galera" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695353 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695380 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695411 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5367165f-75ec-4633-8042-edfe91e3be60" containerName="rabbitmq" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695424 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5367165f-75ec-4633-8042-edfe91e3be60" containerName="rabbitmq" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695451 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb45354-43cd-41e7-a511-95357eb656e5" containerName="glance-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695465 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb45354-43cd-41e7-a511-95357eb656e5" containerName="glance-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695486 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90736cf6-0db1-44a8-b285-4d319f0951f8" containerName="kube-state-metrics" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695525 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="90736cf6-0db1-44a8-b285-4d319f0951f8" containerName="kube-state-metrics" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695541 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1256396c-ee12-469e-864f-c87983516079" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695553 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1256396c-ee12-469e-864f-c87983516079" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695571 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-replicator" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695584 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-replicator" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695601 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695614 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-api" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695638 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerName="neutron-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695651 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerName="neutron-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695678 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server-init" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695691 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server-init" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695707 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c259ef0-fa34-4ef1-a954-a15bd61ed120" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695719 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c259ef0-fa34-4ef1-a954-a15bd61ed120" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695752 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695766 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api-log" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695793 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-updater" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695806 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-updater" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695832 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerName="barbican-keystone-listener-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695845 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerName="barbican-keystone-listener-log" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695870 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c0f438-8d06-433e-af68-49ccacd9a017" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695883 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c0f438-8d06-433e-af68-49ccacd9a017" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695905 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695918 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-server" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695935 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" containerName="nova-cell1-conductor-conductor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695950 4947 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" containerName="nova-cell1-conductor-conductor" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.695971 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.695984 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696006 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-replicator" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696019 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-replicator" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696071 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb45354-43cd-41e7-a511-95357eb656e5" containerName="glance-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696090 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb45354-43cd-41e7-a511-95357eb656e5" containerName="glance-log" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696114 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696123 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-server" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696138 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696150 4947 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-server" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696167 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerName="barbican-worker" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696177 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerName="barbican-worker" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696209 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696219 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-log" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696230 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a94dde-b399-408b-93f2-488d02be7f07" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696241 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a94dde-b399-408b-93f2-488d02be7f07" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696260 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerName="placement-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696270 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerName="placement-api" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696290 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="ovn-northd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696300 4947 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="ovn-northd" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696312 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="ceilometer-central-agent" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696324 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="ceilometer-central-agent" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696339 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" containerName="galera" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696349 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" containerName="galera" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696361 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="rsync" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696371 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="rsync" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696389 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="swift-recon-cron" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696402 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="swift-recon-cron" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696427 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="proxy-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696438 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="proxy-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696448 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c562c87-9f66-447f-83ec-05165c95ca25" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696458 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c562c87-9f66-447f-83ec-05165c95ca25" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696478 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-auditor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696504 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-auditor" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696522 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-auditor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696532 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-auditor" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696545 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696557 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696572 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696581 4947 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696591 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62cff038-cc99-44ca-ba48-a05d16a96b26" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696603 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="62cff038-cc99-44ca-ba48-a05d16a96b26" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696619 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c475475-916c-4267-8064-f932c04d0df2" containerName="ovsdbserver-nb" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696629 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c475475-916c-4267-8064-f932c04d0df2" containerName="ovsdbserver-nb" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696641 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696651 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696665 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerName="cinder-scheduler" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696675 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerName="cinder-scheduler" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696691 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerName="glance-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696700 4947 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerName="glance-log" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696715 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a227e4-8c2a-4880-9944-877640627cd0" containerName="dnsmasq-dns" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696725 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a227e4-8c2a-4880-9944-877640627cd0" containerName="dnsmasq-dns" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696966 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d7ef1d-a0bf-465f-baad-1bc3a71618ff" containerName="setup-container" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.696981 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d7ef1d-a0bf-465f-baad-1bc3a71618ff" containerName="setup-container" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.696999 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acfeae50-8757-4baf-a16a-c33ae100fdf2" containerName="keystone-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697014 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="acfeae50-8757-4baf-a16a-c33ae100fdf2" containerName="keystone-api" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697032 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerName="probe" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697045 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerName="probe" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697067 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" containerName="mysql-bootstrap" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697081 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" containerName="mysql-bootstrap" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697129 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697175 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-log" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697192 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4b56c27-0089-46d0-8cc3-c5788833f135" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697203 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4b56c27-0089-46d0-8cc3-c5788833f135" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697224 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerName="placement-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697276 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerName="placement-log" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697291 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51d7ef1d-a0bf-465f-baad-1bc3a71618ff" containerName="rabbitmq" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697344 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="51d7ef1d-a0bf-465f-baad-1bc3a71618ff" containerName="rabbitmq" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697401 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697630 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697675 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697689 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697708 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3735b7db-e9a7-4be6-9c74-cad0131f2c0b" containerName="nova-scheduler-scheduler" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697721 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3735b7db-e9a7-4be6-9c74-cad0131f2c0b" containerName="nova-scheduler-scheduler" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697739 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-reaper" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697753 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-reaper" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697773 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="sg-core" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697787 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="sg-core" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697837 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="ceilometer-notification-agent" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697881 4947 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="ceilometer-notification-agent" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697904 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5367165f-75ec-4633-8042-edfe91e3be60" containerName="setup-container" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697913 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5367165f-75ec-4633-8042-edfe91e3be60" containerName="setup-container" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.697925 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-auditor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.697934 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-auditor" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698180 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698195 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api-log" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698220 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerName="neutron-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698234 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerName="neutron-api" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698248 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerName="glance-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698262 4947 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerName="glance-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698279 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a227e4-8c2a-4880-9944-877640627cd0" containerName="init" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698293 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a227e4-8c2a-4880-9944-877640627cd0" containerName="init" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698354 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" containerName="mysql-bootstrap" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698416 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" containerName="mysql-bootstrap" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698433 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-replicator" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698443 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-replicator" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698699 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-expirer" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698718 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-expirer" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698743 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3496bac7-6b31-4ba8-a490-14bff1522b8c" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698757 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3496bac7-6b31-4ba8-a490-14bff1522b8c" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698771 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-updater" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698783 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-updater" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698810 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c475475-916c-4267-8064-f932c04d0df2" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698823 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c475475-916c-4267-8064-f932c04d0df2" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698838 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e80985eb-c6e0-4ffc-9b98-b1c92be266eb" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698851 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e80985eb-c6e0-4ffc-9b98-b1c92be266eb" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698871 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerName="barbican-worker-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698884 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerName="barbican-worker-log" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698898 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 
07:13:52.698911 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.698945 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerName="ovsdbserver-sb" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.698958 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerName="ovsdbserver-sb" Dec 03 07:13:52 crc kubenswrapper[4947]: E1203 07:13:52.699000 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0a57ca-d063-4d27-ac54-f5431cca2971" containerName="nova-cell0-conductor-conductor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699014 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0a57ca-d063-4d27-ac54-f5431cca2971" containerName="nova-cell0-conductor-conductor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699408 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699452 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a227e4-8c2a-4880-9944-877640627cd0" containerName="dnsmasq-dns" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699469 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3496bac7-6b31-4ba8-a490-14bff1522b8c" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699485 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc40b18-511b-4bd7-bb2c-3dc868c6dcec" containerName="galera" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699541 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5367165f-75ec-4633-8042-edfe91e3be60" containerName="rabbitmq" Dec 03 07:13:52 
crc kubenswrapper[4947]: I1203 07:13:52.699558 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-metadata" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699573 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="51d7ef1d-a0bf-465f-baad-1bc3a71618ff" containerName="rabbitmq" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699590 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699610 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699621 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerName="glance-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699635 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-reaper" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699645 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerName="barbican-keystone-listener" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699667 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699687 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="90736cf6-0db1-44a8-b285-4d319f0951f8" containerName="kube-state-metrics" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699702 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" 
containerName="placement-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699713 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b9bf09-0d94-4520-b783-7eb3fb4b79d4" containerName="glance-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699725 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c562c87-9f66-447f-83ec-05165c95ca25" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699741 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="ceilometer-notification-agent" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699760 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c475475-916c-4267-8064-f932c04d0df2" containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699777 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb45354-43cd-41e7-a511-95357eb656e5" containerName="glance-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699790 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="855f6436-68f0-42d8-a12a-bf25632440c1" containerName="ovn-northd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699808 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-replicator" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699826 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc5d79a6-e1fe-4b2b-a51d-a580f5560bf8" containerName="galera" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699836 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="rsync" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699856 4947 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="add50932-e8ea-4e7a-ab75-6fb1e0463499" containerName="memcached" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699868 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e80985eb-c6e0-4ffc-9b98-b1c92be266eb" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699890 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="sg-core" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699904 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-replicator" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699917 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c259ef0-fa34-4ef1-a954-a15bd61ed120" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699933 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-auditor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699947 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerName="cinder-scheduler" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699962 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="49caa6da-3c98-4c49-ab22-62121ff908cf" containerName="probe" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699982 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a94dde-b399-408b-93f2-488d02be7f07" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.699997 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerName="neutron-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700018 4947 
memory_manager.go:354] "RemoveStaleState removing state" podUID="44c0f438-8d06-433e-af68-49ccacd9a017" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700036 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700050 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700066 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f87802c-3846-486e-a131-39a7fe336c96" containerName="proxy-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700083 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerName="barbican-worker" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700094 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="proxy-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700106 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="98211c56-fd23-46c2-9710-31fc562e2182" containerName="ovn-controller" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700127 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-expirer" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700144 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700157 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="62cff038-cc99-44ca-ba48-a05d16a96b26" containerName="mariadb-account-delete" Dec 03 07:13:52 crc 
kubenswrapper[4947]: I1203 07:13:52.700176 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovs-vswitchd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.700192 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c475475-916c-4267-8064-f932c04d0df2" containerName="ovsdbserver-nb" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703584 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b86a5d2-1933-4a2f-97de-f3b49985fbf8" containerName="ovsdb-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703608 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="account-auditor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703622 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb28a2e-a946-4407-be07-6ac8eaad8ab1" containerName="neutron-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703639 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-updater" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703654 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-auditor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703669 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4b931b-69ee-4ff2-b01a-85d45fc93ec4" containerName="nova-cell1-conductor-conductor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703683 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dca7534-e6d7-4cd9-88bb-4d8f7dff75d3" containerName="barbican-keystone-listener-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703699 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" 
containerName="openstack-network-exporter" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703709 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="swift-recon-cron" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703720 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="acfeae50-8757-4baf-a16a-c33ae100fdf2" containerName="keystone-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703734 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d05028-c68a-4438-afcd-161c4f974b08" containerName="nova-api-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703750 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-updater" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703762 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a879185-9b9a-45a5-a211-c61faf308cbb" containerName="cinder-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703779 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="90b682d1-68e6-49a4-83a6-51b1b40b7e99" containerName="ovsdbserver-sb" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703792 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="object-replicator" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.703807 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bdabb7-a612-499f-855c-74da636d845a" containerName="ceilometer-central-agent" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705625 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0a57ca-d063-4d27-ac54-f5431cca2971" containerName="nova-cell0-conductor-conductor" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705652 4947 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="899d3d67-ec63-4d5f-ad93-c40003578347" containerName="nova-metadata-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705672 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dc8a280-5a18-41fd-8e61-f51afa973d20" containerName="container-server" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705690 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb45354-43cd-41e7-a511-95357eb656e5" containerName="glance-httpd" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705707 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed654f44-78c3-4118-83d5-e2a5d917c4f4" containerName="barbican-worker-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705724 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3735b7db-e9a7-4be6-9c74-cad0131f2c0b" containerName="nova-scheduler-scheduler" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705735 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e9d6dc-e814-485d-842b-9266732c7924" containerName="barbican-api" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705750 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4b56c27-0089-46d0-8cc3-c5788833f135" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705767 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c3c47d5-30e8-4c5f-93fe-e0d944cdc998" containerName="placement-log" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.705777 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1256396c-ee12-469e-864f-c87983516079" containerName="mariadb-account-delete" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.708104 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fc8q2"] Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.708225 4947 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.892151 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4w92\" (UniqueName: \"kubernetes.io/projected/fb6808ac-ff5d-49d3-a325-de242d82e4f7-kube-api-access-v4w92\") pod \"certified-operators-fc8q2\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.892387 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-catalog-content\") pod \"certified-operators-fc8q2\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.892449 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-utilities\") pod \"certified-operators-fc8q2\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.993934 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-catalog-content\") pod \"certified-operators-fc8q2\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.994012 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-utilities\") pod \"certified-operators-fc8q2\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.994119 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4w92\" (UniqueName: \"kubernetes.io/projected/fb6808ac-ff5d-49d3-a325-de242d82e4f7-kube-api-access-v4w92\") pod \"certified-operators-fc8q2\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.994660 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-utilities\") pod \"certified-operators-fc8q2\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:52 crc kubenswrapper[4947]: I1203 07:13:52.994756 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-catalog-content\") pod \"certified-operators-fc8q2\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:53 crc kubenswrapper[4947]: I1203 07:13:53.025758 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4w92\" (UniqueName: \"kubernetes.io/projected/fb6808ac-ff5d-49d3-a325-de242d82e4f7-kube-api-access-v4w92\") pod \"certified-operators-fc8q2\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:53 crc kubenswrapper[4947]: I1203 07:13:53.035215 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:13:53 crc kubenswrapper[4947]: I1203 07:13:53.571262 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fc8q2"] Dec 03 07:13:53 crc kubenswrapper[4947]: I1203 07:13:53.593268 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc8q2" event={"ID":"fb6808ac-ff5d-49d3-a325-de242d82e4f7","Type":"ContainerStarted","Data":"c0627d697547a9c73c5e5dbc29aa638a31705af0246fe345b0a4b5e03a88181e"} Dec 03 07:13:54 crc kubenswrapper[4947]: I1203 07:13:54.603957 4947 generic.go:334] "Generic (PLEG): container finished" podID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerID="16d5d69531488cd8c4644a96b3545776666e9679c9bb5b843e2e9c2c278bd73b" exitCode=0 Dec 03 07:13:54 crc kubenswrapper[4947]: I1203 07:13:54.604057 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc8q2" event={"ID":"fb6808ac-ff5d-49d3-a325-de242d82e4f7","Type":"ContainerDied","Data":"16d5d69531488cd8c4644a96b3545776666e9679c9bb5b843e2e9c2c278bd73b"} Dec 03 07:13:55 crc kubenswrapper[4947]: I1203 07:13:55.616796 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc8q2" event={"ID":"fb6808ac-ff5d-49d3-a325-de242d82e4f7","Type":"ContainerStarted","Data":"5166f9b654dcbcb8386aabcbe0d8ddba0a700f6dac0f47a38a57009f8c4d18f6"} Dec 03 07:13:56 crc kubenswrapper[4947]: I1203 07:13:56.632436 4947 generic.go:334] "Generic (PLEG): container finished" podID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerID="5166f9b654dcbcb8386aabcbe0d8ddba0a700f6dac0f47a38a57009f8c4d18f6" exitCode=0 Dec 03 07:13:56 crc kubenswrapper[4947]: I1203 07:13:56.632544 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc8q2" 
event={"ID":"fb6808ac-ff5d-49d3-a325-de242d82e4f7","Type":"ContainerDied","Data":"5166f9b654dcbcb8386aabcbe0d8ddba0a700f6dac0f47a38a57009f8c4d18f6"} Dec 03 07:13:57 crc kubenswrapper[4947]: I1203 07:13:57.643627 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc8q2" event={"ID":"fb6808ac-ff5d-49d3-a325-de242d82e4f7","Type":"ContainerStarted","Data":"a1307dfaf771235f53ac78e0b6cf6d415f38b948496bbf9915b8a41d323fb87a"} Dec 03 07:13:57 crc kubenswrapper[4947]: I1203 07:13:57.664922 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fc8q2" podStartSLOduration=3.117264135 podStartE2EDuration="5.664897848s" podCreationTimestamp="2025-12-03 07:13:52 +0000 UTC" firstStartedPulling="2025-12-03 07:13:54.606027624 +0000 UTC m=+1495.866982080" lastFinishedPulling="2025-12-03 07:13:57.153661367 +0000 UTC m=+1498.414615793" observedRunningTime="2025-12-03 07:13:57.662140413 +0000 UTC m=+1498.923094879" watchObservedRunningTime="2025-12-03 07:13:57.664897848 +0000 UTC m=+1498.925852284" Dec 03 07:14:00 crc kubenswrapper[4947]: I1203 07:14:00.086100 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:14:00 crc kubenswrapper[4947]: I1203 07:14:00.086518 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:14:00 crc kubenswrapper[4947]: I1203 07:14:00.086575 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:14:00 crc kubenswrapper[4947]: I1203 07:14:00.087261 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:14:00 crc kubenswrapper[4947]: I1203 07:14:00.087357 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" gracePeriod=600 Dec 03 07:14:00 crc kubenswrapper[4947]: E1203 07:14:00.214403 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:14:00 crc kubenswrapper[4947]: I1203 07:14:00.675607 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" exitCode=0 Dec 03 07:14:00 crc kubenswrapper[4947]: I1203 07:14:00.675688 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946"} Dec 03 07:14:00 crc 
kubenswrapper[4947]: I1203 07:14:00.675966 4947 scope.go:117] "RemoveContainer" containerID="c728d116f3c53bcc6037be4215b8c5da0c570fa9f0fddd2b5bb621a8286fe726" Dec 03 07:14:00 crc kubenswrapper[4947]: I1203 07:14:00.676364 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:14:00 crc kubenswrapper[4947]: E1203 07:14:00.676595 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.136370 4947 scope.go:117] "RemoveContainer" containerID="c04ea9b7a8fa051bc8f8f6e63fce2b71c7ecf7839a65f3ca53d45b4449c04110" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.182124 4947 scope.go:117] "RemoveContainer" containerID="62aa129fc147cc1180b141d3b348f304094abc5ad47b8dddaf94bcc554ab8860" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.233227 4947 scope.go:117] "RemoveContainer" containerID="73f4afbf254709699e2004489bb7cfd91169baaf41f65098687b729bf3f60e60" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.299851 4947 scope.go:117] "RemoveContainer" containerID="a00403c9e5c9c0356f8cda0bb2f5ea33101133e03ecec9269cd5a9a058bc1298" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.338227 4947 scope.go:117] "RemoveContainer" containerID="f444dd7f6f1caddc9717139386579ed806f4ee5f7bb6faeee2f25587865e1c39" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.361685 4947 scope.go:117] "RemoveContainer" containerID="a79ca432e119cfe4382eac1658d9f1475820f86e884d9d267beaad6d791c9e27" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.383823 4947 scope.go:117] 
"RemoveContainer" containerID="a97cfb598af771fadb5726f8c4ca3b03aba98f58c7aa78e79de0f3b339e5fad7" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.406954 4947 scope.go:117] "RemoveContainer" containerID="9d943d955b4387689a8d712809d10af11da1db79e92a195ba57d31cce0773125" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.434501 4947 scope.go:117] "RemoveContainer" containerID="9e770c8d82efa23dd527c2a4232bde18aef5979e147a5871d763e136d274533e" Dec 03 07:14:01 crc kubenswrapper[4947]: I1203 07:14:01.456704 4947 scope.go:117] "RemoveContainer" containerID="3ce586afd15fbdd0b4c1dfbc0475b8228a9be2dd3ee32c7a7f80750aafce6e2f" Dec 03 07:14:03 crc kubenswrapper[4947]: I1203 07:14:03.036337 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:14:03 crc kubenswrapper[4947]: I1203 07:14:03.037254 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:14:03 crc kubenswrapper[4947]: I1203 07:14:03.097896 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:14:03 crc kubenswrapper[4947]: I1203 07:14:03.764756 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:14:03 crc kubenswrapper[4947]: I1203 07:14:03.810635 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fc8q2"] Dec 03 07:14:05 crc kubenswrapper[4947]: I1203 07:14:05.746761 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fc8q2" podUID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerName="registry-server" containerID="cri-o://a1307dfaf771235f53ac78e0b6cf6d415f38b948496bbf9915b8a41d323fb87a" gracePeriod=2 Dec 03 07:14:06 crc kubenswrapper[4947]: I1203 
07:14:06.762152 4947 generic.go:334] "Generic (PLEG): container finished" podID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerID="a1307dfaf771235f53ac78e0b6cf6d415f38b948496bbf9915b8a41d323fb87a" exitCode=0 Dec 03 07:14:06 crc kubenswrapper[4947]: I1203 07:14:06.762225 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc8q2" event={"ID":"fb6808ac-ff5d-49d3-a325-de242d82e4f7","Type":"ContainerDied","Data":"a1307dfaf771235f53ac78e0b6cf6d415f38b948496bbf9915b8a41d323fb87a"} Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.310438 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.430502 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-catalog-content\") pod \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.430614 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4w92\" (UniqueName: \"kubernetes.io/projected/fb6808ac-ff5d-49d3-a325-de242d82e4f7-kube-api-access-v4w92\") pod \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.430661 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-utilities\") pod \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\" (UID: \"fb6808ac-ff5d-49d3-a325-de242d82e4f7\") " Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.431911 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-utilities" (OuterVolumeSpecName: "utilities") pod "fb6808ac-ff5d-49d3-a325-de242d82e4f7" (UID: "fb6808ac-ff5d-49d3-a325-de242d82e4f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.438213 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb6808ac-ff5d-49d3-a325-de242d82e4f7-kube-api-access-v4w92" (OuterVolumeSpecName: "kube-api-access-v4w92") pod "fb6808ac-ff5d-49d3-a325-de242d82e4f7" (UID: "fb6808ac-ff5d-49d3-a325-de242d82e4f7"). InnerVolumeSpecName "kube-api-access-v4w92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.501388 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb6808ac-ff5d-49d3-a325-de242d82e4f7" (UID: "fb6808ac-ff5d-49d3-a325-de242d82e4f7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.532452 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4w92\" (UniqueName: \"kubernetes.io/projected/fb6808ac-ff5d-49d3-a325-de242d82e4f7-kube-api-access-v4w92\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.532482 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.532513 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb6808ac-ff5d-49d3-a325-de242d82e4f7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.775069 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fc8q2" event={"ID":"fb6808ac-ff5d-49d3-a325-de242d82e4f7","Type":"ContainerDied","Data":"c0627d697547a9c73c5e5dbc29aa638a31705af0246fe345b0a4b5e03a88181e"} Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.775153 4947 scope.go:117] "RemoveContainer" containerID="a1307dfaf771235f53ac78e0b6cf6d415f38b948496bbf9915b8a41d323fb87a" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.775429 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fc8q2" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.803295 4947 scope.go:117] "RemoveContainer" containerID="5166f9b654dcbcb8386aabcbe0d8ddba0a700f6dac0f47a38a57009f8c4d18f6" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.826571 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fc8q2"] Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.831244 4947 scope.go:117] "RemoveContainer" containerID="16d5d69531488cd8c4644a96b3545776666e9679c9bb5b843e2e9c2c278bd73b" Dec 03 07:14:07 crc kubenswrapper[4947]: I1203 07:14:07.834536 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fc8q2"] Dec 03 07:14:09 crc kubenswrapper[4947]: I1203 07:14:09.098840 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" path="/var/lib/kubelet/pods/fb6808ac-ff5d-49d3-a325-de242d82e4f7/volumes" Dec 03 07:14:14 crc kubenswrapper[4947]: I1203 07:14:14.083146 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:14:14 crc kubenswrapper[4947]: E1203 07:14:14.084117 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:14:25 crc kubenswrapper[4947]: I1203 07:14:25.084193 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:14:25 crc kubenswrapper[4947]: E1203 07:14:25.085191 4947 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.722723 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cfhsk"] Dec 03 07:14:37 crc kubenswrapper[4947]: E1203 07:14:37.723947 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerName="registry-server" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.723980 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerName="registry-server" Dec 03 07:14:37 crc kubenswrapper[4947]: E1203 07:14:37.724031 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerName="extract-utilities" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.724048 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerName="extract-utilities" Dec 03 07:14:37 crc kubenswrapper[4947]: E1203 07:14:37.724071 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerName="extract-content" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.724089 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerName="extract-content" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.724416 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb6808ac-ff5d-49d3-a325-de242d82e4f7" containerName="registry-server" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 
07:14:37.726566 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.737137 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfhsk"] Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.841579 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6x4\" (UniqueName: \"kubernetes.io/projected/975fdf1a-16fb-4764-a1a2-1d93b198284d-kube-api-access-8p6x4\") pod \"community-operators-cfhsk\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.842065 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-catalog-content\") pod \"community-operators-cfhsk\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.842321 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-utilities\") pod \"community-operators-cfhsk\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.943349 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-catalog-content\") pod \"community-operators-cfhsk\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:37 crc 
kubenswrapper[4947]: I1203 07:14:37.943416 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-utilities\") pod \"community-operators-cfhsk\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.943450 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6x4\" (UniqueName: \"kubernetes.io/projected/975fdf1a-16fb-4764-a1a2-1d93b198284d-kube-api-access-8p6x4\") pod \"community-operators-cfhsk\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.944128 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-catalog-content\") pod \"community-operators-cfhsk\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.944158 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-utilities\") pod \"community-operators-cfhsk\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:37 crc kubenswrapper[4947]: I1203 07:14:37.972664 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6x4\" (UniqueName: \"kubernetes.io/projected/975fdf1a-16fb-4764-a1a2-1d93b198284d-kube-api-access-8p6x4\") pod \"community-operators-cfhsk\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:38 crc kubenswrapper[4947]: I1203 
07:14:38.053389 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:38 crc kubenswrapper[4947]: I1203 07:14:38.571562 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfhsk"] Dec 03 07:14:39 crc kubenswrapper[4947]: I1203 07:14:39.086760 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:14:39 crc kubenswrapper[4947]: E1203 07:14:39.086966 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:14:39 crc kubenswrapper[4947]: I1203 07:14:39.132407 4947 generic.go:334] "Generic (PLEG): container finished" podID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerID="a7901be306813058373af9686ed0970ba288a36417fa6c95c11734f32a34ad19" exitCode=0 Dec 03 07:14:39 crc kubenswrapper[4947]: I1203 07:14:39.132449 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfhsk" event={"ID":"975fdf1a-16fb-4764-a1a2-1d93b198284d","Type":"ContainerDied","Data":"a7901be306813058373af9686ed0970ba288a36417fa6c95c11734f32a34ad19"} Dec 03 07:14:39 crc kubenswrapper[4947]: I1203 07:14:39.132479 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfhsk" event={"ID":"975fdf1a-16fb-4764-a1a2-1d93b198284d","Type":"ContainerStarted","Data":"fb7c055b8785e85010d2cf5aa4864112df0b8ee343a6abf09a3b3da8d4a4637a"} Dec 03 07:14:40 crc kubenswrapper[4947]: I1203 07:14:40.142848 4947 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-cfhsk" event={"ID":"975fdf1a-16fb-4764-a1a2-1d93b198284d","Type":"ContainerStarted","Data":"f19b80834b8397cea59643ed8c19e2bd98f0cbe2a9d26dc28dfa5ef27173aead"} Dec 03 07:14:41 crc kubenswrapper[4947]: I1203 07:14:41.159642 4947 generic.go:334] "Generic (PLEG): container finished" podID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerID="f19b80834b8397cea59643ed8c19e2bd98f0cbe2a9d26dc28dfa5ef27173aead" exitCode=0 Dec 03 07:14:41 crc kubenswrapper[4947]: I1203 07:14:41.159706 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfhsk" event={"ID":"975fdf1a-16fb-4764-a1a2-1d93b198284d","Type":"ContainerDied","Data":"f19b80834b8397cea59643ed8c19e2bd98f0cbe2a9d26dc28dfa5ef27173aead"} Dec 03 07:14:42 crc kubenswrapper[4947]: I1203 07:14:42.171314 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfhsk" event={"ID":"975fdf1a-16fb-4764-a1a2-1d93b198284d","Type":"ContainerStarted","Data":"38a23b9d981273bd086654f1de0cf258f37b161e6685274bcb88161adcacea13"} Dec 03 07:14:42 crc kubenswrapper[4947]: I1203 07:14:42.193775 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cfhsk" podStartSLOduration=2.6148514179999998 podStartE2EDuration="5.193754751s" podCreationTimestamp="2025-12-03 07:14:37 +0000 UTC" firstStartedPulling="2025-12-03 07:14:39.133791787 +0000 UTC m=+1540.394746213" lastFinishedPulling="2025-12-03 07:14:41.71269511 +0000 UTC m=+1542.973649546" observedRunningTime="2025-12-03 07:14:42.186547895 +0000 UTC m=+1543.447502321" watchObservedRunningTime="2025-12-03 07:14:42.193754751 +0000 UTC m=+1543.454709187" Dec 03 07:14:48 crc kubenswrapper[4947]: I1203 07:14:48.054725 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:48 crc kubenswrapper[4947]: 
I1203 07:14:48.055133 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:48 crc kubenswrapper[4947]: I1203 07:14:48.136072 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:48 crc kubenswrapper[4947]: I1203 07:14:48.280081 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:48 crc kubenswrapper[4947]: I1203 07:14:48.379115 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfhsk"] Dec 03 07:14:50 crc kubenswrapper[4947]: I1203 07:14:50.239880 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cfhsk" podUID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerName="registry-server" containerID="cri-o://38a23b9d981273bd086654f1de0cf258f37b161e6685274bcb88161adcacea13" gracePeriod=2 Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.257927 4947 generic.go:334] "Generic (PLEG): container finished" podID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerID="38a23b9d981273bd086654f1de0cf258f37b161e6685274bcb88161adcacea13" exitCode=0 Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.258063 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfhsk" event={"ID":"975fdf1a-16fb-4764-a1a2-1d93b198284d","Type":"ContainerDied","Data":"38a23b9d981273bd086654f1de0cf258f37b161e6685274bcb88161adcacea13"} Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.740368 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.864526 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-catalog-content\") pod \"975fdf1a-16fb-4764-a1a2-1d93b198284d\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.864613 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p6x4\" (UniqueName: \"kubernetes.io/projected/975fdf1a-16fb-4764-a1a2-1d93b198284d-kube-api-access-8p6x4\") pod \"975fdf1a-16fb-4764-a1a2-1d93b198284d\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.864733 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-utilities\") pod \"975fdf1a-16fb-4764-a1a2-1d93b198284d\" (UID: \"975fdf1a-16fb-4764-a1a2-1d93b198284d\") " Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.866123 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-utilities" (OuterVolumeSpecName: "utilities") pod "975fdf1a-16fb-4764-a1a2-1d93b198284d" (UID: "975fdf1a-16fb-4764-a1a2-1d93b198284d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.870648 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975fdf1a-16fb-4764-a1a2-1d93b198284d-kube-api-access-8p6x4" (OuterVolumeSpecName: "kube-api-access-8p6x4") pod "975fdf1a-16fb-4764-a1a2-1d93b198284d" (UID: "975fdf1a-16fb-4764-a1a2-1d93b198284d"). InnerVolumeSpecName "kube-api-access-8p6x4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.933758 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "975fdf1a-16fb-4764-a1a2-1d93b198284d" (UID: "975fdf1a-16fb-4764-a1a2-1d93b198284d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.965876 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p6x4\" (UniqueName: \"kubernetes.io/projected/975fdf1a-16fb-4764-a1a2-1d93b198284d-kube-api-access-8p6x4\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.965909 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:51 crc kubenswrapper[4947]: I1203 07:14:51.965921 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/975fdf1a-16fb-4764-a1a2-1d93b198284d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:52 crc kubenswrapper[4947]: I1203 07:14:52.083462 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:14:52 crc kubenswrapper[4947]: E1203 07:14:52.083909 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:14:52 
crc kubenswrapper[4947]: I1203 07:14:52.271707 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfhsk" event={"ID":"975fdf1a-16fb-4764-a1a2-1d93b198284d","Type":"ContainerDied","Data":"fb7c055b8785e85010d2cf5aa4864112df0b8ee343a6abf09a3b3da8d4a4637a"} Dec 03 07:14:52 crc kubenswrapper[4947]: I1203 07:14:52.271747 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfhsk" Dec 03 07:14:52 crc kubenswrapper[4947]: I1203 07:14:52.271762 4947 scope.go:117] "RemoveContainer" containerID="38a23b9d981273bd086654f1de0cf258f37b161e6685274bcb88161adcacea13" Dec 03 07:14:52 crc kubenswrapper[4947]: I1203 07:14:52.293442 4947 scope.go:117] "RemoveContainer" containerID="f19b80834b8397cea59643ed8c19e2bd98f0cbe2a9d26dc28dfa5ef27173aead" Dec 03 07:14:52 crc kubenswrapper[4947]: I1203 07:14:52.313417 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cfhsk"] Dec 03 07:14:52 crc kubenswrapper[4947]: I1203 07:14:52.319429 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cfhsk"] Dec 03 07:14:52 crc kubenswrapper[4947]: I1203 07:14:52.333264 4947 scope.go:117] "RemoveContainer" containerID="a7901be306813058373af9686ed0970ba288a36417fa6c95c11734f32a34ad19" Dec 03 07:14:53 crc kubenswrapper[4947]: I1203 07:14:53.099625 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975fdf1a-16fb-4764-a1a2-1d93b198284d" path="/var/lib/kubelet/pods/975fdf1a-16fb-4764-a1a2-1d93b198284d/volumes" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.161268 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb"] Dec 03 07:15:00 crc kubenswrapper[4947]: E1203 07:15:00.162474 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerName="registry-server" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.162556 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerName="registry-server" Dec 03 07:15:00 crc kubenswrapper[4947]: E1203 07:15:00.162582 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerName="extract-utilities" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.162598 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerName="extract-utilities" Dec 03 07:15:00 crc kubenswrapper[4947]: E1203 07:15:00.162644 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerName="extract-content" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.162662 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerName="extract-content" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.163035 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="975fdf1a-16fb-4764-a1a2-1d93b198284d" containerName="registry-server" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.163885 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.166823 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.167171 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.188124 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb"] Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.303163 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh5bm\" (UniqueName: \"kubernetes.io/projected/10d438d4-2007-4ef3-b529-e67d1c77af75-kube-api-access-kh5bm\") pod \"collect-profiles-29412435-kgtgb\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.303449 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10d438d4-2007-4ef3-b529-e67d1c77af75-secret-volume\") pod \"collect-profiles-29412435-kgtgb\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.303510 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d438d4-2007-4ef3-b529-e67d1c77af75-config-volume\") pod \"collect-profiles-29412435-kgtgb\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.404976 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh5bm\" (UniqueName: \"kubernetes.io/projected/10d438d4-2007-4ef3-b529-e67d1c77af75-kube-api-access-kh5bm\") pod \"collect-profiles-29412435-kgtgb\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.405081 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10d438d4-2007-4ef3-b529-e67d1c77af75-secret-volume\") pod \"collect-profiles-29412435-kgtgb\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.405204 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d438d4-2007-4ef3-b529-e67d1c77af75-config-volume\") pod \"collect-profiles-29412435-kgtgb\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.407752 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d438d4-2007-4ef3-b529-e67d1c77af75-config-volume\") pod \"collect-profiles-29412435-kgtgb\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.414667 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/10d438d4-2007-4ef3-b529-e67d1c77af75-secret-volume\") pod \"collect-profiles-29412435-kgtgb\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.442589 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh5bm\" (UniqueName: \"kubernetes.io/projected/10d438d4-2007-4ef3-b529-e67d1c77af75-kube-api-access-kh5bm\") pod \"collect-profiles-29412435-kgtgb\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.486317 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:00 crc kubenswrapper[4947]: I1203 07:15:00.729091 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb"] Dec 03 07:15:00 crc kubenswrapper[4947]: W1203 07:15:00.740848 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d438d4_2007_4ef3_b529_e67d1c77af75.slice/crio-c36788fe71034bdc414b049fe72cf03fa3e6c34d90359717ac1c39d8085fef68 WatchSource:0}: Error finding container c36788fe71034bdc414b049fe72cf03fa3e6c34d90359717ac1c39d8085fef68: Status 404 returned error can't find the container with id c36788fe71034bdc414b049fe72cf03fa3e6c34d90359717ac1c39d8085fef68 Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.357448 4947 generic.go:334] "Generic (PLEG): container finished" podID="10d438d4-2007-4ef3-b529-e67d1c77af75" containerID="04e6239d3ae53b8c5a72c23338a1ac80fabbe821a7724fbac28190afdfdd2074" exitCode=0 Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.357633 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" event={"ID":"10d438d4-2007-4ef3-b529-e67d1c77af75","Type":"ContainerDied","Data":"04e6239d3ae53b8c5a72c23338a1ac80fabbe821a7724fbac28190afdfdd2074"} Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.357771 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" event={"ID":"10d438d4-2007-4ef3-b529-e67d1c77af75","Type":"ContainerStarted","Data":"c36788fe71034bdc414b049fe72cf03fa3e6c34d90359717ac1c39d8085fef68"} Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.687443 4947 scope.go:117] "RemoveContainer" containerID="6862eeba03902517d3aa9b7fe73e29b1783f7913945bfd9c6bffd507ee095166" Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.721425 4947 scope.go:117] "RemoveContainer" containerID="cca036791174cd51f9120f826c271de5b4fa4462fa9eca0d1720c564b47c02f0" Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.740471 4947 scope.go:117] "RemoveContainer" containerID="7149a4572e5e5e3a063052ea9b7c1cd9fe182ceed0f5cd18d89d101e6199b317" Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.780751 4947 scope.go:117] "RemoveContainer" containerID="c9a719a484374ac820c11d3eb970965b2a4af8d1426143c047b5501f316ae86a" Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.830310 4947 scope.go:117] "RemoveContainer" containerID="9b3c5a6fec243c546a76a99e7c5a8fb0550f8743b0fcd71dd7bf768c01e41571" Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.857311 4947 scope.go:117] "RemoveContainer" containerID="6766d43117c70819e1c6cb4c5fa75c20b20eceebf474e1f3dd284b91392b89bb" Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.873664 4947 scope.go:117] "RemoveContainer" containerID="12c4f57540cc8159dc9d9868fdd26f7712daf6b1b80c9065470a6becaf4c402b" Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.902973 4947 scope.go:117] "RemoveContainer" 
containerID="54ffca1d739dbd65d8fc73257d9b164f24984c77619c97c08b04adf4a0aabcd3" Dec 03 07:15:01 crc kubenswrapper[4947]: I1203 07:15:01.956457 4947 scope.go:117] "RemoveContainer" containerID="5ac2f4be7b9ddfe24a426f374335c91d29dea24a151ca75482bb4d5c943f1c1f" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.001858 4947 scope.go:117] "RemoveContainer" containerID="a78540289946534a350ef445e5a1ab10a4c50c6691c8ab7e7bd31dc482d3b51d" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.034662 4947 scope.go:117] "RemoveContainer" containerID="fe361200f59f5f57f5be4b1d94e67c3de3026379cc88db2902131071988c9d9f" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.054343 4947 scope.go:117] "RemoveContainer" containerID="5ad3fa21c883f1399362511f39b2a0eba4dd79e18d1bbc686bd886b186157175" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.077450 4947 scope.go:117] "RemoveContainer" containerID="cec820629ebe723b81f827504cd18d3327c219c11a06e73c61a37a257b4fe002" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.099430 4947 scope.go:117] "RemoveContainer" containerID="362bc95a1a55a281ea46b642a4e75653da431e70d6b31a1af4b85f03b79ba639" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.135232 4947 scope.go:117] "RemoveContainer" containerID="16255723e7205ad7a536dbf4b69b6b3fcc4882ee357336dbdd9cdf6612ace516" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.171069 4947 scope.go:117] "RemoveContainer" containerID="eea712e04988a28c6a66e0d143ed1e46cb7f5ad455b6cdf5305180775878b7b0" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.190142 4947 scope.go:117] "RemoveContainer" containerID="7ab390ea5a32098708bae957ee6d754d53e5b9113cc224e30df12f6bde7d7e18" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.214390 4947 scope.go:117] "RemoveContainer" containerID="ff60d3a64764b7b0a8521fbe7ced724e9b3c275421d8e741d28b55a0fee93a3a" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.252297 4947 scope.go:117] "RemoveContainer" 
containerID="80650da93ac6c7dbe9d3efbba4ab0980626758431c3f9a1e26df29dbb6b4dd90" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.736472 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.841974 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d438d4-2007-4ef3-b529-e67d1c77af75-config-volume\") pod \"10d438d4-2007-4ef3-b529-e67d1c77af75\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.842117 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10d438d4-2007-4ef3-b529-e67d1c77af75-secret-volume\") pod \"10d438d4-2007-4ef3-b529-e67d1c77af75\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.842178 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh5bm\" (UniqueName: \"kubernetes.io/projected/10d438d4-2007-4ef3-b529-e67d1c77af75-kube-api-access-kh5bm\") pod \"10d438d4-2007-4ef3-b529-e67d1c77af75\" (UID: \"10d438d4-2007-4ef3-b529-e67d1c77af75\") " Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.843441 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10d438d4-2007-4ef3-b529-e67d1c77af75-config-volume" (OuterVolumeSpecName: "config-volume") pod "10d438d4-2007-4ef3-b529-e67d1c77af75" (UID: "10d438d4-2007-4ef3-b529-e67d1c77af75"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.850918 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10d438d4-2007-4ef3-b529-e67d1c77af75-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10d438d4-2007-4ef3-b529-e67d1c77af75" (UID: "10d438d4-2007-4ef3-b529-e67d1c77af75"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.853838 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d438d4-2007-4ef3-b529-e67d1c77af75-kube-api-access-kh5bm" (OuterVolumeSpecName: "kube-api-access-kh5bm") pod "10d438d4-2007-4ef3-b529-e67d1c77af75" (UID: "10d438d4-2007-4ef3-b529-e67d1c77af75"). InnerVolumeSpecName "kube-api-access-kh5bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.943711 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10d438d4-2007-4ef3-b529-e67d1c77af75-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.943749 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh5bm\" (UniqueName: \"kubernetes.io/projected/10d438d4-2007-4ef3-b529-e67d1c77af75-kube-api-access-kh5bm\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:02 crc kubenswrapper[4947]: I1203 07:15:02.943758 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10d438d4-2007-4ef3-b529-e67d1c77af75-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:03 crc kubenswrapper[4947]: I1203 07:15:03.407020 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" 
event={"ID":"10d438d4-2007-4ef3-b529-e67d1c77af75","Type":"ContainerDied","Data":"c36788fe71034bdc414b049fe72cf03fa3e6c34d90359717ac1c39d8085fef68"} Dec 03 07:15:03 crc kubenswrapper[4947]: I1203 07:15:03.407062 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c36788fe71034bdc414b049fe72cf03fa3e6c34d90359717ac1c39d8085fef68" Dec 03 07:15:03 crc kubenswrapper[4947]: I1203 07:15:03.407148 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb" Dec 03 07:15:07 crc kubenswrapper[4947]: I1203 07:15:07.083565 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:15:07 crc kubenswrapper[4947]: E1203 07:15:07.084226 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.241251 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rm96n"] Dec 03 07:15:15 crc kubenswrapper[4947]: E1203 07:15:15.242337 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d438d4-2007-4ef3-b529-e67d1c77af75" containerName="collect-profiles" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.242356 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d438d4-2007-4ef3-b529-e67d1c77af75" containerName="collect-profiles" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.242550 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d438d4-2007-4ef3-b529-e67d1c77af75" 
containerName="collect-profiles" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.243980 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.255940 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rm96n"] Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.354082 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbbl5\" (UniqueName: \"kubernetes.io/projected/7dc2624e-f408-4913-8598-8f7238d73cb9-kube-api-access-lbbl5\") pod \"redhat-marketplace-rm96n\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.354386 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-catalog-content\") pod \"redhat-marketplace-rm96n\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.354432 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-utilities\") pod \"redhat-marketplace-rm96n\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.455756 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbbl5\" (UniqueName: \"kubernetes.io/projected/7dc2624e-f408-4913-8598-8f7238d73cb9-kube-api-access-lbbl5\") pod \"redhat-marketplace-rm96n\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " 
pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.455855 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-catalog-content\") pod \"redhat-marketplace-rm96n\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.455939 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-utilities\") pod \"redhat-marketplace-rm96n\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.456858 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-utilities\") pod \"redhat-marketplace-rm96n\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.456982 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-catalog-content\") pod \"redhat-marketplace-rm96n\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.476392 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbbl5\" (UniqueName: \"kubernetes.io/projected/7dc2624e-f408-4913-8598-8f7238d73cb9-kube-api-access-lbbl5\") pod \"redhat-marketplace-rm96n\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 
07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.573371 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:15 crc kubenswrapper[4947]: I1203 07:15:15.995903 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rm96n"] Dec 03 07:15:16 crc kubenswrapper[4947]: I1203 07:15:16.538286 4947 generic.go:334] "Generic (PLEG): container finished" podID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerID="4c19ed13a87910dc08e194064d4315719123c18e113310334c654d29fe0b8890" exitCode=0 Dec 03 07:15:16 crc kubenswrapper[4947]: I1203 07:15:16.538333 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rm96n" event={"ID":"7dc2624e-f408-4913-8598-8f7238d73cb9","Type":"ContainerDied","Data":"4c19ed13a87910dc08e194064d4315719123c18e113310334c654d29fe0b8890"} Dec 03 07:15:16 crc kubenswrapper[4947]: I1203 07:15:16.538363 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rm96n" event={"ID":"7dc2624e-f408-4913-8598-8f7238d73cb9","Type":"ContainerStarted","Data":"ef994cf5e9a9a8da176fb40ad8927b32f9917e54646d85e52959b2e17082880d"} Dec 03 07:15:17 crc kubenswrapper[4947]: I1203 07:15:17.548873 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rm96n" event={"ID":"7dc2624e-f408-4913-8598-8f7238d73cb9","Type":"ContainerStarted","Data":"e6c4e0fa99fb193d72fa5470b9bf049e995ca3d72d47fff72538bb9179dbb9b1"} Dec 03 07:15:18 crc kubenswrapper[4947]: I1203 07:15:18.559741 4947 generic.go:334] "Generic (PLEG): container finished" podID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerID="e6c4e0fa99fb193d72fa5470b9bf049e995ca3d72d47fff72538bb9179dbb9b1" exitCode=0 Dec 03 07:15:18 crc kubenswrapper[4947]: I1203 07:15:18.559833 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-rm96n" event={"ID":"7dc2624e-f408-4913-8598-8f7238d73cb9","Type":"ContainerDied","Data":"e6c4e0fa99fb193d72fa5470b9bf049e995ca3d72d47fff72538bb9179dbb9b1"} Dec 03 07:15:19 crc kubenswrapper[4947]: I1203 07:15:19.574544 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rm96n" event={"ID":"7dc2624e-f408-4913-8598-8f7238d73cb9","Type":"ContainerStarted","Data":"690833dce263733a8d98a21cf40d665da6887fb9f710731dcb9cdc88ddbe8511"} Dec 03 07:15:19 crc kubenswrapper[4947]: I1203 07:15:19.597964 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rm96n" podStartSLOduration=2.157567823 podStartE2EDuration="4.597943004s" podCreationTimestamp="2025-12-03 07:15:15 +0000 UTC" firstStartedPulling="2025-12-03 07:15:16.540362405 +0000 UTC m=+1577.801316841" lastFinishedPulling="2025-12-03 07:15:18.980737586 +0000 UTC m=+1580.241692022" observedRunningTime="2025-12-03 07:15:19.595463187 +0000 UTC m=+1580.856417633" watchObservedRunningTime="2025-12-03 07:15:19.597943004 +0000 UTC m=+1580.858897450" Dec 03 07:15:20 crc kubenswrapper[4947]: I1203 07:15:20.083444 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:15:20 crc kubenswrapper[4947]: E1203 07:15:20.083776 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:15:25 crc kubenswrapper[4947]: I1203 07:15:25.573658 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:25 crc kubenswrapper[4947]: I1203 07:15:25.574107 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:25 crc kubenswrapper[4947]: I1203 07:15:25.625538 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:25 crc kubenswrapper[4947]: I1203 07:15:25.686650 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:25 crc kubenswrapper[4947]: I1203 07:15:25.872810 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rm96n"] Dec 03 07:15:27 crc kubenswrapper[4947]: I1203 07:15:27.657318 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rm96n" podUID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerName="registry-server" containerID="cri-o://690833dce263733a8d98a21cf40d665da6887fb9f710731dcb9cdc88ddbe8511" gracePeriod=2 Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.669481 4947 generic.go:334] "Generic (PLEG): container finished" podID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerID="690833dce263733a8d98a21cf40d665da6887fb9f710731dcb9cdc88ddbe8511" exitCode=0 Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.669673 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rm96n" event={"ID":"7dc2624e-f408-4913-8598-8f7238d73cb9","Type":"ContainerDied","Data":"690833dce263733a8d98a21cf40d665da6887fb9f710731dcb9cdc88ddbe8511"} Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.669994 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rm96n" 
event={"ID":"7dc2624e-f408-4913-8598-8f7238d73cb9","Type":"ContainerDied","Data":"ef994cf5e9a9a8da176fb40ad8927b32f9917e54646d85e52959b2e17082880d"} Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.670070 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef994cf5e9a9a8da176fb40ad8927b32f9917e54646d85e52959b2e17082880d" Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.678914 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.760107 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-catalog-content\") pod \"7dc2624e-f408-4913-8598-8f7238d73cb9\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.760192 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-utilities\") pod \"7dc2624e-f408-4913-8598-8f7238d73cb9\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.760262 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbbl5\" (UniqueName: \"kubernetes.io/projected/7dc2624e-f408-4913-8598-8f7238d73cb9-kube-api-access-lbbl5\") pod \"7dc2624e-f408-4913-8598-8f7238d73cb9\" (UID: \"7dc2624e-f408-4913-8598-8f7238d73cb9\") " Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.762022 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-utilities" (OuterVolumeSpecName: "utilities") pod "7dc2624e-f408-4913-8598-8f7238d73cb9" (UID: "7dc2624e-f408-4913-8598-8f7238d73cb9"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.767381 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc2624e-f408-4913-8598-8f7238d73cb9-kube-api-access-lbbl5" (OuterVolumeSpecName: "kube-api-access-lbbl5") pod "7dc2624e-f408-4913-8598-8f7238d73cb9" (UID: "7dc2624e-f408-4913-8598-8f7238d73cb9"). InnerVolumeSpecName "kube-api-access-lbbl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.796123 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dc2624e-f408-4913-8598-8f7238d73cb9" (UID: "7dc2624e-f408-4913-8598-8f7238d73cb9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.862795 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.862843 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dc2624e-f408-4913-8598-8f7238d73cb9-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:28 crc kubenswrapper[4947]: I1203 07:15:28.862856 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbbl5\" (UniqueName: \"kubernetes.io/projected/7dc2624e-f408-4913-8598-8f7238d73cb9-kube-api-access-lbbl5\") on node \"crc\" DevicePath \"\"" Dec 03 07:15:29 crc kubenswrapper[4947]: I1203 07:15:29.678531 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rm96n" Dec 03 07:15:29 crc kubenswrapper[4947]: I1203 07:15:29.734022 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rm96n"] Dec 03 07:15:29 crc kubenswrapper[4947]: I1203 07:15:29.743571 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rm96n"] Dec 03 07:15:31 crc kubenswrapper[4947]: I1203 07:15:31.100124 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc2624e-f408-4913-8598-8f7238d73cb9" path="/var/lib/kubelet/pods/7dc2624e-f408-4913-8598-8f7238d73cb9/volumes" Dec 03 07:15:32 crc kubenswrapper[4947]: I1203 07:15:32.083800 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:15:32 crc kubenswrapper[4947]: E1203 07:15:32.084014 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:15:43 crc kubenswrapper[4947]: I1203 07:15:43.084084 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:15:43 crc kubenswrapper[4947]: E1203 07:15:43.085064 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:15:56 crc kubenswrapper[4947]: I1203 07:15:56.084092 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:15:56 crc kubenswrapper[4947]: E1203 07:15:56.084870 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.638647 4947 scope.go:117] "RemoveContainer" containerID="c60ae6e938e54908332dbaa6ddd689e443ba56a7fa23dbd3d3249373512ca680" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.683605 4947 scope.go:117] "RemoveContainer" containerID="845cd1a32bd6e9532c9423654e785193c73bf6e3b1aea2e7e18eadd318301cec" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.715616 4947 scope.go:117] "RemoveContainer" containerID="7a42de092f2c4832987606718c90461323cdaca36bc998fcfa88de29b59c873d" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.748446 4947 scope.go:117] "RemoveContainer" containerID="aac3920ae7cd9e0b2c5f777e7aa7e5d8fbbea4f5ca8c93fe99023a44910f66a8" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.771771 4947 scope.go:117] "RemoveContainer" containerID="d961d7a3cdd727c0e7fbdb61831a58352766133832f234d26b0ca308916654cf" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.797759 4947 scope.go:117] "RemoveContainer" containerID="00d072c696fe0a03b60855d152210d0d79d395112c447d7edb863c26fd2e08db" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.826244 4947 scope.go:117] "RemoveContainer" containerID="1661b453aa4675cf59e5985d9eef19957b032c863d92cb6f01e34413e202e848" Dec 03 07:16:02 crc 
kubenswrapper[4947]: I1203 07:16:02.853164 4947 scope.go:117] "RemoveContainer" containerID="87dd90cc8a08b7bf0637010541350c2b593540197fa2c3a2b4f64f7953ae225f" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.884415 4947 scope.go:117] "RemoveContainer" containerID="1803fa6075e7e974f61cf6cd7ebcdef0211618286da42aad5c1687fe396a9e93" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.899900 4947 scope.go:117] "RemoveContainer" containerID="97036c64cfae8886767f870b26986f84c7ae69a1a7110b3ff8a8e15f7e00dd75" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.915190 4947 scope.go:117] "RemoveContainer" containerID="a809b23cd1a249dbc9331e612526f60ed5bdd18fa6472bcbf28960c85be485be" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.932117 4947 scope.go:117] "RemoveContainer" containerID="ce7f8de092c5631abd0150a07a948c24b407959e654aa1d9ce964ed828bb05fa" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.948987 4947 scope.go:117] "RemoveContainer" containerID="0df9670d5726b37e2f523e33898d50880cf6beeea48f329f234fc69c23009757" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.963089 4947 scope.go:117] "RemoveContainer" containerID="15964c1303d9d7cbf96b97388f04d9d2a0066502b8675cf6bd31dd81efcad9d4" Dec 03 07:16:02 crc kubenswrapper[4947]: I1203 07:16:02.978469 4947 scope.go:117] "RemoveContainer" containerID="64b3099ce4ae64f657b3ba6a9d2a21ac15dc33c2c3dddccc593d4d83045820bf" Dec 03 07:16:03 crc kubenswrapper[4947]: I1203 07:16:03.020142 4947 scope.go:117] "RemoveContainer" containerID="a1759e709aa0048498574524e67b384d6d64c42bed6642269b86d9285c67cdf0" Dec 03 07:16:03 crc kubenswrapper[4947]: I1203 07:16:03.044812 4947 scope.go:117] "RemoveContainer" containerID="6100514238c35219823e96f507d19e92cdaf0f5f76bc0b651132a8cdfecdd83f" Dec 03 07:16:03 crc kubenswrapper[4947]: I1203 07:16:03.079628 4947 scope.go:117] "RemoveContainer" containerID="737f46e0f99576fb923247d82fc1eef29b5d49123f805ead5797b64495cf9e63" Dec 03 07:16:03 crc kubenswrapper[4947]: I1203 
07:16:03.103551 4947 scope.go:117] "RemoveContainer" containerID="76a4bdae9bb828027148c5b11ebef7f1e1e3ae1474d196cf4f90b8703f56d94a" Dec 03 07:16:03 crc kubenswrapper[4947]: I1203 07:16:03.122690 4947 scope.go:117] "RemoveContainer" containerID="a6e1ead0fefc1c6dc26fdc188da65352aea353653dca0f86b588f1fdd21857ed" Dec 03 07:16:03 crc kubenswrapper[4947]: I1203 07:16:03.141398 4947 scope.go:117] "RemoveContainer" containerID="d890527d7be95a78ab0069a1cc568b31239f334eaea64b0a493969424e024807" Dec 03 07:16:03 crc kubenswrapper[4947]: I1203 07:16:03.176751 4947 scope.go:117] "RemoveContainer" containerID="11184cabbbb6be466af52f75261a132e0dea9e3c2152fd1f2181fdacb612abd2" Dec 03 07:16:09 crc kubenswrapper[4947]: I1203 07:16:09.091262 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:16:09 crc kubenswrapper[4947]: E1203 07:16:09.092398 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:16:23 crc kubenswrapper[4947]: I1203 07:16:23.083469 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:16:23 crc kubenswrapper[4947]: E1203 07:16:23.084182 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:16:34 crc kubenswrapper[4947]: I1203 07:16:34.083875 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:16:34 crc kubenswrapper[4947]: E1203 07:16:34.084670 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:16:48 crc kubenswrapper[4947]: I1203 07:16:48.083274 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:16:48 crc kubenswrapper[4947]: E1203 07:16:48.085662 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:17:03 crc kubenswrapper[4947]: I1203 07:17:03.083205 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:17:03 crc kubenswrapper[4947]: E1203 07:17:03.084128 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:17:03 crc kubenswrapper[4947]: I1203 07:17:03.424726 4947 scope.go:117] "RemoveContainer" containerID="79d76a072f599124aea7aa155893adf39a85cca4bb9f8e9f6280be548ec04f73" Dec 03 07:17:03 crc kubenswrapper[4947]: I1203 07:17:03.466466 4947 scope.go:117] "RemoveContainer" containerID="3fe0141e6c7dfad5f3686eb0461017dbced765ef02bc95a048da886e97ae70e3" Dec 03 07:17:03 crc kubenswrapper[4947]: I1203 07:17:03.506560 4947 scope.go:117] "RemoveContainer" containerID="00aa7a80e06314abdb440924c5edafb9eaef6ba88ad895856328a1ead1571418" Dec 03 07:17:03 crc kubenswrapper[4947]: I1203 07:17:03.533821 4947 scope.go:117] "RemoveContainer" containerID="05a614732107d76e00cf7484e0feb6e0485e39e5f9e2a1c05314d7e15cca22c4" Dec 03 07:17:03 crc kubenswrapper[4947]: I1203 07:17:03.557859 4947 scope.go:117] "RemoveContainer" containerID="e87240b302c77a397917136cebd59e0e317e8b17fdf4bfb67df917c0ec2d7cbd" Dec 03 07:17:03 crc kubenswrapper[4947]: I1203 07:17:03.601669 4947 scope.go:117] "RemoveContainer" containerID="755fd0d95a886192addb01b57efa724676305c9c137608f6fd6ccd18807ef9cb" Dec 03 07:17:03 crc kubenswrapper[4947]: I1203 07:17:03.661606 4947 scope.go:117] "RemoveContainer" containerID="47543c62ad6a1a8db1e3df0f70a0cf3357416e4167bbed925cb86aecffd099af" Dec 03 07:17:03 crc kubenswrapper[4947]: I1203 07:17:03.684328 4947 scope.go:117] "RemoveContainer" containerID="a6ce595ade9a9475a433b8be016f0ffd0956854fd235a02fee7a7b549bfe6fea" Dec 03 07:17:16 crc kubenswrapper[4947]: I1203 07:17:16.083352 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:17:16 crc kubenswrapper[4947]: E1203 07:17:16.084341 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:17:29 crc kubenswrapper[4947]: I1203 07:17:29.091617 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:17:29 crc kubenswrapper[4947]: E1203 07:17:29.094771 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:17:40 crc kubenswrapper[4947]: I1203 07:17:40.083149 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:17:40 crc kubenswrapper[4947]: E1203 07:17:40.084744 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:17:53 crc kubenswrapper[4947]: I1203 07:17:53.084463 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:17:53 crc kubenswrapper[4947]: E1203 07:17:53.085569 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:18:03 crc kubenswrapper[4947]: I1203 07:18:03.832702 4947 scope.go:117] "RemoveContainer" containerID="2fcba5d7c876bd3b90e8851cfc63ebb2e3bd539f891a545b8488e8ea2168ca5f" Dec 03 07:18:03 crc kubenswrapper[4947]: I1203 07:18:03.887616 4947 scope.go:117] "RemoveContainer" containerID="96cdcb461d21fa3b9f6f2564295436608c7aad6f8b2a1e0421630332a70976f2" Dec 03 07:18:03 crc kubenswrapper[4947]: I1203 07:18:03.913290 4947 scope.go:117] "RemoveContainer" containerID="d0205492af2a84d2597e544b1fac1933410305054e5bf3b202d9e1f108f9cdfa" Dec 03 07:18:03 crc kubenswrapper[4947]: I1203 07:18:03.942374 4947 scope.go:117] "RemoveContainer" containerID="a8b3ef7eb0ba7d8af170edb23f786fe99d24451a36fd652d4dbc94d3f46220ba" Dec 03 07:18:04 crc kubenswrapper[4947]: I1203 07:18:04.083165 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:18:04 crc kubenswrapper[4947]: E1203 07:18:04.083385 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:18:19 crc kubenswrapper[4947]: I1203 07:18:19.090978 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:18:19 crc kubenswrapper[4947]: E1203 07:18:19.094711 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:18:33 crc kubenswrapper[4947]: I1203 07:18:33.083332 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:18:33 crc kubenswrapper[4947]: E1203 07:18:33.084998 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:18:43 crc kubenswrapper[4947]: I1203 07:18:43.917327 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s2wh6"] Dec 03 07:18:43 crc kubenswrapper[4947]: E1203 07:18:43.918733 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerName="extract-utilities" Dec 03 07:18:43 crc kubenswrapper[4947]: I1203 07:18:43.918769 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerName="extract-utilities" Dec 03 07:18:43 crc kubenswrapper[4947]: E1203 07:18:43.918812 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerName="registry-server" Dec 03 07:18:43 crc kubenswrapper[4947]: I1203 07:18:43.918830 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerName="registry-server" Dec 03 07:18:43 crc 
kubenswrapper[4947]: E1203 07:18:43.918860 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerName="extract-content" Dec 03 07:18:43 crc kubenswrapper[4947]: I1203 07:18:43.918877 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerName="extract-content" Dec 03 07:18:43 crc kubenswrapper[4947]: I1203 07:18:43.919345 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc2624e-f408-4913-8598-8f7238d73cb9" containerName="registry-server" Dec 03 07:18:43 crc kubenswrapper[4947]: I1203 07:18:43.921628 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:43 crc kubenswrapper[4947]: I1203 07:18:43.938361 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s2wh6"] Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.000194 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-utilities\") pod \"redhat-operators-s2wh6\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.000302 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94mqq\" (UniqueName: \"kubernetes.io/projected/d2b873ec-5e0a-4776-a27d-f99d97060d71-kube-api-access-94mqq\") pod \"redhat-operators-s2wh6\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.000656 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-catalog-content\") pod \"redhat-operators-s2wh6\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.101553 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-utilities\") pod \"redhat-operators-s2wh6\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.101612 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94mqq\" (UniqueName: \"kubernetes.io/projected/d2b873ec-5e0a-4776-a27d-f99d97060d71-kube-api-access-94mqq\") pod \"redhat-operators-s2wh6\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.101677 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-catalog-content\") pod \"redhat-operators-s2wh6\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.102236 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-catalog-content\") pod \"redhat-operators-s2wh6\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.102363 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-utilities\") pod \"redhat-operators-s2wh6\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.136681 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94mqq\" (UniqueName: \"kubernetes.io/projected/d2b873ec-5e0a-4776-a27d-f99d97060d71-kube-api-access-94mqq\") pod \"redhat-operators-s2wh6\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.257611 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:44 crc kubenswrapper[4947]: I1203 07:18:44.730854 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s2wh6"] Dec 03 07:18:45 crc kubenswrapper[4947]: I1203 07:18:45.635384 4947 generic.go:334] "Generic (PLEG): container finished" podID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerID="d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5" exitCode=0 Dec 03 07:18:45 crc kubenswrapper[4947]: I1203 07:18:45.635429 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2wh6" event={"ID":"d2b873ec-5e0a-4776-a27d-f99d97060d71","Type":"ContainerDied","Data":"d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5"} Dec 03 07:18:45 crc kubenswrapper[4947]: I1203 07:18:45.635461 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2wh6" event={"ID":"d2b873ec-5e0a-4776-a27d-f99d97060d71","Type":"ContainerStarted","Data":"9d0303bd3d0776077bfe5d0f94398f984d09f9d8c3ca3f039b46230644bb5687"} Dec 03 07:18:45 crc kubenswrapper[4947]: I1203 07:18:45.638722 4947 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 03 07:18:46 crc kubenswrapper[4947]: I1203 07:18:46.651059 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2wh6" event={"ID":"d2b873ec-5e0a-4776-a27d-f99d97060d71","Type":"ContainerStarted","Data":"ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39"} Dec 03 07:18:47 crc kubenswrapper[4947]: I1203 07:18:47.083327 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:18:47 crc kubenswrapper[4947]: E1203 07:18:47.083672 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:18:47 crc kubenswrapper[4947]: I1203 07:18:47.670709 4947 generic.go:334] "Generic (PLEG): container finished" podID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerID="ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39" exitCode=0 Dec 03 07:18:47 crc kubenswrapper[4947]: I1203 07:18:47.670858 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2wh6" event={"ID":"d2b873ec-5e0a-4776-a27d-f99d97060d71","Type":"ContainerDied","Data":"ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39"} Dec 03 07:18:48 crc kubenswrapper[4947]: I1203 07:18:48.689368 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2wh6" event={"ID":"d2b873ec-5e0a-4776-a27d-f99d97060d71","Type":"ContainerStarted","Data":"359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b"} Dec 03 07:18:48 crc kubenswrapper[4947]: I1203 07:18:48.719853 
4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s2wh6" podStartSLOduration=3.182714197 podStartE2EDuration="5.719824674s" podCreationTimestamp="2025-12-03 07:18:43 +0000 UTC" firstStartedPulling="2025-12-03 07:18:45.638470229 +0000 UTC m=+1786.899424655" lastFinishedPulling="2025-12-03 07:18:48.175580696 +0000 UTC m=+1789.436535132" observedRunningTime="2025-12-03 07:18:48.714347055 +0000 UTC m=+1789.975301491" watchObservedRunningTime="2025-12-03 07:18:48.719824674 +0000 UTC m=+1789.980779140" Dec 03 07:18:54 crc kubenswrapper[4947]: I1203 07:18:54.259768 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:54 crc kubenswrapper[4947]: I1203 07:18:54.260151 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:18:55 crc kubenswrapper[4947]: I1203 07:18:55.324109 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s2wh6" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerName="registry-server" probeResult="failure" output=< Dec 03 07:18:55 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 07:18:55 crc kubenswrapper[4947]: > Dec 03 07:19:00 crc kubenswrapper[4947]: I1203 07:19:00.083445 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:19:00 crc kubenswrapper[4947]: E1203 07:19:00.084179 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:19:04 crc kubenswrapper[4947]: I1203 07:19:04.040538 4947 scope.go:117] "RemoveContainer" containerID="1844d37f809640ea8effa099c546a30716c8ad0bbab318d8e046a3e937475817" Dec 03 07:19:04 crc kubenswrapper[4947]: I1203 07:19:04.080428 4947 scope.go:117] "RemoveContainer" containerID="81c2ad2c09c8e03c68216f5fbecb08e40fa8cf252b49a0930e549c7e3e04e959" Dec 03 07:19:04 crc kubenswrapper[4947]: I1203 07:19:04.109121 4947 scope.go:117] "RemoveContainer" containerID="5f5b17e47defdfb32050f77dd94b370b6ccbc70dc782d3ccfc33b54d67378629" Dec 03 07:19:04 crc kubenswrapper[4947]: I1203 07:19:04.142986 4947 scope.go:117] "RemoveContainer" containerID="099259ef27c1dbed4aec87e28da3485dccbf8c02f2d379b599c19fcda6e2c957" Dec 03 07:19:04 crc kubenswrapper[4947]: I1203 07:19:04.315058 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:19:04 crc kubenswrapper[4947]: I1203 07:19:04.388614 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:19:04 crc kubenswrapper[4947]: I1203 07:19:04.572070 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s2wh6"] Dec 03 07:19:05 crc kubenswrapper[4947]: I1203 07:19:05.849119 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s2wh6" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerName="registry-server" containerID="cri-o://359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b" gracePeriod=2 Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.263821 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.457151 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-utilities\") pod \"d2b873ec-5e0a-4776-a27d-f99d97060d71\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.457206 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-catalog-content\") pod \"d2b873ec-5e0a-4776-a27d-f99d97060d71\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.457363 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94mqq\" (UniqueName: \"kubernetes.io/projected/d2b873ec-5e0a-4776-a27d-f99d97060d71-kube-api-access-94mqq\") pod \"d2b873ec-5e0a-4776-a27d-f99d97060d71\" (UID: \"d2b873ec-5e0a-4776-a27d-f99d97060d71\") " Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.459480 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-utilities" (OuterVolumeSpecName: "utilities") pod "d2b873ec-5e0a-4776-a27d-f99d97060d71" (UID: "d2b873ec-5e0a-4776-a27d-f99d97060d71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.465830 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b873ec-5e0a-4776-a27d-f99d97060d71-kube-api-access-94mqq" (OuterVolumeSpecName: "kube-api-access-94mqq") pod "d2b873ec-5e0a-4776-a27d-f99d97060d71" (UID: "d2b873ec-5e0a-4776-a27d-f99d97060d71"). InnerVolumeSpecName "kube-api-access-94mqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.560836 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94mqq\" (UniqueName: \"kubernetes.io/projected/d2b873ec-5e0a-4776-a27d-f99d97060d71-kube-api-access-94mqq\") on node \"crc\" DevicePath \"\"" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.560877 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.597208 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2b873ec-5e0a-4776-a27d-f99d97060d71" (UID: "d2b873ec-5e0a-4776-a27d-f99d97060d71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.661323 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2b873ec-5e0a-4776-a27d-f99d97060d71-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.860363 4947 generic.go:334] "Generic (PLEG): container finished" podID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerID="359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b" exitCode=0 Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.860551 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s2wh6" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.861558 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2wh6" event={"ID":"d2b873ec-5e0a-4776-a27d-f99d97060d71","Type":"ContainerDied","Data":"359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b"} Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.861777 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s2wh6" event={"ID":"d2b873ec-5e0a-4776-a27d-f99d97060d71","Type":"ContainerDied","Data":"9d0303bd3d0776077bfe5d0f94398f984d09f9d8c3ca3f039b46230644bb5687"} Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.861883 4947 scope.go:117] "RemoveContainer" containerID="359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.902229 4947 scope.go:117] "RemoveContainer" containerID="ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.915336 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s2wh6"] Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.939318 4947 scope.go:117] "RemoveContainer" containerID="d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.949655 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s2wh6"] Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.965302 4947 scope.go:117] "RemoveContainer" containerID="359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b" Dec 03 07:19:06 crc kubenswrapper[4947]: E1203 07:19:06.965908 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b\": container with ID starting with 359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b not found: ID does not exist" containerID="359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.966138 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b"} err="failed to get container status \"359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b\": rpc error: code = NotFound desc = could not find container \"359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b\": container with ID starting with 359bf7ad1142fccbf7ceeb4c48b100d597c7859875a2d8fcf7a212465352679b not found: ID does not exist" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.966318 4947 scope.go:117] "RemoveContainer" containerID="ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39" Dec 03 07:19:06 crc kubenswrapper[4947]: E1203 07:19:06.967088 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39\": container with ID starting with ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39 not found: ID does not exist" containerID="ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.967172 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39"} err="failed to get container status \"ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39\": rpc error: code = NotFound desc = could not find container \"ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39\": container with ID 
starting with ca8974968849324cc4c65e381da8a8c7fee56db9aca124f2363a4ddc7f210e39 not found: ID does not exist" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.967237 4947 scope.go:117] "RemoveContainer" containerID="d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5" Dec 03 07:19:06 crc kubenswrapper[4947]: E1203 07:19:06.967693 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5\": container with ID starting with d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5 not found: ID does not exist" containerID="d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5" Dec 03 07:19:06 crc kubenswrapper[4947]: I1203 07:19:06.967759 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5"} err="failed to get container status \"d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5\": rpc error: code = NotFound desc = could not find container \"d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5\": container with ID starting with d454a4bf946baf3c1a281e35a2581a9aa3e8ac20fbe7cb127cfed4a4e27c09e5 not found: ID does not exist" Dec 03 07:19:07 crc kubenswrapper[4947]: I1203 07:19:07.105075 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" path="/var/lib/kubelet/pods/d2b873ec-5e0a-4776-a27d-f99d97060d71/volumes" Dec 03 07:19:11 crc kubenswrapper[4947]: I1203 07:19:11.083268 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:19:11 crc kubenswrapper[4947]: I1203 07:19:11.915897 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"df4fb0d45037c625cdd984bb9028ca45a6e0d11dd1a8acffe358b355880a7404"} Dec 03 07:21:30 crc kubenswrapper[4947]: I1203 07:21:30.086580 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:21:30 crc kubenswrapper[4947]: I1203 07:21:30.087258 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:22:00 crc kubenswrapper[4947]: I1203 07:22:00.086897 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:22:00 crc kubenswrapper[4947]: I1203 07:22:00.087580 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:22:04 crc kubenswrapper[4947]: I1203 07:22:04.340059 4947 scope.go:117] "RemoveContainer" containerID="e6c4e0fa99fb193d72fa5470b9bf049e995ca3d72d47fff72538bb9179dbb9b1" Dec 03 07:22:04 crc kubenswrapper[4947]: I1203 07:22:04.387286 4947 scope.go:117] "RemoveContainer" 
containerID="4c19ed13a87910dc08e194064d4315719123c18e113310334c654d29fe0b8890" Dec 03 07:22:04 crc kubenswrapper[4947]: I1203 07:22:04.426433 4947 scope.go:117] "RemoveContainer" containerID="690833dce263733a8d98a21cf40d665da6887fb9f710731dcb9cdc88ddbe8511" Dec 03 07:22:30 crc kubenswrapper[4947]: I1203 07:22:30.086081 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:22:30 crc kubenswrapper[4947]: I1203 07:22:30.086767 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:22:30 crc kubenswrapper[4947]: I1203 07:22:30.086827 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:22:30 crc kubenswrapper[4947]: I1203 07:22:30.087470 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"df4fb0d45037c625cdd984bb9028ca45a6e0d11dd1a8acffe358b355880a7404"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:22:30 crc kubenswrapper[4947]: I1203 07:22:30.087576 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" 
containerID="cri-o://df4fb0d45037c625cdd984bb9028ca45a6e0d11dd1a8acffe358b355880a7404" gracePeriod=600 Dec 03 07:22:30 crc kubenswrapper[4947]: I1203 07:22:30.973251 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="df4fb0d45037c625cdd984bb9028ca45a6e0d11dd1a8acffe358b355880a7404" exitCode=0 Dec 03 07:22:30 crc kubenswrapper[4947]: I1203 07:22:30.973286 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"df4fb0d45037c625cdd984bb9028ca45a6e0d11dd1a8acffe358b355880a7404"} Dec 03 07:22:30 crc kubenswrapper[4947]: I1203 07:22:30.974478 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c"} Dec 03 07:22:30 crc kubenswrapper[4947]: I1203 07:22:30.974555 4947 scope.go:117] "RemoveContainer" containerID="207c43c002efff61fd55f17fc50a8e75b6fbd44b49d47c8c5b10dfab4828c946" Dec 03 07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.785867 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rfxcz"] Dec 03 07:24:22 crc kubenswrapper[4947]: E1203 07:24:22.786980 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerName="registry-server" Dec 03 07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.787005 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerName="registry-server" Dec 03 07:24:22 crc kubenswrapper[4947]: E1203 07:24:22.787039 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerName="extract-utilities" Dec 03 
07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.787051 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerName="extract-utilities" Dec 03 07:24:22 crc kubenswrapper[4947]: E1203 07:24:22.787073 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerName="extract-content" Dec 03 07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.787085 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerName="extract-content" Dec 03 07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.787310 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b873ec-5e0a-4776-a27d-f99d97060d71" containerName="registry-server" Dec 03 07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.788923 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.820771 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfxcz"] Dec 03 07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.906387 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lh6f\" (UniqueName: \"kubernetes.io/projected/4e0f95e9-2938-4a0a-a241-a03efcbead30-kube-api-access-2lh6f\") pod \"certified-operators-rfxcz\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.906707 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-utilities\") pod \"certified-operators-rfxcz\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 
07:24:22 crc kubenswrapper[4947]: I1203 07:24:22.906893 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-catalog-content\") pod \"certified-operators-rfxcz\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:23 crc kubenswrapper[4947]: I1203 07:24:23.007920 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-catalog-content\") pod \"certified-operators-rfxcz\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:23 crc kubenswrapper[4947]: I1203 07:24:23.008355 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lh6f\" (UniqueName: \"kubernetes.io/projected/4e0f95e9-2938-4a0a-a241-a03efcbead30-kube-api-access-2lh6f\") pod \"certified-operators-rfxcz\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:23 crc kubenswrapper[4947]: I1203 07:24:23.008576 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-catalog-content\") pod \"certified-operators-rfxcz\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:23 crc kubenswrapper[4947]: I1203 07:24:23.008595 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-utilities\") pod \"certified-operators-rfxcz\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 
07:24:23 crc kubenswrapper[4947]: I1203 07:24:23.008941 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-utilities\") pod \"certified-operators-rfxcz\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:23 crc kubenswrapper[4947]: I1203 07:24:23.040300 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lh6f\" (UniqueName: \"kubernetes.io/projected/4e0f95e9-2938-4a0a-a241-a03efcbead30-kube-api-access-2lh6f\") pod \"certified-operators-rfxcz\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:23 crc kubenswrapper[4947]: I1203 07:24:23.135203 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:23 crc kubenswrapper[4947]: I1203 07:24:23.569602 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rfxcz"] Dec 03 07:24:24 crc kubenswrapper[4947]: I1203 07:24:24.053684 4947 generic.go:334] "Generic (PLEG): container finished" podID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerID="f602e5123d348eb192cb0b616d00197b2ffe9009b0bc997668fca98b75c71988" exitCode=0 Dec 03 07:24:24 crc kubenswrapper[4947]: I1203 07:24:24.053734 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfxcz" event={"ID":"4e0f95e9-2938-4a0a-a241-a03efcbead30","Type":"ContainerDied","Data":"f602e5123d348eb192cb0b616d00197b2ffe9009b0bc997668fca98b75c71988"} Dec 03 07:24:24 crc kubenswrapper[4947]: I1203 07:24:24.053769 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfxcz" 
event={"ID":"4e0f95e9-2938-4a0a-a241-a03efcbead30","Type":"ContainerStarted","Data":"00cf55b991737db98c1e3e143c87b287224ce83f62f092105418f2d9acaf2431"} Dec 03 07:24:24 crc kubenswrapper[4947]: I1203 07:24:24.056123 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:24:25 crc kubenswrapper[4947]: I1203 07:24:25.067048 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfxcz" event={"ID":"4e0f95e9-2938-4a0a-a241-a03efcbead30","Type":"ContainerStarted","Data":"7f5c9188cfd41d591e8b1a6893b7d74e143110cfc22d79d2c2cbb5e1aa4b9853"} Dec 03 07:24:25 crc kubenswrapper[4947]: E1203 07:24:25.187232 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e0f95e9_2938_4a0a_a241_a03efcbead30.slice/crio-7f5c9188cfd41d591e8b1a6893b7d74e143110cfc22d79d2c2cbb5e1aa4b9853.scope\": RecentStats: unable to find data in memory cache]" Dec 03 07:24:26 crc kubenswrapper[4947]: I1203 07:24:26.074685 4947 generic.go:334] "Generic (PLEG): container finished" podID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerID="7f5c9188cfd41d591e8b1a6893b7d74e143110cfc22d79d2c2cbb5e1aa4b9853" exitCode=0 Dec 03 07:24:26 crc kubenswrapper[4947]: I1203 07:24:26.074753 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfxcz" event={"ID":"4e0f95e9-2938-4a0a-a241-a03efcbead30","Type":"ContainerDied","Data":"7f5c9188cfd41d591e8b1a6893b7d74e143110cfc22d79d2c2cbb5e1aa4b9853"} Dec 03 07:24:27 crc kubenswrapper[4947]: I1203 07:24:27.091339 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfxcz" event={"ID":"4e0f95e9-2938-4a0a-a241-a03efcbead30","Type":"ContainerStarted","Data":"6e04d7f7c20ae3dbe2d9ba29a3339a8824020d8d97bb33331483d375a8f99b92"} Dec 03 07:24:27 crc kubenswrapper[4947]: 
I1203 07:24:27.110377 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rfxcz" podStartSLOduration=2.680109206 podStartE2EDuration="5.110363382s" podCreationTimestamp="2025-12-03 07:24:22 +0000 UTC" firstStartedPulling="2025-12-03 07:24:24.055927132 +0000 UTC m=+2125.316881558" lastFinishedPulling="2025-12-03 07:24:26.486181298 +0000 UTC m=+2127.747135734" observedRunningTime="2025-12-03 07:24:27.10846445 +0000 UTC m=+2128.369418876" watchObservedRunningTime="2025-12-03 07:24:27.110363382 +0000 UTC m=+2128.371317808" Dec 03 07:24:30 crc kubenswrapper[4947]: I1203 07:24:30.086382 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:24:30 crc kubenswrapper[4947]: I1203 07:24:30.086693 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:24:33 crc kubenswrapper[4947]: I1203 07:24:33.135354 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:33 crc kubenswrapper[4947]: I1203 07:24:33.135779 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:33 crc kubenswrapper[4947]: I1203 07:24:33.199858 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:34 crc kubenswrapper[4947]: I1203 07:24:34.306040 4947 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:34 crc kubenswrapper[4947]: I1203 07:24:34.363639 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rfxcz"] Dec 03 07:24:36 crc kubenswrapper[4947]: I1203 07:24:36.168176 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rfxcz" podUID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerName="registry-server" containerID="cri-o://6e04d7f7c20ae3dbe2d9ba29a3339a8824020d8d97bb33331483d375a8f99b92" gracePeriod=2 Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.181168 4947 generic.go:334] "Generic (PLEG): container finished" podID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerID="6e04d7f7c20ae3dbe2d9ba29a3339a8824020d8d97bb33331483d375a8f99b92" exitCode=0 Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.181218 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfxcz" event={"ID":"4e0f95e9-2938-4a0a-a241-a03efcbead30","Type":"ContainerDied","Data":"6e04d7f7c20ae3dbe2d9ba29a3339a8824020d8d97bb33331483d375a8f99b92"} Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.767076 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.822227 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lh6f\" (UniqueName: \"kubernetes.io/projected/4e0f95e9-2938-4a0a-a241-a03efcbead30-kube-api-access-2lh6f\") pod \"4e0f95e9-2938-4a0a-a241-a03efcbead30\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.822330 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-catalog-content\") pod \"4e0f95e9-2938-4a0a-a241-a03efcbead30\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.822436 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-utilities\") pod \"4e0f95e9-2938-4a0a-a241-a03efcbead30\" (UID: \"4e0f95e9-2938-4a0a-a241-a03efcbead30\") " Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.824163 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-utilities" (OuterVolumeSpecName: "utilities") pod "4e0f95e9-2938-4a0a-a241-a03efcbead30" (UID: "4e0f95e9-2938-4a0a-a241-a03efcbead30"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.829355 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0f95e9-2938-4a0a-a241-a03efcbead30-kube-api-access-2lh6f" (OuterVolumeSpecName: "kube-api-access-2lh6f") pod "4e0f95e9-2938-4a0a-a241-a03efcbead30" (UID: "4e0f95e9-2938-4a0a-a241-a03efcbead30"). InnerVolumeSpecName "kube-api-access-2lh6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.887668 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e0f95e9-2938-4a0a-a241-a03efcbead30" (UID: "4e0f95e9-2938-4a0a-a241-a03efcbead30"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.924658 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.924699 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lh6f\" (UniqueName: \"kubernetes.io/projected/4e0f95e9-2938-4a0a-a241-a03efcbead30-kube-api-access-2lh6f\") on node \"crc\" DevicePath \"\"" Dec 03 07:24:37 crc kubenswrapper[4947]: I1203 07:24:37.924711 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e0f95e9-2938-4a0a-a241-a03efcbead30-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:24:38 crc kubenswrapper[4947]: I1203 07:24:38.192985 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rfxcz" event={"ID":"4e0f95e9-2938-4a0a-a241-a03efcbead30","Type":"ContainerDied","Data":"00cf55b991737db98c1e3e143c87b287224ce83f62f092105418f2d9acaf2431"} Dec 03 07:24:38 crc kubenswrapper[4947]: I1203 07:24:38.193097 4947 scope.go:117] "RemoveContainer" containerID="6e04d7f7c20ae3dbe2d9ba29a3339a8824020d8d97bb33331483d375a8f99b92" Dec 03 07:24:38 crc kubenswrapper[4947]: I1203 07:24:38.193147 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rfxcz" Dec 03 07:24:38 crc kubenswrapper[4947]: I1203 07:24:38.214640 4947 scope.go:117] "RemoveContainer" containerID="7f5c9188cfd41d591e8b1a6893b7d74e143110cfc22d79d2c2cbb5e1aa4b9853" Dec 03 07:24:38 crc kubenswrapper[4947]: I1203 07:24:38.225657 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rfxcz"] Dec 03 07:24:38 crc kubenswrapper[4947]: I1203 07:24:38.233336 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rfxcz"] Dec 03 07:24:38 crc kubenswrapper[4947]: I1203 07:24:38.247460 4947 scope.go:117] "RemoveContainer" containerID="f602e5123d348eb192cb0b616d00197b2ffe9009b0bc997668fca98b75c71988" Dec 03 07:24:39 crc kubenswrapper[4947]: I1203 07:24:39.098977 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e0f95e9-2938-4a0a-a241-a03efcbead30" path="/var/lib/kubelet/pods/4e0f95e9-2938-4a0a-a241-a03efcbead30/volumes" Dec 03 07:25:00 crc kubenswrapper[4947]: I1203 07:25:00.086451 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:25:00 crc kubenswrapper[4947]: I1203 07:25:00.087144 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:25:30 crc kubenswrapper[4947]: I1203 07:25:30.086856 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:25:30 crc kubenswrapper[4947]: I1203 07:25:30.087514 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:25:30 crc kubenswrapper[4947]: I1203 07:25:30.087568 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:25:30 crc kubenswrapper[4947]: I1203 07:25:30.088298 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:25:30 crc kubenswrapper[4947]: I1203 07:25:30.088360 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" gracePeriod=600 Dec 03 07:25:30 crc kubenswrapper[4947]: E1203 07:25:30.221014 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:25:30 crc kubenswrapper[4947]: I1203 07:25:30.668616 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" exitCode=0 Dec 03 07:25:30 crc kubenswrapper[4947]: I1203 07:25:30.668690 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c"} Dec 03 07:25:30 crc kubenswrapper[4947]: I1203 07:25:30.669111 4947 scope.go:117] "RemoveContainer" containerID="df4fb0d45037c625cdd984bb9028ca45a6e0d11dd1a8acffe358b355880a7404" Dec 03 07:25:30 crc kubenswrapper[4947]: I1203 07:25:30.670154 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:25:30 crc kubenswrapper[4947]: E1203 07:25:30.670630 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:25:45 crc kubenswrapper[4947]: I1203 07:25:45.084688 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:25:45 crc kubenswrapper[4947]: E1203 07:25:45.085858 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:25:56 crc kubenswrapper[4947]: I1203 07:25:56.083410 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:25:56 crc kubenswrapper[4947]: E1203 07:25:56.083969 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.028148 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8p2ks"] Dec 03 07:25:59 crc kubenswrapper[4947]: E1203 07:25:59.028521 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerName="registry-server" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.028538 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerName="registry-server" Dec 03 07:25:59 crc kubenswrapper[4947]: E1203 07:25:59.028562 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerName="extract-utilities" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.028570 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerName="extract-utilities" Dec 03 07:25:59 crc kubenswrapper[4947]: E1203 07:25:59.028595 4947 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerName="extract-content" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.028602 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerName="extract-content" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.028778 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0f95e9-2938-4a0a-a241-a03efcbead30" containerName="registry-server" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.029881 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.093929 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p2ks"] Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.140242 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-utilities\") pod \"community-operators-8p2ks\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.140308 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5qlv\" (UniqueName: \"kubernetes.io/projected/88a73ab6-1d70-4636-b819-489f0579e395-kube-api-access-q5qlv\") pod \"community-operators-8p2ks\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.140331 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-catalog-content\") pod 
\"community-operators-8p2ks\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.242377 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-utilities\") pod \"community-operators-8p2ks\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.242748 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5qlv\" (UniqueName: \"kubernetes.io/projected/88a73ab6-1d70-4636-b819-489f0579e395-kube-api-access-q5qlv\") pod \"community-operators-8p2ks\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.242903 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-catalog-content\") pod \"community-operators-8p2ks\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.242946 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-utilities\") pod \"community-operators-8p2ks\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.243218 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-catalog-content\") pod \"community-operators-8p2ks\" (UID: 
\"88a73ab6-1d70-4636-b819-489f0579e395\") " pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.261257 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5qlv\" (UniqueName: \"kubernetes.io/projected/88a73ab6-1d70-4636-b819-489f0579e395-kube-api-access-q5qlv\") pod \"community-operators-8p2ks\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.352586 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.832961 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p2ks"] Dec 03 07:25:59 crc kubenswrapper[4947]: W1203 07:25:59.837144 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88a73ab6_1d70_4636_b819_489f0579e395.slice/crio-2e6007c141ae5cdf3a24ab3215536b296db99570acaeba890dba8dfa9de9c7ce WatchSource:0}: Error finding container 2e6007c141ae5cdf3a24ab3215536b296db99570acaeba890dba8dfa9de9c7ce: Status 404 returned error can't find the container with id 2e6007c141ae5cdf3a24ab3215536b296db99570acaeba890dba8dfa9de9c7ce Dec 03 07:25:59 crc kubenswrapper[4947]: I1203 07:25:59.945365 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p2ks" event={"ID":"88a73ab6-1d70-4636-b819-489f0579e395","Type":"ContainerStarted","Data":"2e6007c141ae5cdf3a24ab3215536b296db99570acaeba890dba8dfa9de9c7ce"} Dec 03 07:26:00 crc kubenswrapper[4947]: I1203 07:26:00.957375 4947 generic.go:334] "Generic (PLEG): container finished" podID="88a73ab6-1d70-4636-b819-489f0579e395" containerID="2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c" exitCode=0 Dec 03 07:26:00 
crc kubenswrapper[4947]: I1203 07:26:00.957755 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p2ks" event={"ID":"88a73ab6-1d70-4636-b819-489f0579e395","Type":"ContainerDied","Data":"2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c"} Dec 03 07:26:01 crc kubenswrapper[4947]: I1203 07:26:01.966910 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p2ks" event={"ID":"88a73ab6-1d70-4636-b819-489f0579e395","Type":"ContainerStarted","Data":"06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69"} Dec 03 07:26:02 crc kubenswrapper[4947]: I1203 07:26:02.982274 4947 generic.go:334] "Generic (PLEG): container finished" podID="88a73ab6-1d70-4636-b819-489f0579e395" containerID="06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69" exitCode=0 Dec 03 07:26:02 crc kubenswrapper[4947]: I1203 07:26:02.982395 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p2ks" event={"ID":"88a73ab6-1d70-4636-b819-489f0579e395","Type":"ContainerDied","Data":"06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69"} Dec 03 07:26:03 crc kubenswrapper[4947]: I1203 07:26:03.995324 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p2ks" event={"ID":"88a73ab6-1d70-4636-b819-489f0579e395","Type":"ContainerStarted","Data":"ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6"} Dec 03 07:26:04 crc kubenswrapper[4947]: I1203 07:26:04.023123 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8p2ks" podStartSLOduration=2.575810398 podStartE2EDuration="5.022958431s" podCreationTimestamp="2025-12-03 07:25:59 +0000 UTC" firstStartedPulling="2025-12-03 07:26:00.960094723 +0000 UTC m=+2222.221049179" lastFinishedPulling="2025-12-03 07:26:03.407242746 +0000 UTC m=+2224.668197212" 
observedRunningTime="2025-12-03 07:26:04.013930206 +0000 UTC m=+2225.274884662" watchObservedRunningTime="2025-12-03 07:26:04.022958431 +0000 UTC m=+2225.283912867" Dec 03 07:26:09 crc kubenswrapper[4947]: I1203 07:26:09.353485 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:26:09 crc kubenswrapper[4947]: I1203 07:26:09.353951 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:26:09 crc kubenswrapper[4947]: I1203 07:26:09.402725 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:26:10 crc kubenswrapper[4947]: I1203 07:26:10.082843 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:26:10 crc kubenswrapper[4947]: E1203 07:26:10.083418 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:26:10 crc kubenswrapper[4947]: I1203 07:26:10.120243 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:26:10 crc kubenswrapper[4947]: I1203 07:26:10.199154 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p2ks"] Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.060029 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8p2ks" 
podUID="88a73ab6-1d70-4636-b819-489f0579e395" containerName="registry-server" containerID="cri-o://ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6" gracePeriod=2 Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.551630 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.679677 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5qlv\" (UniqueName: \"kubernetes.io/projected/88a73ab6-1d70-4636-b819-489f0579e395-kube-api-access-q5qlv\") pod \"88a73ab6-1d70-4636-b819-489f0579e395\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.679887 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-catalog-content\") pod \"88a73ab6-1d70-4636-b819-489f0579e395\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.679937 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-utilities\") pod \"88a73ab6-1d70-4636-b819-489f0579e395\" (UID: \"88a73ab6-1d70-4636-b819-489f0579e395\") " Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.681053 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-utilities" (OuterVolumeSpecName: "utilities") pod "88a73ab6-1d70-4636-b819-489f0579e395" (UID: "88a73ab6-1d70-4636-b819-489f0579e395"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.687301 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a73ab6-1d70-4636-b819-489f0579e395-kube-api-access-q5qlv" (OuterVolumeSpecName: "kube-api-access-q5qlv") pod "88a73ab6-1d70-4636-b819-489f0579e395" (UID: "88a73ab6-1d70-4636-b819-489f0579e395"). InnerVolumeSpecName "kube-api-access-q5qlv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.726124 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88a73ab6-1d70-4636-b819-489f0579e395" (UID: "88a73ab6-1d70-4636-b819-489f0579e395"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.781556 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.781591 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88a73ab6-1d70-4636-b819-489f0579e395-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:26:12 crc kubenswrapper[4947]: I1203 07:26:12.781605 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5qlv\" (UniqueName: \"kubernetes.io/projected/88a73ab6-1d70-4636-b819-489f0579e395-kube-api-access-q5qlv\") on node \"crc\" DevicePath \"\"" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.070400 4947 generic.go:334] "Generic (PLEG): container finished" podID="88a73ab6-1d70-4636-b819-489f0579e395" 
containerID="ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6" exitCode=0 Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.070459 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p2ks" event={"ID":"88a73ab6-1d70-4636-b819-489f0579e395","Type":"ContainerDied","Data":"ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6"} Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.070520 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p2ks" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.070550 4947 scope.go:117] "RemoveContainer" containerID="ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.070534 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p2ks" event={"ID":"88a73ab6-1d70-4636-b819-489f0579e395","Type":"ContainerDied","Data":"2e6007c141ae5cdf3a24ab3215536b296db99570acaeba890dba8dfa9de9c7ce"} Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.113607 4947 scope.go:117] "RemoveContainer" containerID="06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.122304 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p2ks"] Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.125204 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8p2ks"] Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.143547 4947 scope.go:117] "RemoveContainer" containerID="2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.172053 4947 scope.go:117] "RemoveContainer" containerID="ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6" Dec 03 
07:26:13 crc kubenswrapper[4947]: E1203 07:26:13.172474 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6\": container with ID starting with ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6 not found: ID does not exist" containerID="ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.172535 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6"} err="failed to get container status \"ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6\": rpc error: code = NotFound desc = could not find container \"ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6\": container with ID starting with ad6830f093509b2edd231385057da96df7bca7a2b1bbf73a64d4a6b61d5381d6 not found: ID does not exist" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.172750 4947 scope.go:117] "RemoveContainer" containerID="06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69" Dec 03 07:26:13 crc kubenswrapper[4947]: E1203 07:26:13.173171 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69\": container with ID starting with 06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69 not found: ID does not exist" containerID="06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.173217 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69"} err="failed to get container status 
\"06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69\": rpc error: code = NotFound desc = could not find container \"06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69\": container with ID starting with 06c3446aa0ca747535cf981f0bb10d25e2b28d28115693bc64158e54f9e0ed69 not found: ID does not exist" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.173249 4947 scope.go:117] "RemoveContainer" containerID="2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c" Dec 03 07:26:13 crc kubenswrapper[4947]: E1203 07:26:13.173787 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c\": container with ID starting with 2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c not found: ID does not exist" containerID="2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c" Dec 03 07:26:13 crc kubenswrapper[4947]: I1203 07:26:13.173812 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c"} err="failed to get container status \"2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c\": rpc error: code = NotFound desc = could not find container \"2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c\": container with ID starting with 2cf94279076cf7f636ca535071565bdb242a32123ff88f5478a1658d432eb81c not found: ID does not exist" Dec 03 07:26:15 crc kubenswrapper[4947]: I1203 07:26:15.099023 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a73ab6-1d70-4636-b819-489f0579e395" path="/var/lib/kubelet/pods/88a73ab6-1d70-4636-b819-489f0579e395/volumes" Dec 03 07:26:25 crc kubenswrapper[4947]: I1203 07:26:25.083025 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 
07:26:25 crc kubenswrapper[4947]: E1203 07:26:25.084066 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:26:39 crc kubenswrapper[4947]: I1203 07:26:39.091439 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:26:39 crc kubenswrapper[4947]: E1203 07:26:39.092594 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:26:50 crc kubenswrapper[4947]: I1203 07:26:50.083867 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:26:50 crc kubenswrapper[4947]: E1203 07:26:50.084913 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:27:01 crc kubenswrapper[4947]: I1203 07:27:01.083470 4947 scope.go:117] "RemoveContainer" 
containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:27:01 crc kubenswrapper[4947]: E1203 07:27:01.084478 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:27:15 crc kubenswrapper[4947]: I1203 07:27:15.083624 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:27:15 crc kubenswrapper[4947]: E1203 07:27:15.084901 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:27:28 crc kubenswrapper[4947]: I1203 07:27:28.083627 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:27:28 crc kubenswrapper[4947]: E1203 07:27:28.084637 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:27:40 crc kubenswrapper[4947]: I1203 07:27:40.084354 4947 scope.go:117] 
"RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:27:40 crc kubenswrapper[4947]: E1203 07:27:40.085545 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:27:54 crc kubenswrapper[4947]: I1203 07:27:54.083675 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:27:54 crc kubenswrapper[4947]: E1203 07:27:54.084644 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:28:05 crc kubenswrapper[4947]: I1203 07:28:05.083712 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:28:05 crc kubenswrapper[4947]: E1203 07:28:05.084348 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:28:20 crc kubenswrapper[4947]: I1203 07:28:20.083008 
4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:28:20 crc kubenswrapper[4947]: E1203 07:28:20.084058 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:28:33 crc kubenswrapper[4947]: I1203 07:28:33.084384 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:28:33 crc kubenswrapper[4947]: E1203 07:28:33.085396 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:28:47 crc kubenswrapper[4947]: I1203 07:28:47.083450 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:28:47 crc kubenswrapper[4947]: E1203 07:28:47.084544 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:29:02 crc kubenswrapper[4947]: I1203 
07:29:02.083355 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:29:02 crc kubenswrapper[4947]: E1203 07:29:02.084426 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:29:13 crc kubenswrapper[4947]: I1203 07:29:13.083594 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:29:13 crc kubenswrapper[4947]: E1203 07:29:13.084271 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:29:27 crc kubenswrapper[4947]: I1203 07:29:27.083587 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:29:27 crc kubenswrapper[4947]: E1203 07:29:27.084924 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:29:39 crc 
kubenswrapper[4947]: I1203 07:29:39.090385 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:29:39 crc kubenswrapper[4947]: E1203 07:29:39.091053 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.809900 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ph7zr"] Dec 03 07:29:48 crc kubenswrapper[4947]: E1203 07:29:48.810934 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a73ab6-1d70-4636-b819-489f0579e395" containerName="extract-content" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.810953 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a73ab6-1d70-4636-b819-489f0579e395" containerName="extract-content" Dec 03 07:29:48 crc kubenswrapper[4947]: E1203 07:29:48.810984 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a73ab6-1d70-4636-b819-489f0579e395" containerName="extract-utilities" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.810994 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a73ab6-1d70-4636-b819-489f0579e395" containerName="extract-utilities" Dec 03 07:29:48 crc kubenswrapper[4947]: E1203 07:29:48.811009 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a73ab6-1d70-4636-b819-489f0579e395" containerName="registry-server" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.811022 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a73ab6-1d70-4636-b819-489f0579e395" 
containerName="registry-server" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.811250 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a73ab6-1d70-4636-b819-489f0579e395" containerName="registry-server" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.812840 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.824119 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ph7zr"] Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.894181 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-utilities\") pod \"redhat-operators-ph7zr\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.894232 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-catalog-content\") pod \"redhat-operators-ph7zr\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.894306 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzr8m\" (UniqueName: \"kubernetes.io/projected/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-kube-api-access-lzr8m\") pod \"redhat-operators-ph7zr\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.995888 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-utilities\") pod \"redhat-operators-ph7zr\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.995926 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-catalog-content\") pod \"redhat-operators-ph7zr\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.995963 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzr8m\" (UniqueName: \"kubernetes.io/projected/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-kube-api-access-lzr8m\") pod \"redhat-operators-ph7zr\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.996415 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-catalog-content\") pod \"redhat-operators-ph7zr\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:48 crc kubenswrapper[4947]: I1203 07:29:48.996564 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-utilities\") pod \"redhat-operators-ph7zr\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:49 crc kubenswrapper[4947]: I1203 07:29:49.020731 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzr8m\" (UniqueName: 
\"kubernetes.io/projected/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-kube-api-access-lzr8m\") pod \"redhat-operators-ph7zr\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:49 crc kubenswrapper[4947]: I1203 07:29:49.181952 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:49 crc kubenswrapper[4947]: I1203 07:29:49.621395 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ph7zr"] Dec 03 07:29:49 crc kubenswrapper[4947]: W1203 07:29:49.627168 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f9bf58a_fcf0_4fe9_9e9b_bf3c454293a4.slice/crio-212770cd461f9eedecf8d1bd50b5b31c0de9ff61d3371890594893b325d0f370 WatchSource:0}: Error finding container 212770cd461f9eedecf8d1bd50b5b31c0de9ff61d3371890594893b325d0f370: Status 404 returned error can't find the container with id 212770cd461f9eedecf8d1bd50b5b31c0de9ff61d3371890594893b325d0f370 Dec 03 07:29:50 crc kubenswrapper[4947]: I1203 07:29:50.106121 4947 generic.go:334] "Generic (PLEG): container finished" podID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerID="e09709c837db505361c143702e68005f3d5260eef5a82301f227031a26ec2a97" exitCode=0 Dec 03 07:29:50 crc kubenswrapper[4947]: I1203 07:29:50.106188 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph7zr" event={"ID":"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4","Type":"ContainerDied","Data":"e09709c837db505361c143702e68005f3d5260eef5a82301f227031a26ec2a97"} Dec 03 07:29:50 crc kubenswrapper[4947]: I1203 07:29:50.106232 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph7zr" 
event={"ID":"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4","Type":"ContainerStarted","Data":"212770cd461f9eedecf8d1bd50b5b31c0de9ff61d3371890594893b325d0f370"} Dec 03 07:29:50 crc kubenswrapper[4947]: I1203 07:29:50.109378 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:29:51 crc kubenswrapper[4947]: I1203 07:29:51.083002 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:29:51 crc kubenswrapper[4947]: E1203 07:29:51.083539 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:29:51 crc kubenswrapper[4947]: I1203 07:29:51.119443 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph7zr" event={"ID":"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4","Type":"ContainerStarted","Data":"15c6ffcee5bcd1f050d316f69af91b921c15d9b2524d132f859a1fa449644920"} Dec 03 07:29:52 crc kubenswrapper[4947]: I1203 07:29:52.133693 4947 generic.go:334] "Generic (PLEG): container finished" podID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerID="15c6ffcee5bcd1f050d316f69af91b921c15d9b2524d132f859a1fa449644920" exitCode=0 Dec 03 07:29:52 crc kubenswrapper[4947]: I1203 07:29:52.134567 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph7zr" event={"ID":"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4","Type":"ContainerDied","Data":"15c6ffcee5bcd1f050d316f69af91b921c15d9b2524d132f859a1fa449644920"} Dec 03 07:29:53 crc kubenswrapper[4947]: I1203 07:29:53.147263 4947 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-ph7zr" event={"ID":"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4","Type":"ContainerStarted","Data":"1f5b19b4bbbb92a921cb6ac35aa5a0c283e3dc5ff06c54ea3c06b35615b3c09b"} Dec 03 07:29:53 crc kubenswrapper[4947]: I1203 07:29:53.173033 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ph7zr" podStartSLOduration=2.609491965 podStartE2EDuration="5.173010001s" podCreationTimestamp="2025-12-03 07:29:48 +0000 UTC" firstStartedPulling="2025-12-03 07:29:50.108848568 +0000 UTC m=+2451.369802994" lastFinishedPulling="2025-12-03 07:29:52.672366574 +0000 UTC m=+2453.933321030" observedRunningTime="2025-12-03 07:29:53.169807345 +0000 UTC m=+2454.430761821" watchObservedRunningTime="2025-12-03 07:29:53.173010001 +0000 UTC m=+2454.433964437" Dec 03 07:29:59 crc kubenswrapper[4947]: I1203 07:29:59.183081 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:59 crc kubenswrapper[4947]: I1203 07:29:59.183522 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:29:59 crc kubenswrapper[4947]: I1203 07:29:59.244817 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.165134 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf"] Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.166699 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.169099 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.169230 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.179710 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf"] Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.252239 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.256336 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-config-volume\") pod \"collect-profiles-29412450-t6xgf\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.256520 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-secret-volume\") pod \"collect-profiles-29412450-t6xgf\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.256682 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cp2l\" (UniqueName: 
\"kubernetes.io/projected/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-kube-api-access-6cp2l\") pod \"collect-profiles-29412450-t6xgf\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.293800 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ph7zr"] Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.357560 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cp2l\" (UniqueName: \"kubernetes.io/projected/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-kube-api-access-6cp2l\") pod \"collect-profiles-29412450-t6xgf\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.357655 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-config-volume\") pod \"collect-profiles-29412450-t6xgf\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.357721 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-secret-volume\") pod \"collect-profiles-29412450-t6xgf\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.359171 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-config-volume\") pod \"collect-profiles-29412450-t6xgf\" (UID: 
\"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.363614 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-secret-volume\") pod \"collect-profiles-29412450-t6xgf\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.388953 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cp2l\" (UniqueName: \"kubernetes.io/projected/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-kube-api-access-6cp2l\") pod \"collect-profiles-29412450-t6xgf\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:00 crc kubenswrapper[4947]: I1203 07:30:00.488319 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:01 crc kubenswrapper[4947]: W1203 07:30:01.025919 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647811b5_9ad1_4ead_a5a8_f9c3ee8790c2.slice/crio-216322feaa40472f53d107e876d3024e187bf197e92dcebdca35f8f22ba6289e WatchSource:0}: Error finding container 216322feaa40472f53d107e876d3024e187bf197e92dcebdca35f8f22ba6289e: Status 404 returned error can't find the container with id 216322feaa40472f53d107e876d3024e187bf197e92dcebdca35f8f22ba6289e Dec 03 07:30:01 crc kubenswrapper[4947]: I1203 07:30:01.026404 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf"] Dec 03 07:30:01 crc kubenswrapper[4947]: I1203 07:30:01.222471 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" event={"ID":"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2","Type":"ContainerStarted","Data":"216322feaa40472f53d107e876d3024e187bf197e92dcebdca35f8f22ba6289e"} Dec 03 07:30:02 crc kubenswrapper[4947]: I1203 07:30:02.235221 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ph7zr" podUID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerName="registry-server" containerID="cri-o://1f5b19b4bbbb92a921cb6ac35aa5a0c283e3dc5ff06c54ea3c06b35615b3c09b" gracePeriod=2 Dec 03 07:30:02 crc kubenswrapper[4947]: I1203 07:30:02.235542 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" event={"ID":"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2","Type":"ContainerStarted","Data":"1bd55e1d2ebef97b1663e41bd29d22afa379b89e4f1293601aaf0ac371c77016"} Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.259376 4947 generic.go:334] "Generic (PLEG): 
container finished" podID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerID="1f5b19b4bbbb92a921cb6ac35aa5a0c283e3dc5ff06c54ea3c06b35615b3c09b" exitCode=0 Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.259762 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph7zr" event={"ID":"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4","Type":"ContainerDied","Data":"1f5b19b4bbbb92a921cb6ac35aa5a0c283e3dc5ff06c54ea3c06b35615b3c09b"} Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.268675 4947 generic.go:334] "Generic (PLEG): container finished" podID="647811b5-9ad1-4ead-a5a8-f9c3ee8790c2" containerID="1bd55e1d2ebef97b1663e41bd29d22afa379b89e4f1293601aaf0ac371c77016" exitCode=0 Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.268725 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" event={"ID":"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2","Type":"ContainerDied","Data":"1bd55e1d2ebef97b1663e41bd29d22afa379b89e4f1293601aaf0ac371c77016"} Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.410872 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.506446 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzr8m\" (UniqueName: \"kubernetes.io/projected/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-kube-api-access-lzr8m\") pod \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.506542 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-catalog-content\") pod \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.506687 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-utilities\") pod \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\" (UID: \"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4\") " Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.507957 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-utilities" (OuterVolumeSpecName: "utilities") pod "8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" (UID: "8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.513775 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-kube-api-access-lzr8m" (OuterVolumeSpecName: "kube-api-access-lzr8m") pod "8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" (UID: "8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4"). InnerVolumeSpecName "kube-api-access-lzr8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.608390 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzr8m\" (UniqueName: \"kubernetes.io/projected/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-kube-api-access-lzr8m\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.608437 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.623094 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" (UID: "8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:30:03 crc kubenswrapper[4947]: I1203 07:30:03.710138 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.280055 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ph7zr" event={"ID":"8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4","Type":"ContainerDied","Data":"212770cd461f9eedecf8d1bd50b5b31c0de9ff61d3371890594893b325d0f370"} Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.280131 4947 scope.go:117] "RemoveContainer" containerID="1f5b19b4bbbb92a921cb6ac35aa5a0c283e3dc5ff06c54ea3c06b35615b3c09b" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.280234 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ph7zr" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.320719 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ph7zr"] Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.321745 4947 scope.go:117] "RemoveContainer" containerID="15c6ffcee5bcd1f050d316f69af91b921c15d9b2524d132f859a1fa449644920" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.325611 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ph7zr"] Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.366693 4947 scope.go:117] "RemoveContainer" containerID="e09709c837db505361c143702e68005f3d5260eef5a82301f227031a26ec2a97" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.585994 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.724401 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-secret-volume\") pod \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.724469 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-config-volume\") pod \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.724629 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cp2l\" (UniqueName: \"kubernetes.io/projected/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-kube-api-access-6cp2l\") pod 
\"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\" (UID: \"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2\") " Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.725047 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-config-volume" (OuterVolumeSpecName: "config-volume") pod "647811b5-9ad1-4ead-a5a8-f9c3ee8790c2" (UID: "647811b5-9ad1-4ead-a5a8-f9c3ee8790c2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.727584 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "647811b5-9ad1-4ead-a5a8-f9c3ee8790c2" (UID: "647811b5-9ad1-4ead-a5a8-f9c3ee8790c2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.727940 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-kube-api-access-6cp2l" (OuterVolumeSpecName: "kube-api-access-6cp2l") pod "647811b5-9ad1-4ead-a5a8-f9c3ee8790c2" (UID: "647811b5-9ad1-4ead-a5a8-f9c3ee8790c2"). InnerVolumeSpecName "kube-api-access-6cp2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.826845 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cp2l\" (UniqueName: \"kubernetes.io/projected/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-kube-api-access-6cp2l\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.826878 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:04 crc kubenswrapper[4947]: I1203 07:30:04.826888 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:05 crc kubenswrapper[4947]: I1203 07:30:05.090854 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" path="/var/lib/kubelet/pods/8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4/volumes" Dec 03 07:30:05 crc kubenswrapper[4947]: I1203 07:30:05.289340 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" event={"ID":"647811b5-9ad1-4ead-a5a8-f9c3ee8790c2","Type":"ContainerDied","Data":"216322feaa40472f53d107e876d3024e187bf197e92dcebdca35f8f22ba6289e"} Dec 03 07:30:05 crc kubenswrapper[4947]: I1203 07:30:05.289409 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="216322feaa40472f53d107e876d3024e187bf197e92dcebdca35f8f22ba6289e" Dec 03 07:30:05 crc kubenswrapper[4947]: I1203 07:30:05.289419 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf" Dec 03 07:30:05 crc kubenswrapper[4947]: I1203 07:30:05.676982 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p"] Dec 03 07:30:05 crc kubenswrapper[4947]: I1203 07:30:05.691174 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-w524p"] Dec 03 07:30:06 crc kubenswrapper[4947]: I1203 07:30:06.083127 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:30:06 crc kubenswrapper[4947]: E1203 07:30:06.083521 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:30:07 crc kubenswrapper[4947]: I1203 07:30:07.100459 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c0cc40-7389-4703-bf34-d42b6bf32710" path="/var/lib/kubelet/pods/b9c0cc40-7389-4703-bf34-d42b6bf32710/volumes" Dec 03 07:30:21 crc kubenswrapper[4947]: I1203 07:30:21.083987 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:30:21 crc kubenswrapper[4947]: E1203 07:30:21.085074 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:30:33 crc kubenswrapper[4947]: I1203 07:30:33.083023 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:30:33 crc kubenswrapper[4947]: I1203 07:30:33.542156 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"d265391ff7585ca11382f591c2bda4b86b5e500198d54c1f83a8d9c246635dde"} Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.768625 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-58cjs"] Dec 03 07:30:55 crc kubenswrapper[4947]: E1203 07:30:55.769706 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerName="extract-utilities" Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.769731 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerName="extract-utilities" Dec 03 07:30:55 crc kubenswrapper[4947]: E1203 07:30:55.769762 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerName="registry-server" Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.769775 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerName="registry-server" Dec 03 07:30:55 crc kubenswrapper[4947]: E1203 07:30:55.769793 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerName="extract-content" Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.769806 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerName="extract-content" Dec 
03 07:30:55 crc kubenswrapper[4947]: E1203 07:30:55.769858 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647811b5-9ad1-4ead-a5a8-f9c3ee8790c2" containerName="collect-profiles" Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.769870 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="647811b5-9ad1-4ead-a5a8-f9c3ee8790c2" containerName="collect-profiles" Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.770117 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f9bf58a-fcf0-4fe9-9e9b-bf3c454293a4" containerName="registry-server" Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.770146 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="647811b5-9ad1-4ead-a5a8-f9c3ee8790c2" containerName="collect-profiles" Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.773079 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.779247 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-58cjs"] Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.917301 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-utilities\") pod \"redhat-marketplace-58cjs\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:55 crc kubenswrapper[4947]: I1203 07:30:55.917350 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-catalog-content\") pod \"redhat-marketplace-58cjs\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:55 
crc kubenswrapper[4947]: I1203 07:30:55.917409 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmz4w\" (UniqueName: \"kubernetes.io/projected/cc1b399d-b686-418e-a62a-e41892d377ab-kube-api-access-rmz4w\") pod \"redhat-marketplace-58cjs\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:56 crc kubenswrapper[4947]: I1203 07:30:56.019118 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmz4w\" (UniqueName: \"kubernetes.io/projected/cc1b399d-b686-418e-a62a-e41892d377ab-kube-api-access-rmz4w\") pod \"redhat-marketplace-58cjs\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:56 crc kubenswrapper[4947]: I1203 07:30:56.019204 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-utilities\") pod \"redhat-marketplace-58cjs\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:56 crc kubenswrapper[4947]: I1203 07:30:56.019242 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-catalog-content\") pod \"redhat-marketplace-58cjs\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:56 crc kubenswrapper[4947]: I1203 07:30:56.019774 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-catalog-content\") pod \"redhat-marketplace-58cjs\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:56 
crc kubenswrapper[4947]: I1203 07:30:56.020178 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-utilities\") pod \"redhat-marketplace-58cjs\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:56 crc kubenswrapper[4947]: I1203 07:30:56.051412 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmz4w\" (UniqueName: \"kubernetes.io/projected/cc1b399d-b686-418e-a62a-e41892d377ab-kube-api-access-rmz4w\") pod \"redhat-marketplace-58cjs\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:56 crc kubenswrapper[4947]: I1203 07:30:56.098035 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:30:56 crc kubenswrapper[4947]: I1203 07:30:56.568255 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-58cjs"] Dec 03 07:30:56 crc kubenswrapper[4947]: W1203 07:30:56.575849 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc1b399d_b686_418e_a62a_e41892d377ab.slice/crio-bf9850f8480644bf9659f6aa0935915197c907f8279c178a2b5a08365b27499c WatchSource:0}: Error finding container bf9850f8480644bf9659f6aa0935915197c907f8279c178a2b5a08365b27499c: Status 404 returned error can't find the container with id bf9850f8480644bf9659f6aa0935915197c907f8279c178a2b5a08365b27499c Dec 03 07:30:56 crc kubenswrapper[4947]: I1203 07:30:56.811308 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58cjs" event={"ID":"cc1b399d-b686-418e-a62a-e41892d377ab","Type":"ContainerStarted","Data":"2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8"} Dec 03 07:30:56 
crc kubenswrapper[4947]: I1203 07:30:56.811660 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58cjs" event={"ID":"cc1b399d-b686-418e-a62a-e41892d377ab","Type":"ContainerStarted","Data":"bf9850f8480644bf9659f6aa0935915197c907f8279c178a2b5a08365b27499c"} Dec 03 07:30:57 crc kubenswrapper[4947]: I1203 07:30:57.824251 4947 generic.go:334] "Generic (PLEG): container finished" podID="cc1b399d-b686-418e-a62a-e41892d377ab" containerID="2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8" exitCode=0 Dec 03 07:30:57 crc kubenswrapper[4947]: I1203 07:30:57.824337 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58cjs" event={"ID":"cc1b399d-b686-418e-a62a-e41892d377ab","Type":"ContainerDied","Data":"2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8"} Dec 03 07:30:58 crc kubenswrapper[4947]: I1203 07:30:58.837590 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58cjs" event={"ID":"cc1b399d-b686-418e-a62a-e41892d377ab","Type":"ContainerStarted","Data":"584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5"} Dec 03 07:30:59 crc kubenswrapper[4947]: I1203 07:30:59.845816 4947 generic.go:334] "Generic (PLEG): container finished" podID="cc1b399d-b686-418e-a62a-e41892d377ab" containerID="584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5" exitCode=0 Dec 03 07:30:59 crc kubenswrapper[4947]: I1203 07:30:59.845880 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58cjs" event={"ID":"cc1b399d-b686-418e-a62a-e41892d377ab","Type":"ContainerDied","Data":"584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5"} Dec 03 07:31:01 crc kubenswrapper[4947]: I1203 07:31:01.864179 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58cjs" 
event={"ID":"cc1b399d-b686-418e-a62a-e41892d377ab","Type":"ContainerStarted","Data":"a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511"} Dec 03 07:31:01 crc kubenswrapper[4947]: I1203 07:31:01.885069 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-58cjs" podStartSLOduration=3.26237775 podStartE2EDuration="6.885051099s" podCreationTimestamp="2025-12-03 07:30:55 +0000 UTC" firstStartedPulling="2025-12-03 07:30:57.828130141 +0000 UTC m=+2519.089084617" lastFinishedPulling="2025-12-03 07:31:01.4508035 +0000 UTC m=+2522.711757966" observedRunningTime="2025-12-03 07:31:01.877823423 +0000 UTC m=+2523.138777879" watchObservedRunningTime="2025-12-03 07:31:01.885051099 +0000 UTC m=+2523.146005535" Dec 03 07:31:04 crc kubenswrapper[4947]: I1203 07:31:04.670585 4947 scope.go:117] "RemoveContainer" containerID="8f90dfa482cde9093d9723fd092a4ce4a48f644a83f95eea475576a5de33973c" Dec 03 07:31:06 crc kubenswrapper[4947]: I1203 07:31:06.099118 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:31:06 crc kubenswrapper[4947]: I1203 07:31:06.099448 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:31:06 crc kubenswrapper[4947]: I1203 07:31:06.141998 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:31:06 crc kubenswrapper[4947]: I1203 07:31:06.961920 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:31:07 crc kubenswrapper[4947]: I1203 07:31:07.016383 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-58cjs"] Dec 03 07:31:08 crc kubenswrapper[4947]: I1203 07:31:08.926846 4947 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-58cjs" podUID="cc1b399d-b686-418e-a62a-e41892d377ab" containerName="registry-server" containerID="cri-o://a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511" gracePeriod=2 Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.416877 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.521705 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-catalog-content\") pod \"cc1b399d-b686-418e-a62a-e41892d377ab\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.521799 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-utilities\") pod \"cc1b399d-b686-418e-a62a-e41892d377ab\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.522575 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmz4w\" (UniqueName: \"kubernetes.io/projected/cc1b399d-b686-418e-a62a-e41892d377ab-kube-api-access-rmz4w\") pod \"cc1b399d-b686-418e-a62a-e41892d377ab\" (UID: \"cc1b399d-b686-418e-a62a-e41892d377ab\") " Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.522680 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-utilities" (OuterVolumeSpecName: "utilities") pod "cc1b399d-b686-418e-a62a-e41892d377ab" (UID: "cc1b399d-b686-418e-a62a-e41892d377ab"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.522871 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.532734 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc1b399d-b686-418e-a62a-e41892d377ab-kube-api-access-rmz4w" (OuterVolumeSpecName: "kube-api-access-rmz4w") pod "cc1b399d-b686-418e-a62a-e41892d377ab" (UID: "cc1b399d-b686-418e-a62a-e41892d377ab"). InnerVolumeSpecName "kube-api-access-rmz4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.557457 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc1b399d-b686-418e-a62a-e41892d377ab" (UID: "cc1b399d-b686-418e-a62a-e41892d377ab"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.623810 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc1b399d-b686-418e-a62a-e41892d377ab-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.623845 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmz4w\" (UniqueName: \"kubernetes.io/projected/cc1b399d-b686-418e-a62a-e41892d377ab-kube-api-access-rmz4w\") on node \"crc\" DevicePath \"\"" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.939411 4947 generic.go:334] "Generic (PLEG): container finished" podID="cc1b399d-b686-418e-a62a-e41892d377ab" containerID="a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511" exitCode=0 Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.939465 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58cjs" event={"ID":"cc1b399d-b686-418e-a62a-e41892d377ab","Type":"ContainerDied","Data":"a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511"} Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.939545 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-58cjs" event={"ID":"cc1b399d-b686-418e-a62a-e41892d377ab","Type":"ContainerDied","Data":"bf9850f8480644bf9659f6aa0935915197c907f8279c178a2b5a08365b27499c"} Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.939542 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-58cjs" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.939568 4947 scope.go:117] "RemoveContainer" containerID="a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.964328 4947 scope.go:117] "RemoveContainer" containerID="584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5" Dec 03 07:31:09 crc kubenswrapper[4947]: I1203 07:31:09.988025 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-58cjs"] Dec 03 07:31:10 crc kubenswrapper[4947]: I1203 07:31:10.004222 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-58cjs"] Dec 03 07:31:10 crc kubenswrapper[4947]: I1203 07:31:10.011245 4947 scope.go:117] "RemoveContainer" containerID="2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8" Dec 03 07:31:10 crc kubenswrapper[4947]: I1203 07:31:10.045455 4947 scope.go:117] "RemoveContainer" containerID="a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511" Dec 03 07:31:10 crc kubenswrapper[4947]: E1203 07:31:10.045873 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511\": container with ID starting with a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511 not found: ID does not exist" containerID="a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511" Dec 03 07:31:10 crc kubenswrapper[4947]: I1203 07:31:10.045921 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511"} err="failed to get container status \"a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511\": rpc error: code = NotFound desc = could not find container 
\"a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511\": container with ID starting with a3db2e08af0e9032f67c901d49be224eddb5ef0e9840fca1023b478415608511 not found: ID does not exist" Dec 03 07:31:10 crc kubenswrapper[4947]: I1203 07:31:10.045950 4947 scope.go:117] "RemoveContainer" containerID="584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5" Dec 03 07:31:10 crc kubenswrapper[4947]: E1203 07:31:10.046360 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5\": container with ID starting with 584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5 not found: ID does not exist" containerID="584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5" Dec 03 07:31:10 crc kubenswrapper[4947]: I1203 07:31:10.046561 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5"} err="failed to get container status \"584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5\": rpc error: code = NotFound desc = could not find container \"584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5\": container with ID starting with 584dea7807cfdbeebaa2fc13cb3e839ef4462c565a8aae2674f53701a57e9ba5 not found: ID does not exist" Dec 03 07:31:10 crc kubenswrapper[4947]: I1203 07:31:10.046721 4947 scope.go:117] "RemoveContainer" containerID="2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8" Dec 03 07:31:10 crc kubenswrapper[4947]: E1203 07:31:10.047137 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8\": container with ID starting with 2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8 not found: ID does not exist" 
containerID="2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8" Dec 03 07:31:10 crc kubenswrapper[4947]: I1203 07:31:10.047163 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8"} err="failed to get container status \"2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8\": rpc error: code = NotFound desc = could not find container \"2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8\": container with ID starting with 2ebf0140bf9cd7ca4d3538fc45de101369ed584d844ed9cfd6aa70c56f367bf8 not found: ID does not exist" Dec 03 07:31:11 crc kubenswrapper[4947]: I1203 07:31:11.092882 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc1b399d-b686-418e-a62a-e41892d377ab" path="/var/lib/kubelet/pods/cc1b399d-b686-418e-a62a-e41892d377ab/volumes" Dec 03 07:33:00 crc kubenswrapper[4947]: I1203 07:33:00.086344 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:33:00 crc kubenswrapper[4947]: I1203 07:33:00.086878 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:33:30 crc kubenswrapper[4947]: I1203 07:33:30.086435 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 03 07:33:30 crc kubenswrapper[4947]: I1203 07:33:30.087053 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:34:00 crc kubenswrapper[4947]: I1203 07:34:00.086174 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:34:00 crc kubenswrapper[4947]: I1203 07:34:00.087074 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:34:00 crc kubenswrapper[4947]: I1203 07:34:00.087139 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:34:00 crc kubenswrapper[4947]: I1203 07:34:00.087878 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d265391ff7585ca11382f591c2bda4b86b5e500198d54c1f83a8d9c246635dde"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:34:00 crc kubenswrapper[4947]: I1203 07:34:00.087976 4947 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://d265391ff7585ca11382f591c2bda4b86b5e500198d54c1f83a8d9c246635dde" gracePeriod=600 Dec 03 07:34:00 crc kubenswrapper[4947]: I1203 07:34:00.571216 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="d265391ff7585ca11382f591c2bda4b86b5e500198d54c1f83a8d9c246635dde" exitCode=0 Dec 03 07:34:00 crc kubenswrapper[4947]: I1203 07:34:00.571408 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"d265391ff7585ca11382f591c2bda4b86b5e500198d54c1f83a8d9c246635dde"} Dec 03 07:34:00 crc kubenswrapper[4947]: I1203 07:34:00.572011 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"} Dec 03 07:34:00 crc kubenswrapper[4947]: I1203 07:34:00.572086 4947 scope.go:117] "RemoveContainer" containerID="3f37d1085bcc96f347834d1f3fdc0d188e92d017b479daed87a81a33080a534c" Dec 03 07:34:38 crc kubenswrapper[4947]: I1203 07:34:38.905461 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rxhrn"] Dec 03 07:34:38 crc kubenswrapper[4947]: E1203 07:34:38.907402 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1b399d-b686-418e-a62a-e41892d377ab" containerName="extract-content" Dec 03 07:34:38 crc kubenswrapper[4947]: I1203 07:34:38.907421 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1b399d-b686-418e-a62a-e41892d377ab" containerName="extract-content" Dec 03 07:34:38 crc kubenswrapper[4947]: E1203 07:34:38.907458 4947 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1b399d-b686-418e-a62a-e41892d377ab" containerName="extract-utilities" Dec 03 07:34:38 crc kubenswrapper[4947]: I1203 07:34:38.907467 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1b399d-b686-418e-a62a-e41892d377ab" containerName="extract-utilities" Dec 03 07:34:38 crc kubenswrapper[4947]: E1203 07:34:38.907483 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc1b399d-b686-418e-a62a-e41892d377ab" containerName="registry-server" Dec 03 07:34:38 crc kubenswrapper[4947]: I1203 07:34:38.907511 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc1b399d-b686-418e-a62a-e41892d377ab" containerName="registry-server" Dec 03 07:34:38 crc kubenswrapper[4947]: I1203 07:34:38.907693 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc1b399d-b686-418e-a62a-e41892d377ab" containerName="registry-server" Dec 03 07:34:38 crc kubenswrapper[4947]: I1203 07:34:38.909048 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:38 crc kubenswrapper[4947]: I1203 07:34:38.932761 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rxhrn"] Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.080051 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sfw9\" (UniqueName: \"kubernetes.io/projected/f043c98e-51f2-49cf-a929-04c0683cc0fc-kube-api-access-6sfw9\") pod \"certified-operators-rxhrn\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.080140 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-catalog-content\") pod \"certified-operators-rxhrn\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.080166 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-utilities\") pod \"certified-operators-rxhrn\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.181181 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-catalog-content\") pod \"certified-operators-rxhrn\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.181241 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-utilities\") pod \"certified-operators-rxhrn\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.181328 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sfw9\" (UniqueName: \"kubernetes.io/projected/f043c98e-51f2-49cf-a929-04c0683cc0fc-kube-api-access-6sfw9\") pod \"certified-operators-rxhrn\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.182180 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-catalog-content\") pod \"certified-operators-rxhrn\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.182325 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-utilities\") pod \"certified-operators-rxhrn\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.201575 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sfw9\" (UniqueName: \"kubernetes.io/projected/f043c98e-51f2-49cf-a929-04c0683cc0fc-kube-api-access-6sfw9\") pod \"certified-operators-rxhrn\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.231873 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.514058 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rxhrn"] Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.948637 4947 generic.go:334] "Generic (PLEG): container finished" podID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerID="17556f4df644332f2eec162608b2a3db2cd08ba34ea15c3f57bd56b9f26fde17" exitCode=0 Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.948687 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxhrn" event={"ID":"f043c98e-51f2-49cf-a929-04c0683cc0fc","Type":"ContainerDied","Data":"17556f4df644332f2eec162608b2a3db2cd08ba34ea15c3f57bd56b9f26fde17"} Dec 03 07:34:39 crc kubenswrapper[4947]: I1203 07:34:39.948716 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxhrn" event={"ID":"f043c98e-51f2-49cf-a929-04c0683cc0fc","Type":"ContainerStarted","Data":"a03da6050ef6736cf5acbf92ab3bdd74682daac79e667100d3a664812aec19a6"} Dec 03 07:34:40 crc kubenswrapper[4947]: I1203 07:34:40.958584 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxhrn" event={"ID":"f043c98e-51f2-49cf-a929-04c0683cc0fc","Type":"ContainerStarted","Data":"00343adf365a723df805c2e812b410276a2687f76ea707b3e837c623a527094f"} Dec 03 07:34:41 crc kubenswrapper[4947]: I1203 07:34:41.969551 4947 generic.go:334] "Generic (PLEG): container finished" podID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerID="00343adf365a723df805c2e812b410276a2687f76ea707b3e837c623a527094f" exitCode=0 Dec 03 07:34:41 crc kubenswrapper[4947]: I1203 07:34:41.969616 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxhrn" 
event={"ID":"f043c98e-51f2-49cf-a929-04c0683cc0fc","Type":"ContainerDied","Data":"00343adf365a723df805c2e812b410276a2687f76ea707b3e837c623a527094f"} Dec 03 07:34:42 crc kubenswrapper[4947]: I1203 07:34:42.981368 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxhrn" event={"ID":"f043c98e-51f2-49cf-a929-04c0683cc0fc","Type":"ContainerStarted","Data":"5c37672b9e7059e0fe31e6bf699113b733f8969312a57d178eb2b1efde0ad529"} Dec 03 07:34:43 crc kubenswrapper[4947]: I1203 07:34:43.016702 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rxhrn" podStartSLOduration=2.460539262 podStartE2EDuration="5.016678158s" podCreationTimestamp="2025-12-03 07:34:38 +0000 UTC" firstStartedPulling="2025-12-03 07:34:39.950328333 +0000 UTC m=+2741.211282769" lastFinishedPulling="2025-12-03 07:34:42.506467189 +0000 UTC m=+2743.767421665" observedRunningTime="2025-12-03 07:34:43.007136149 +0000 UTC m=+2744.268090635" watchObservedRunningTime="2025-12-03 07:34:43.016678158 +0000 UTC m=+2744.277632604" Dec 03 07:34:49 crc kubenswrapper[4947]: I1203 07:34:49.233047 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:49 crc kubenswrapper[4947]: I1203 07:34:49.234091 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:49 crc kubenswrapper[4947]: I1203 07:34:49.309995 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:50 crc kubenswrapper[4947]: I1203 07:34:50.091368 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:50 crc kubenswrapper[4947]: I1203 07:34:50.149937 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-rxhrn"] Dec 03 07:34:52 crc kubenswrapper[4947]: I1203 07:34:52.064877 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rxhrn" podUID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerName="registry-server" containerID="cri-o://5c37672b9e7059e0fe31e6bf699113b733f8969312a57d178eb2b1efde0ad529" gracePeriod=2 Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.080028 4947 generic.go:334] "Generic (PLEG): container finished" podID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerID="5c37672b9e7059e0fe31e6bf699113b733f8969312a57d178eb2b1efde0ad529" exitCode=0 Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.080073 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxhrn" event={"ID":"f043c98e-51f2-49cf-a929-04c0683cc0fc","Type":"ContainerDied","Data":"5c37672b9e7059e0fe31e6bf699113b733f8969312a57d178eb2b1efde0ad529"} Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.596410 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.629803 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sfw9\" (UniqueName: \"kubernetes.io/projected/f043c98e-51f2-49cf-a929-04c0683cc0fc-kube-api-access-6sfw9\") pod \"f043c98e-51f2-49cf-a929-04c0683cc0fc\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.630654 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-catalog-content\") pod \"f043c98e-51f2-49cf-a929-04c0683cc0fc\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.630896 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-utilities\") pod \"f043c98e-51f2-49cf-a929-04c0683cc0fc\" (UID: \"f043c98e-51f2-49cf-a929-04c0683cc0fc\") " Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.631845 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-utilities" (OuterVolumeSpecName: "utilities") pod "f043c98e-51f2-49cf-a929-04c0683cc0fc" (UID: "f043c98e-51f2-49cf-a929-04c0683cc0fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.637878 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f043c98e-51f2-49cf-a929-04c0683cc0fc-kube-api-access-6sfw9" (OuterVolumeSpecName: "kube-api-access-6sfw9") pod "f043c98e-51f2-49cf-a929-04c0683cc0fc" (UID: "f043c98e-51f2-49cf-a929-04c0683cc0fc"). InnerVolumeSpecName "kube-api-access-6sfw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.703384 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f043c98e-51f2-49cf-a929-04c0683cc0fc" (UID: "f043c98e-51f2-49cf-a929-04c0683cc0fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.732479 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.732528 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f043c98e-51f2-49cf-a929-04c0683cc0fc-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:34:53 crc kubenswrapper[4947]: I1203 07:34:53.732543 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sfw9\" (UniqueName: \"kubernetes.io/projected/f043c98e-51f2-49cf-a929-04c0683cc0fc-kube-api-access-6sfw9\") on node \"crc\" DevicePath \"\"" Dec 03 07:34:54 crc kubenswrapper[4947]: I1203 07:34:54.094773 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rxhrn" event={"ID":"f043c98e-51f2-49cf-a929-04c0683cc0fc","Type":"ContainerDied","Data":"a03da6050ef6736cf5acbf92ab3bdd74682daac79e667100d3a664812aec19a6"} Dec 03 07:34:54 crc kubenswrapper[4947]: I1203 07:34:54.094863 4947 scope.go:117] "RemoveContainer" containerID="5c37672b9e7059e0fe31e6bf699113b733f8969312a57d178eb2b1efde0ad529" Dec 03 07:34:54 crc kubenswrapper[4947]: I1203 07:34:54.094881 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rxhrn" Dec 03 07:34:54 crc kubenswrapper[4947]: I1203 07:34:54.138033 4947 scope.go:117] "RemoveContainer" containerID="00343adf365a723df805c2e812b410276a2687f76ea707b3e837c623a527094f" Dec 03 07:34:54 crc kubenswrapper[4947]: I1203 07:34:54.158731 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rxhrn"] Dec 03 07:34:54 crc kubenswrapper[4947]: I1203 07:34:54.176206 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rxhrn"] Dec 03 07:34:54 crc kubenswrapper[4947]: I1203 07:34:54.179818 4947 scope.go:117] "RemoveContainer" containerID="17556f4df644332f2eec162608b2a3db2cd08ba34ea15c3f57bd56b9f26fde17" Dec 03 07:34:55 crc kubenswrapper[4947]: I1203 07:34:55.100714 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f043c98e-51f2-49cf-a929-04c0683cc0fc" path="/var/lib/kubelet/pods/f043c98e-51f2-49cf-a929-04c0683cc0fc/volumes" Dec 03 07:36:00 crc kubenswrapper[4947]: I1203 07:36:00.086319 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:36:00 crc kubenswrapper[4947]: I1203 07:36:00.086922 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:36:30 crc kubenswrapper[4947]: I1203 07:36:30.087062 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:36:30 crc kubenswrapper[4947]: I1203 07:36:30.087607 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:37:00 crc kubenswrapper[4947]: I1203 07:37:00.087009 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:37:00 crc kubenswrapper[4947]: I1203 07:37:00.087490 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:37:00 crc kubenswrapper[4947]: I1203 07:37:00.087550 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:37:00 crc kubenswrapper[4947]: I1203 07:37:00.087940 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:37:00 crc kubenswrapper[4947]: I1203 07:37:00.087996 4947 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" gracePeriod=600 Dec 03 07:37:00 crc kubenswrapper[4947]: E1203 07:37:00.293196 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:37:01 crc kubenswrapper[4947]: I1203 07:37:01.204973 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" exitCode=0 Dec 03 07:37:01 crc kubenswrapper[4947]: I1203 07:37:01.205038 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"} Dec 03 07:37:01 crc kubenswrapper[4947]: I1203 07:37:01.205089 4947 scope.go:117] "RemoveContainer" containerID="d265391ff7585ca11382f591c2bda4b86b5e500198d54c1f83a8d9c246635dde" Dec 03 07:37:01 crc kubenswrapper[4947]: I1203 07:37:01.205834 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:37:01 crc kubenswrapper[4947]: E1203 07:37:01.206296 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:37:14 crc kubenswrapper[4947]: I1203 07:37:14.083062 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:37:14 crc kubenswrapper[4947]: E1203 07:37:14.085410 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:37:26 crc kubenswrapper[4947]: I1203 07:37:26.083642 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:37:26 crc kubenswrapper[4947]: E1203 07:37:26.084678 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:37:40 crc kubenswrapper[4947]: I1203 07:37:40.083751 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:37:40 crc kubenswrapper[4947]: E1203 07:37:40.084818 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:37:54 crc kubenswrapper[4947]: I1203 07:37:54.083423 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:37:54 crc kubenswrapper[4947]: E1203 07:37:54.084090 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:38:08 crc kubenswrapper[4947]: I1203 07:38:08.082302 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:38:08 crc kubenswrapper[4947]: E1203 07:38:08.084103 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:38:20 crc kubenswrapper[4947]: I1203 07:38:20.083584 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:38:20 crc kubenswrapper[4947]: E1203 07:38:20.085167 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:38:33 crc kubenswrapper[4947]: I1203 07:38:33.084163 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:38:33 crc kubenswrapper[4947]: E1203 07:38:33.085672 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:38:48 crc kubenswrapper[4947]: I1203 07:38:48.082804 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:38:48 crc kubenswrapper[4947]: E1203 07:38:48.083690 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:38:59 crc kubenswrapper[4947]: I1203 07:38:59.103920 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:38:59 crc kubenswrapper[4947]: E1203 07:38:59.104949 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:39:12 crc kubenswrapper[4947]: I1203 07:39:12.083260 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:39:12 crc kubenswrapper[4947]: E1203 07:39:12.083946 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:39:26 crc kubenswrapper[4947]: I1203 07:39:26.083336 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:39:26 crc kubenswrapper[4947]: E1203 07:39:26.084180 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:39:37 crc kubenswrapper[4947]: I1203 07:39:37.083165 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:39:37 crc kubenswrapper[4947]: E1203 07:39:37.083791 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:39:50 crc kubenswrapper[4947]: I1203 07:39:50.083593 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:39:50 crc kubenswrapper[4947]: E1203 07:39:50.084318 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:40:01 crc kubenswrapper[4947]: I1203 07:40:01.083766 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:40:01 crc kubenswrapper[4947]: E1203 07:40:01.084657 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:40:12 crc kubenswrapper[4947]: I1203 07:40:12.083454 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:40:12 crc kubenswrapper[4947]: E1203 07:40:12.084153 4947 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:40:27 crc kubenswrapper[4947]: I1203 07:40:27.083151 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:40:27 crc kubenswrapper[4947]: E1203 07:40:27.084033 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.095157 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9vsk4"] Dec 03 07:40:28 crc kubenswrapper[4947]: E1203 07:40:28.095539 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerName="registry-server" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.095556 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerName="registry-server" Dec 03 07:40:28 crc kubenswrapper[4947]: E1203 07:40:28.095575 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerName="extract-content" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.095582 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f043c98e-51f2-49cf-a929-04c0683cc0fc" 
containerName="extract-content" Dec 03 07:40:28 crc kubenswrapper[4947]: E1203 07:40:28.095604 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerName="extract-utilities" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.095612 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerName="extract-utilities" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.095794 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f043c98e-51f2-49cf-a929-04c0683cc0fc" containerName="registry-server" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.096922 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.112346 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vsk4"] Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.249039 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-catalog-content\") pod \"redhat-operators-9vsk4\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.249131 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-utilities\") pod \"redhat-operators-9vsk4\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.249313 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjvvc\" 
(UniqueName: \"kubernetes.io/projected/3cac6dd8-9e54-4984-b766-bf4d37f80728-kube-api-access-mjvvc\") pod \"redhat-operators-9vsk4\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.350446 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjvvc\" (UniqueName: \"kubernetes.io/projected/3cac6dd8-9e54-4984-b766-bf4d37f80728-kube-api-access-mjvvc\") pod \"redhat-operators-9vsk4\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.350557 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-catalog-content\") pod \"redhat-operators-9vsk4\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.350614 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-utilities\") pod \"redhat-operators-9vsk4\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.351097 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-catalog-content\") pod \"redhat-operators-9vsk4\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.351116 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-utilities\") pod \"redhat-operators-9vsk4\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.369450 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjvvc\" (UniqueName: \"kubernetes.io/projected/3cac6dd8-9e54-4984-b766-bf4d37f80728-kube-api-access-mjvvc\") pod \"redhat-operators-9vsk4\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.432907 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:28 crc kubenswrapper[4947]: I1203 07:40:28.718053 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vsk4"] Dec 03 07:40:29 crc kubenswrapper[4947]: I1203 07:40:29.079095 4947 generic.go:334] "Generic (PLEG): container finished" podID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerID="7ad9351d723ad9e05226262b3fd06b2e3c1fea71fbe342dedb795b9552690968" exitCode=0 Dec 03 07:40:29 crc kubenswrapper[4947]: I1203 07:40:29.079178 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vsk4" event={"ID":"3cac6dd8-9e54-4984-b766-bf4d37f80728","Type":"ContainerDied","Data":"7ad9351d723ad9e05226262b3fd06b2e3c1fea71fbe342dedb795b9552690968"} Dec 03 07:40:29 crc kubenswrapper[4947]: I1203 07:40:29.079404 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vsk4" event={"ID":"3cac6dd8-9e54-4984-b766-bf4d37f80728","Type":"ContainerStarted","Data":"dbbd7797152807fcfbf64733439ab5934c3a93f3630bdb226cdb4b69b60323b6"} Dec 03 07:40:29 crc kubenswrapper[4947]: I1203 07:40:29.081501 4947 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 03 07:40:31 crc kubenswrapper[4947]: I1203 07:40:31.101386 4947 generic.go:334] "Generic (PLEG): container finished" podID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerID="7613135cca68b3f5ee886b851653c0bef29af5254c9d8bfbf9b201b91f26c976" exitCode=0 Dec 03 07:40:31 crc kubenswrapper[4947]: I1203 07:40:31.101818 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vsk4" event={"ID":"3cac6dd8-9e54-4984-b766-bf4d37f80728","Type":"ContainerDied","Data":"7613135cca68b3f5ee886b851653c0bef29af5254c9d8bfbf9b201b91f26c976"} Dec 03 07:40:32 crc kubenswrapper[4947]: I1203 07:40:32.112201 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vsk4" event={"ID":"3cac6dd8-9e54-4984-b766-bf4d37f80728","Type":"ContainerStarted","Data":"c6eecb0596003bb2c5afe3bb7c147a59952edec9b253b7304630d1d667588eda"} Dec 03 07:40:32 crc kubenswrapper[4947]: I1203 07:40:32.138853 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9vsk4" podStartSLOduration=1.7453966570000001 podStartE2EDuration="4.138828148s" podCreationTimestamp="2025-12-03 07:40:28 +0000 UTC" firstStartedPulling="2025-12-03 07:40:29.081281222 +0000 UTC m=+3090.342235648" lastFinishedPulling="2025-12-03 07:40:31.474712713 +0000 UTC m=+3092.735667139" observedRunningTime="2025-12-03 07:40:32.1322246 +0000 UTC m=+3093.393179126" watchObservedRunningTime="2025-12-03 07:40:32.138828148 +0000 UTC m=+3093.399782574" Dec 03 07:40:38 crc kubenswrapper[4947]: I1203 07:40:38.433858 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:38 crc kubenswrapper[4947]: I1203 07:40:38.434463 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:38 crc kubenswrapper[4947]: I1203 
07:40:38.500334 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:39 crc kubenswrapper[4947]: I1203 07:40:39.224140 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:39 crc kubenswrapper[4947]: I1203 07:40:39.292096 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vsk4"] Dec 03 07:40:40 crc kubenswrapper[4947]: I1203 07:40:40.083694 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:40:40 crc kubenswrapper[4947]: E1203 07:40:40.084056 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:40:41 crc kubenswrapper[4947]: I1203 07:40:41.184983 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9vsk4" podUID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerName="registry-server" containerID="cri-o://c6eecb0596003bb2c5afe3bb7c147a59952edec9b253b7304630d1d667588eda" gracePeriod=2 Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.221358 4947 generic.go:334] "Generic (PLEG): container finished" podID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerID="c6eecb0596003bb2c5afe3bb7c147a59952edec9b253b7304630d1d667588eda" exitCode=0 Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.221956 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vsk4" 
event={"ID":"3cac6dd8-9e54-4984-b766-bf4d37f80728","Type":"ContainerDied","Data":"c6eecb0596003bb2c5afe3bb7c147a59952edec9b253b7304630d1d667588eda"} Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.441912 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.587093 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjvvc\" (UniqueName: \"kubernetes.io/projected/3cac6dd8-9e54-4984-b766-bf4d37f80728-kube-api-access-mjvvc\") pod \"3cac6dd8-9e54-4984-b766-bf4d37f80728\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.587174 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-catalog-content\") pod \"3cac6dd8-9e54-4984-b766-bf4d37f80728\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.587230 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-utilities\") pod \"3cac6dd8-9e54-4984-b766-bf4d37f80728\" (UID: \"3cac6dd8-9e54-4984-b766-bf4d37f80728\") " Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.588161 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-utilities" (OuterVolumeSpecName: "utilities") pod "3cac6dd8-9e54-4984-b766-bf4d37f80728" (UID: "3cac6dd8-9e54-4984-b766-bf4d37f80728"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.593820 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cac6dd8-9e54-4984-b766-bf4d37f80728-kube-api-access-mjvvc" (OuterVolumeSpecName: "kube-api-access-mjvvc") pod "3cac6dd8-9e54-4984-b766-bf4d37f80728" (UID: "3cac6dd8-9e54-4984-b766-bf4d37f80728"). InnerVolumeSpecName "kube-api-access-mjvvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.688939 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjvvc\" (UniqueName: \"kubernetes.io/projected/3cac6dd8-9e54-4984-b766-bf4d37f80728-kube-api-access-mjvvc\") on node \"crc\" DevicePath \"\"" Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.688976 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.716282 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3cac6dd8-9e54-4984-b766-bf4d37f80728" (UID: "3cac6dd8-9e54-4984-b766-bf4d37f80728"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:40:43 crc kubenswrapper[4947]: I1203 07:40:43.791659 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3cac6dd8-9e54-4984-b766-bf4d37f80728-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:40:44 crc kubenswrapper[4947]: I1203 07:40:44.232140 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vsk4" event={"ID":"3cac6dd8-9e54-4984-b766-bf4d37f80728","Type":"ContainerDied","Data":"dbbd7797152807fcfbf64733439ab5934c3a93f3630bdb226cdb4b69b60323b6"} Dec 03 07:40:44 crc kubenswrapper[4947]: I1203 07:40:44.232199 4947 scope.go:117] "RemoveContainer" containerID="c6eecb0596003bb2c5afe3bb7c147a59952edec9b253b7304630d1d667588eda" Dec 03 07:40:44 crc kubenswrapper[4947]: I1203 07:40:44.232813 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vsk4" Dec 03 07:40:44 crc kubenswrapper[4947]: I1203 07:40:44.266734 4947 scope.go:117] "RemoveContainer" containerID="7613135cca68b3f5ee886b851653c0bef29af5254c9d8bfbf9b201b91f26c976" Dec 03 07:40:44 crc kubenswrapper[4947]: I1203 07:40:44.277796 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vsk4"] Dec 03 07:40:44 crc kubenswrapper[4947]: I1203 07:40:44.285924 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9vsk4"] Dec 03 07:40:44 crc kubenswrapper[4947]: I1203 07:40:44.296805 4947 scope.go:117] "RemoveContainer" containerID="7ad9351d723ad9e05226262b3fd06b2e3c1fea71fbe342dedb795b9552690968" Dec 03 07:40:45 crc kubenswrapper[4947]: I1203 07:40:45.098658 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cac6dd8-9e54-4984-b766-bf4d37f80728" path="/var/lib/kubelet/pods/3cac6dd8-9e54-4984-b766-bf4d37f80728/volumes" Dec 03 07:40:54 crc 
kubenswrapper[4947]: I1203 07:40:54.082931 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"
Dec 03 07:40:54 crc kubenswrapper[4947]: E1203 07:40:54.083611 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 07:41:07 crc kubenswrapper[4947]: I1203 07:41:07.083096 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"
Dec 03 07:41:07 crc kubenswrapper[4947]: E1203 07:41:07.084026 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 07:41:18 crc kubenswrapper[4947]: I1203 07:41:18.082944 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"
Dec 03 07:41:18 crc kubenswrapper[4947]: E1203 07:41:18.084161 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 07:41:31 crc kubenswrapper[4947]: I1203 07:41:31.083901 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"
Dec 03 07:41:31 crc kubenswrapper[4947]: E1203 07:41:31.084656 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 07:41:43 crc kubenswrapper[4947]: I1203 07:41:43.084245 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"
Dec 03 07:41:43 crc kubenswrapper[4947]: E1203 07:41:43.085387 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 07:41:56 crc kubenswrapper[4947]: I1203 07:41:56.082834 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"
Dec 03 07:41:56 crc kubenswrapper[4947]: E1203 07:41:56.083770 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 07:42:09 crc kubenswrapper[4947]: I1203 07:42:09.089172 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900"
Dec 03 07:42:09 crc kubenswrapper[4947]: I1203 07:42:09.975429 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"01574b7dd8e441a5ba3f2e2f165f8d74a2e6b97643cfc91c64ee5044abf50a98"}
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.226955 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h6rqx"]
Dec 03 07:43:04 crc kubenswrapper[4947]: E1203 07:43:04.228011 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerName="extract-utilities"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.228038 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerName="extract-utilities"
Dec 03 07:43:04 crc kubenswrapper[4947]: E1203 07:43:04.228063 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerName="extract-content"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.228077 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerName="extract-content"
Dec 03 07:43:04 crc kubenswrapper[4947]: E1203 07:43:04.228102 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerName="registry-server"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.228117 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerName="registry-server"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.228383 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cac6dd8-9e54-4984-b766-bf4d37f80728" containerName="registry-server"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.230239 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.244429 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6rqx"]
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.387202 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-catalog-content\") pod \"redhat-marketplace-h6rqx\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") " pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.387306 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlpld\" (UniqueName: \"kubernetes.io/projected/e54b4926-842c-419f-b891-f02f7851f1e7-kube-api-access-qlpld\") pod \"redhat-marketplace-h6rqx\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") " pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.387372 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-utilities\") pod \"redhat-marketplace-h6rqx\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") " pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.489158 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlpld\" (UniqueName: \"kubernetes.io/projected/e54b4926-842c-419f-b891-f02f7851f1e7-kube-api-access-qlpld\") pod \"redhat-marketplace-h6rqx\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") " pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.489576 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-utilities\") pod \"redhat-marketplace-h6rqx\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") " pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.489711 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-catalog-content\") pod \"redhat-marketplace-h6rqx\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") " pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.490112 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-utilities\") pod \"redhat-marketplace-h6rqx\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") " pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.490143 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-catalog-content\") pod \"redhat-marketplace-h6rqx\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") " pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.507513 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlpld\" (UniqueName: \"kubernetes.io/projected/e54b4926-842c-419f-b891-f02f7851f1e7-kube-api-access-qlpld\") pod \"redhat-marketplace-h6rqx\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") " pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:04 crc kubenswrapper[4947]: I1203 07:43:04.579298 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:05 crc kubenswrapper[4947]: I1203 07:43:05.062761 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6rqx"]
Dec 03 07:43:05 crc kubenswrapper[4947]: I1203 07:43:05.476433 4947 generic.go:334] "Generic (PLEG): container finished" podID="e54b4926-842c-419f-b891-f02f7851f1e7" containerID="cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70" exitCode=0
Dec 03 07:43:05 crc kubenswrapper[4947]: I1203 07:43:05.476636 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6rqx" event={"ID":"e54b4926-842c-419f-b891-f02f7851f1e7","Type":"ContainerDied","Data":"cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70"}
Dec 03 07:43:05 crc kubenswrapper[4947]: I1203 07:43:05.476725 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6rqx" event={"ID":"e54b4926-842c-419f-b891-f02f7851f1e7","Type":"ContainerStarted","Data":"b97748ada5e18007d2e7aa3ef6738089063f40caa98603202edc3b46bbaddf9b"}
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.486204 4947 generic.go:334] "Generic (PLEG): container finished" podID="e54b4926-842c-419f-b891-f02f7851f1e7" containerID="4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a" exitCode=0
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.486256 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6rqx" event={"ID":"e54b4926-842c-419f-b891-f02f7851f1e7","Type":"ContainerDied","Data":"4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a"}
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.607185 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m7hgc"]
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.608704 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.627501 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8028669-dbfd-4d2b-bdfb-440f156211b9-catalog-content\") pod \"community-operators-m7hgc\" (UID: \"b8028669-dbfd-4d2b-bdfb-440f156211b9\") " pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.627570 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kk42\" (UniqueName: \"kubernetes.io/projected/b8028669-dbfd-4d2b-bdfb-440f156211b9-kube-api-access-4kk42\") pod \"community-operators-m7hgc\" (UID: \"b8028669-dbfd-4d2b-bdfb-440f156211b9\") " pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.627592 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8028669-dbfd-4d2b-bdfb-440f156211b9-utilities\") pod \"community-operators-m7hgc\" (UID: \"b8028669-dbfd-4d2b-bdfb-440f156211b9\") " pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.639364 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7hgc"]
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.728788 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8028669-dbfd-4d2b-bdfb-440f156211b9-catalog-content\") pod \"community-operators-m7hgc\" (UID: \"b8028669-dbfd-4d2b-bdfb-440f156211b9\") " pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.728845 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kk42\" (UniqueName: \"kubernetes.io/projected/b8028669-dbfd-4d2b-bdfb-440f156211b9-kube-api-access-4kk42\") pod \"community-operators-m7hgc\" (UID: \"b8028669-dbfd-4d2b-bdfb-440f156211b9\") " pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.728867 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8028669-dbfd-4d2b-bdfb-440f156211b9-utilities\") pod \"community-operators-m7hgc\" (UID: \"b8028669-dbfd-4d2b-bdfb-440f156211b9\") " pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.729444 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8028669-dbfd-4d2b-bdfb-440f156211b9-catalog-content\") pod \"community-operators-m7hgc\" (UID: \"b8028669-dbfd-4d2b-bdfb-440f156211b9\") " pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.729463 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8028669-dbfd-4d2b-bdfb-440f156211b9-utilities\") pod \"community-operators-m7hgc\" (UID: \"b8028669-dbfd-4d2b-bdfb-440f156211b9\") " pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.754425 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kk42\" (UniqueName: \"kubernetes.io/projected/b8028669-dbfd-4d2b-bdfb-440f156211b9-kube-api-access-4kk42\") pod \"community-operators-m7hgc\" (UID: \"b8028669-dbfd-4d2b-bdfb-440f156211b9\") " pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:06 crc kubenswrapper[4947]: I1203 07:43:06.942955 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:07 crc kubenswrapper[4947]: I1203 07:43:07.456928 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7hgc"]
Dec 03 07:43:07 crc kubenswrapper[4947]: I1203 07:43:07.494818 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7hgc" event={"ID":"b8028669-dbfd-4d2b-bdfb-440f156211b9","Type":"ContainerStarted","Data":"63eace7793e3f913accc6815d71fb65dc71ce38875b5be7c3cf83c213f132e1b"}
Dec 03 07:43:07 crc kubenswrapper[4947]: I1203 07:43:07.496678 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6rqx" event={"ID":"e54b4926-842c-419f-b891-f02f7851f1e7","Type":"ContainerStarted","Data":"a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab"}
Dec 03 07:43:07 crc kubenswrapper[4947]: I1203 07:43:07.514139 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h6rqx" podStartSLOduration=2.142842564 podStartE2EDuration="3.514118439s" podCreationTimestamp="2025-12-03 07:43:04 +0000 UTC" firstStartedPulling="2025-12-03 07:43:05.478958765 +0000 UTC m=+3246.739913191" lastFinishedPulling="2025-12-03 07:43:06.85023464 +0000 UTC m=+3248.111189066" observedRunningTime="2025-12-03 07:43:07.512591638 +0000 UTC m=+3248.773546064" watchObservedRunningTime="2025-12-03 07:43:07.514118439 +0000 UTC m=+3248.775072875"
Dec 03 07:43:08 crc kubenswrapper[4947]: I1203 07:43:08.518114 4947 generic.go:334] "Generic (PLEG): container finished" podID="b8028669-dbfd-4d2b-bdfb-440f156211b9" containerID="a7758c0035ebd904b627ced263d863a73f358cd12e196c2a0fb4ca55230376a4" exitCode=0
Dec 03 07:43:08 crc kubenswrapper[4947]: I1203 07:43:08.518206 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7hgc" event={"ID":"b8028669-dbfd-4d2b-bdfb-440f156211b9","Type":"ContainerDied","Data":"a7758c0035ebd904b627ced263d863a73f358cd12e196c2a0fb4ca55230376a4"}
Dec 03 07:43:12 crc kubenswrapper[4947]: I1203 07:43:12.550655 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7hgc" event={"ID":"b8028669-dbfd-4d2b-bdfb-440f156211b9","Type":"ContainerStarted","Data":"335507b297ac276bfd08bb834212fc9d5a876054ca3d5d372a87946aa139318b"}
Dec 03 07:43:13 crc kubenswrapper[4947]: I1203 07:43:13.566860 4947 generic.go:334] "Generic (PLEG): container finished" podID="b8028669-dbfd-4d2b-bdfb-440f156211b9" containerID="335507b297ac276bfd08bb834212fc9d5a876054ca3d5d372a87946aa139318b" exitCode=0
Dec 03 07:43:13 crc kubenswrapper[4947]: I1203 07:43:13.566943 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7hgc" event={"ID":"b8028669-dbfd-4d2b-bdfb-440f156211b9","Type":"ContainerDied","Data":"335507b297ac276bfd08bb834212fc9d5a876054ca3d5d372a87946aa139318b"}
Dec 03 07:43:14 crc kubenswrapper[4947]: I1203 07:43:14.579524 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:14 crc kubenswrapper[4947]: I1203 07:43:14.579931 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:14 crc kubenswrapper[4947]: I1203 07:43:14.582550 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7hgc" event={"ID":"b8028669-dbfd-4d2b-bdfb-440f156211b9","Type":"ContainerStarted","Data":"f3bc9902b880a7b13e3f90c8af7ace92e6d7bdebee42f2f935929bd0f26f132b"}
Dec 03 07:43:14 crc kubenswrapper[4947]: I1203 07:43:14.622479 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m7hgc" podStartSLOduration=2.995788712 podStartE2EDuration="8.622452545s" podCreationTimestamp="2025-12-03 07:43:06 +0000 UTC" firstStartedPulling="2025-12-03 07:43:08.52018005 +0000 UTC m=+3249.781134476" lastFinishedPulling="2025-12-03 07:43:14.146843853 +0000 UTC m=+3255.407798309" observedRunningTime="2025-12-03 07:43:14.611710885 +0000 UTC m=+3255.872665331" watchObservedRunningTime="2025-12-03 07:43:14.622452545 +0000 UTC m=+3255.883406981"
Dec 03 07:43:14 crc kubenswrapper[4947]: I1203 07:43:14.642196 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:15 crc kubenswrapper[4947]: I1203 07:43:15.668250 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:16 crc kubenswrapper[4947]: I1203 07:43:16.207486 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6rqx"]
Dec 03 07:43:16 crc kubenswrapper[4947]: I1203 07:43:16.943995 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:16 crc kubenswrapper[4947]: I1203 07:43:16.944070 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:17 crc kubenswrapper[4947]: I1203 07:43:17.003077 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:17 crc kubenswrapper[4947]: I1203 07:43:17.606914 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h6rqx" podUID="e54b4926-842c-419f-b891-f02f7851f1e7" containerName="registry-server" containerID="cri-o://a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab" gracePeriod=2
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.069633 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.209046 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-catalog-content\") pod \"e54b4926-842c-419f-b891-f02f7851f1e7\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") "
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.209269 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-utilities\") pod \"e54b4926-842c-419f-b891-f02f7851f1e7\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") "
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.209330 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlpld\" (UniqueName: \"kubernetes.io/projected/e54b4926-842c-419f-b891-f02f7851f1e7-kube-api-access-qlpld\") pod \"e54b4926-842c-419f-b891-f02f7851f1e7\" (UID: \"e54b4926-842c-419f-b891-f02f7851f1e7\") "
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.210779 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-utilities" (OuterVolumeSpecName: "utilities") pod "e54b4926-842c-419f-b891-f02f7851f1e7" (UID: "e54b4926-842c-419f-b891-f02f7851f1e7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.214598 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e54b4926-842c-419f-b891-f02f7851f1e7-kube-api-access-qlpld" (OuterVolumeSpecName: "kube-api-access-qlpld") pod "e54b4926-842c-419f-b891-f02f7851f1e7" (UID: "e54b4926-842c-419f-b891-f02f7851f1e7"). InnerVolumeSpecName "kube-api-access-qlpld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.227472 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e54b4926-842c-419f-b891-f02f7851f1e7" (UID: "e54b4926-842c-419f-b891-f02f7851f1e7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.311215 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.311241 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e54b4926-842c-419f-b891-f02f7851f1e7-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.311251 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlpld\" (UniqueName: \"kubernetes.io/projected/e54b4926-842c-419f-b891-f02f7851f1e7-kube-api-access-qlpld\") on node \"crc\" DevicePath \"\""
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.615617 4947 generic.go:334] "Generic (PLEG): container finished" podID="e54b4926-842c-419f-b891-f02f7851f1e7" containerID="a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab" exitCode=0
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.615650 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6rqx"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.615667 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6rqx" event={"ID":"e54b4926-842c-419f-b891-f02f7851f1e7","Type":"ContainerDied","Data":"a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab"}
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.616304 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6rqx" event={"ID":"e54b4926-842c-419f-b891-f02f7851f1e7","Type":"ContainerDied","Data":"b97748ada5e18007d2e7aa3ef6738089063f40caa98603202edc3b46bbaddf9b"}
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.616342 4947 scope.go:117] "RemoveContainer" containerID="a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.656518 4947 scope.go:117] "RemoveContainer" containerID="4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.656836 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6rqx"]
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.662266 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6rqx"]
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.696735 4947 scope.go:117] "RemoveContainer" containerID="cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.713992 4947 scope.go:117] "RemoveContainer" containerID="a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab"
Dec 03 07:43:18 crc kubenswrapper[4947]: E1203 07:43:18.714435 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab\": container with ID starting with a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab not found: ID does not exist" containerID="a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.714527 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab"} err="failed to get container status \"a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab\": rpc error: code = NotFound desc = could not find container \"a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab\": container with ID starting with a15a9c2f5eb90c5590cea81fcd85165a441b5782b63963a94a325d2f9ef299ab not found: ID does not exist"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.714567 4947 scope.go:117] "RemoveContainer" containerID="4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a"
Dec 03 07:43:18 crc kubenswrapper[4947]: E1203 07:43:18.715010 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a\": container with ID starting with 4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a not found: ID does not exist" containerID="4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.715211 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a"} err="failed to get container status \"4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a\": rpc error: code = NotFound desc = could not find container \"4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a\": container with ID starting with 4387c9bbf9707085bcd7ff614d7e661e067714e51bb62e58416e774f98d8c66a not found: ID does not exist"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.715383 4947 scope.go:117] "RemoveContainer" containerID="cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70"
Dec 03 07:43:18 crc kubenswrapper[4947]: E1203 07:43:18.715967 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70\": container with ID starting with cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70 not found: ID does not exist" containerID="cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70"
Dec 03 07:43:18 crc kubenswrapper[4947]: I1203 07:43:18.716015 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70"} err="failed to get container status \"cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70\": rpc error: code = NotFound desc = could not find container \"cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70\": container with ID starting with cca821d257a3c598e5c568805b907707e45baf6bf9601e6b77e445da6b32bf70 not found: ID does not exist"
Dec 03 07:43:18 crc kubenswrapper[4947]: E1203 07:43:18.780978 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode54b4926_842c_419f_b891_f02f7851f1e7.slice/crio-b97748ada5e18007d2e7aa3ef6738089063f40caa98603202edc3b46bbaddf9b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode54b4926_842c_419f_b891_f02f7851f1e7.slice\": RecentStats: unable to find data in memory cache]"
Dec 03 07:43:19 crc kubenswrapper[4947]: I1203 07:43:19.097547 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e54b4926-842c-419f-b891-f02f7851f1e7" path="/var/lib/kubelet/pods/e54b4926-842c-419f-b891-f02f7851f1e7/volumes"
Dec 03 07:43:27 crc kubenswrapper[4947]: I1203 07:43:27.014434 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m7hgc"
Dec 03 07:43:27 crc kubenswrapper[4947]: I1203 07:43:27.109520 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7hgc"]
Dec 03 07:43:27 crc kubenswrapper[4947]: I1203 07:43:27.167953 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sp9x2"]
Dec 03 07:43:27 crc kubenswrapper[4947]: I1203 07:43:27.168215 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sp9x2" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="registry-server" containerID="cri-o://4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0" gracePeriod=2
Dec 03 07:43:27 crc kubenswrapper[4947]: I1203 07:43:27.218019 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-sp9x2" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="registry-server" probeResult="failure" output=""
Dec 03 07:43:27 crc kubenswrapper[4947]: I1203 07:43:27.221562 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-sp9x2" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="registry-server" probeResult="failure" output=""
Dec 03 07:43:29 crc kubenswrapper[4947]: E1203 07:43:29.010420 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb662847_42e5_472b_b370_1d537a258211.slice/crio-conmon-4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.447921 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sp9x2"
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.594913 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-utilities\") pod \"db662847-42e5-472b-b370-1d537a258211\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") "
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.594993 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnf2w\" (UniqueName: \"kubernetes.io/projected/db662847-42e5-472b-b370-1d537a258211-kube-api-access-rnf2w\") pod \"db662847-42e5-472b-b370-1d537a258211\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") "
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.595052 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-catalog-content\") pod \"db662847-42e5-472b-b370-1d537a258211\" (UID: \"db662847-42e5-472b-b370-1d537a258211\") "
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.595679 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-utilities" (OuterVolumeSpecName: "utilities") pod "db662847-42e5-472b-b370-1d537a258211" (UID: "db662847-42e5-472b-b370-1d537a258211"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.603439 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db662847-42e5-472b-b370-1d537a258211-kube-api-access-rnf2w" (OuterVolumeSpecName: "kube-api-access-rnf2w") pod "db662847-42e5-472b-b370-1d537a258211" (UID: "db662847-42e5-472b-b370-1d537a258211"). InnerVolumeSpecName "kube-api-access-rnf2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.659433 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db662847-42e5-472b-b370-1d537a258211" (UID: "db662847-42e5-472b-b370-1d537a258211"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.696227 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.696267 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnf2w\" (UniqueName: \"kubernetes.io/projected/db662847-42e5-472b-b370-1d537a258211-kube-api-access-rnf2w\") on node \"crc\" DevicePath \"\""
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.696281 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db662847-42e5-472b-b370-1d537a258211-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.709244 4947 generic.go:334] "Generic (PLEG): container finished" podID="db662847-42e5-472b-b370-1d537a258211" containerID="4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0" exitCode=0
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.709288 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp9x2" event={"ID":"db662847-42e5-472b-b370-1d537a258211","Type":"ContainerDied","Data":"4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0"}
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.709320 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sp9x2"
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.709347 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sp9x2" event={"ID":"db662847-42e5-472b-b370-1d537a258211","Type":"ContainerDied","Data":"1a24fc132d35b8fdfe6b94d699319287bbd545f0b915a4a819faa63b68bf1d58"}
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.709366 4947 scope.go:117] "RemoveContainer" containerID="4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0"
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.728007 4947 scope.go:117] "RemoveContainer" containerID="7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446"
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.742607 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sp9x2"]
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.749198 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sp9x2"]
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.759402 4947 scope.go:117] "RemoveContainer" containerID="3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8"
Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.775335 4947 scope.go:117] "RemoveContainer" containerID="4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0"
Dec
07:43:29 crc kubenswrapper[4947]: E1203 07:43:29.775730 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0\": container with ID starting with 4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0 not found: ID does not exist" containerID="4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0" Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.775760 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0"} err="failed to get container status \"4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0\": rpc error: code = NotFound desc = could not find container \"4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0\": container with ID starting with 4dfd9a96d0d07531aeaa9b0238f912b14fda8de323126126867228eb7c2a76d0 not found: ID does not exist" Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.775779 4947 scope.go:117] "RemoveContainer" containerID="7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446" Dec 03 07:43:29 crc kubenswrapper[4947]: E1203 07:43:29.776071 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446\": container with ID starting with 7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446 not found: ID does not exist" containerID="7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446" Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.776097 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446"} err="failed to get container status 
\"7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446\": rpc error: code = NotFound desc = could not find container \"7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446\": container with ID starting with 7ef9d23a84e9e187a04e495f773226443522d320329721b2d43f4061a8000446 not found: ID does not exist" Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.776109 4947 scope.go:117] "RemoveContainer" containerID="3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8" Dec 03 07:43:29 crc kubenswrapper[4947]: E1203 07:43:29.776330 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8\": container with ID starting with 3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8 not found: ID does not exist" containerID="3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8" Dec 03 07:43:29 crc kubenswrapper[4947]: I1203 07:43:29.776348 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8"} err="failed to get container status \"3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8\": rpc error: code = NotFound desc = could not find container \"3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8\": container with ID starting with 3bb4842b5bb7ad2c27bb152003d20ae81b0a2ab69256edc81ddf59003c0cdea8 not found: ID does not exist" Dec 03 07:43:31 crc kubenswrapper[4947]: I1203 07:43:31.099406 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db662847-42e5-472b-b370-1d537a258211" path="/var/lib/kubelet/pods/db662847-42e5-472b-b370-1d537a258211/volumes" Dec 03 07:44:30 crc kubenswrapper[4947]: I1203 07:44:30.086922 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:44:30 crc kubenswrapper[4947]: I1203 07:44:30.087699 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.086620 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.087246 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.162069 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq"] Dec 03 07:45:00 crc kubenswrapper[4947]: E1203 07:45:00.162444 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54b4926-842c-419f-b891-f02f7851f1e7" containerName="extract-utilities" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.162465 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54b4926-842c-419f-b891-f02f7851f1e7" containerName="extract-utilities" Dec 03 07:45:00 crc kubenswrapper[4947]: E1203 07:45:00.162484 4947 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.162524 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4947]: E1203 07:45:00.162554 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54b4926-842c-419f-b891-f02f7851f1e7" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.162567 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54b4926-842c-419f-b891-f02f7851f1e7" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4947]: E1203 07:45:00.162603 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="extract-content" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.162640 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="extract-content" Dec 03 07:45:00 crc kubenswrapper[4947]: E1203 07:45:00.162661 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="extract-utilities" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.162674 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="extract-utilities" Dec 03 07:45:00 crc kubenswrapper[4947]: E1203 07:45:00.162690 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e54b4926-842c-419f-b891-f02f7851f1e7" containerName="extract-content" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.162698 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54b4926-842c-419f-b891-f02f7851f1e7" containerName="extract-content" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.162884 4947 
memory_manager.go:354] "RemoveStaleState removing state" podUID="db662847-42e5-472b-b370-1d537a258211" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.162903 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e54b4926-842c-419f-b891-f02f7851f1e7" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.163457 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.165955 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.166277 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.188072 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq"] Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.323638 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc67b8b8-984d-46d9-8e27-ea4a53b96228-config-volume\") pod \"collect-profiles-29412465-g6hxq\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.323940 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q45kr\" (UniqueName: \"kubernetes.io/projected/fc67b8b8-984d-46d9-8e27-ea4a53b96228-kube-api-access-q45kr\") pod \"collect-profiles-29412465-g6hxq\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.324057 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc67b8b8-984d-46d9-8e27-ea4a53b96228-secret-volume\") pod \"collect-profiles-29412465-g6hxq\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.425114 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q45kr\" (UniqueName: \"kubernetes.io/projected/fc67b8b8-984d-46d9-8e27-ea4a53b96228-kube-api-access-q45kr\") pod \"collect-profiles-29412465-g6hxq\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.425184 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc67b8b8-984d-46d9-8e27-ea4a53b96228-secret-volume\") pod \"collect-profiles-29412465-g6hxq\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.425377 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc67b8b8-984d-46d9-8e27-ea4a53b96228-config-volume\") pod \"collect-profiles-29412465-g6hxq\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.427075 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/fc67b8b8-984d-46d9-8e27-ea4a53b96228-config-volume\") pod \"collect-profiles-29412465-g6hxq\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.446117 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc67b8b8-984d-46d9-8e27-ea4a53b96228-secret-volume\") pod \"collect-profiles-29412465-g6hxq\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.467480 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q45kr\" (UniqueName: \"kubernetes.io/projected/fc67b8b8-984d-46d9-8e27-ea4a53b96228-kube-api-access-q45kr\") pod \"collect-profiles-29412465-g6hxq\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.485881 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:00 crc kubenswrapper[4947]: I1203 07:45:00.984904 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq"] Dec 03 07:45:01 crc kubenswrapper[4947]: I1203 07:45:01.558309 4947 generic.go:334] "Generic (PLEG): container finished" podID="fc67b8b8-984d-46d9-8e27-ea4a53b96228" containerID="9e1737c5ee372ae8c38b34e925326531e5a717360215ba424baf77ae2a889a72" exitCode=0 Dec 03 07:45:01 crc kubenswrapper[4947]: I1203 07:45:01.558404 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" event={"ID":"fc67b8b8-984d-46d9-8e27-ea4a53b96228","Type":"ContainerDied","Data":"9e1737c5ee372ae8c38b34e925326531e5a717360215ba424baf77ae2a889a72"} Dec 03 07:45:01 crc kubenswrapper[4947]: I1203 07:45:01.558545 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" event={"ID":"fc67b8b8-984d-46d9-8e27-ea4a53b96228","Type":"ContainerStarted","Data":"66414f2fa88d2e6bf48d7a1178c223b1183cdbb8f0eb4a8b29983d3af2715212"} Dec 03 07:45:02 crc kubenswrapper[4947]: I1203 07:45:02.888783 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.064280 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc67b8b8-984d-46d9-8e27-ea4a53b96228-config-volume\") pod \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.064370 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc67b8b8-984d-46d9-8e27-ea4a53b96228-secret-volume\") pod \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.064523 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q45kr\" (UniqueName: \"kubernetes.io/projected/fc67b8b8-984d-46d9-8e27-ea4a53b96228-kube-api-access-q45kr\") pod \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\" (UID: \"fc67b8b8-984d-46d9-8e27-ea4a53b96228\") " Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.065737 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc67b8b8-984d-46d9-8e27-ea4a53b96228-config-volume" (OuterVolumeSpecName: "config-volume") pod "fc67b8b8-984d-46d9-8e27-ea4a53b96228" (UID: "fc67b8b8-984d-46d9-8e27-ea4a53b96228"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.070389 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc67b8b8-984d-46d9-8e27-ea4a53b96228-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "fc67b8b8-984d-46d9-8e27-ea4a53b96228" (UID: "fc67b8b8-984d-46d9-8e27-ea4a53b96228"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.071288 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc67b8b8-984d-46d9-8e27-ea4a53b96228-kube-api-access-q45kr" (OuterVolumeSpecName: "kube-api-access-q45kr") pod "fc67b8b8-984d-46d9-8e27-ea4a53b96228" (UID: "fc67b8b8-984d-46d9-8e27-ea4a53b96228"). InnerVolumeSpecName "kube-api-access-q45kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.166694 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q45kr\" (UniqueName: \"kubernetes.io/projected/fc67b8b8-984d-46d9-8e27-ea4a53b96228-kube-api-access-q45kr\") on node \"crc\" DevicePath \"\"" Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.166796 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fc67b8b8-984d-46d9-8e27-ea4a53b96228-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.166843 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/fc67b8b8-984d-46d9-8e27-ea4a53b96228-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.576893 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" event={"ID":"fc67b8b8-984d-46d9-8e27-ea4a53b96228","Type":"ContainerDied","Data":"66414f2fa88d2e6bf48d7a1178c223b1183cdbb8f0eb4a8b29983d3af2715212"} Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.577186 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66414f2fa88d2e6bf48d7a1178c223b1183cdbb8f0eb4a8b29983d3af2715212" Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.576975 4947 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq" Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.978525 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44"] Dec 03 07:45:03 crc kubenswrapper[4947]: I1203 07:45:03.985424 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-lxv44"] Dec 03 07:45:05 crc kubenswrapper[4947]: I1203 07:45:05.100656 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c748f8-81ed-4cac-ad68-33b0a1d7218d" path="/var/lib/kubelet/pods/42c748f8-81ed-4cac-ad68-33b0a1d7218d/volumes" Dec 03 07:45:30 crc kubenswrapper[4947]: I1203 07:45:30.086098 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:45:30 crc kubenswrapper[4947]: I1203 07:45:30.087926 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:45:30 crc kubenswrapper[4947]: I1203 07:45:30.088073 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:45:30 crc kubenswrapper[4947]: I1203 07:45:30.088648 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"01574b7dd8e441a5ba3f2e2f165f8d74a2e6b97643cfc91c64ee5044abf50a98"} 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:45:30 crc kubenswrapper[4947]: I1203 07:45:30.088825 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://01574b7dd8e441a5ba3f2e2f165f8d74a2e6b97643cfc91c64ee5044abf50a98" gracePeriod=600 Dec 03 07:45:30 crc kubenswrapper[4947]: I1203 07:45:30.840189 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="01574b7dd8e441a5ba3f2e2f165f8d74a2e6b97643cfc91c64ee5044abf50a98" exitCode=0 Dec 03 07:45:30 crc kubenswrapper[4947]: I1203 07:45:30.840258 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"01574b7dd8e441a5ba3f2e2f165f8d74a2e6b97643cfc91c64ee5044abf50a98"} Dec 03 07:45:30 crc kubenswrapper[4947]: I1203 07:45:30.840699 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198"} Dec 03 07:45:30 crc kubenswrapper[4947]: I1203 07:45:30.840741 4947 scope.go:117] "RemoveContainer" containerID="2c8329da1fee8fe62b817d1688cf7b6ceb2e45d782a3e9933a9b65538d12f900" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.456115 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lhjhx"] Dec 03 07:45:54 crc kubenswrapper[4947]: E1203 07:45:54.457391 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fc67b8b8-984d-46d9-8e27-ea4a53b96228" containerName="collect-profiles" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.457422 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc67b8b8-984d-46d9-8e27-ea4a53b96228" containerName="collect-profiles" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.457766 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc67b8b8-984d-46d9-8e27-ea4a53b96228" containerName="collect-profiles" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.459612 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.483072 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhjhx"] Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.593844 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/607b467a-2cf4-42ab-a319-5d04c114b979-kube-api-access-g6np4\") pod \"certified-operators-lhjhx\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.594228 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-utilities\") pod \"certified-operators-lhjhx\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.594286 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-catalog-content\") pod \"certified-operators-lhjhx\" (UID: 
\"607b467a-2cf4-42ab-a319-5d04c114b979\") " pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.696248 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/607b467a-2cf4-42ab-a319-5d04c114b979-kube-api-access-g6np4\") pod \"certified-operators-lhjhx\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.696408 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-utilities\") pod \"certified-operators-lhjhx\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.696541 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-catalog-content\") pod \"certified-operators-lhjhx\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.697115 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-utilities\") pod \"certified-operators-lhjhx\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.697116 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-catalog-content\") pod \"certified-operators-lhjhx\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") 
" pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.722752 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/607b467a-2cf4-42ab-a319-5d04c114b979-kube-api-access-g6np4\") pod \"certified-operators-lhjhx\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:54 crc kubenswrapper[4947]: I1203 07:45:54.787372 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:45:55 crc kubenswrapper[4947]: I1203 07:45:55.295538 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lhjhx"] Dec 03 07:45:56 crc kubenswrapper[4947]: I1203 07:45:56.108962 4947 generic.go:334] "Generic (PLEG): container finished" podID="607b467a-2cf4-42ab-a319-5d04c114b979" containerID="abdf10e14ecb7da0754240d0ffc12c08072f52b19725ecfd4c61ef7a69b52f8b" exitCode=0 Dec 03 07:45:56 crc kubenswrapper[4947]: I1203 07:45:56.109072 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhjhx" event={"ID":"607b467a-2cf4-42ab-a319-5d04c114b979","Type":"ContainerDied","Data":"abdf10e14ecb7da0754240d0ffc12c08072f52b19725ecfd4c61ef7a69b52f8b"} Dec 03 07:45:56 crc kubenswrapper[4947]: I1203 07:45:56.109296 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhjhx" event={"ID":"607b467a-2cf4-42ab-a319-5d04c114b979","Type":"ContainerStarted","Data":"546bbb245f1ce90ee261161c5ffc4d241e4de3f1f48ecf354518dfb136921053"} Dec 03 07:45:56 crc kubenswrapper[4947]: I1203 07:45:56.111868 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:45:57 crc kubenswrapper[4947]: I1203 07:45:57.117845 4947 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-lhjhx" event={"ID":"607b467a-2cf4-42ab-a319-5d04c114b979","Type":"ContainerStarted","Data":"8a90d878c8e3ae466103d672cded04b6be9ce854c05ba1ac79a66eea33509a29"} Dec 03 07:45:58 crc kubenswrapper[4947]: I1203 07:45:58.134661 4947 generic.go:334] "Generic (PLEG): container finished" podID="607b467a-2cf4-42ab-a319-5d04c114b979" containerID="8a90d878c8e3ae466103d672cded04b6be9ce854c05ba1ac79a66eea33509a29" exitCode=0 Dec 03 07:45:58 crc kubenswrapper[4947]: I1203 07:45:58.134876 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhjhx" event={"ID":"607b467a-2cf4-42ab-a319-5d04c114b979","Type":"ContainerDied","Data":"8a90d878c8e3ae466103d672cded04b6be9ce854c05ba1ac79a66eea33509a29"} Dec 03 07:45:59 crc kubenswrapper[4947]: I1203 07:45:59.145186 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhjhx" event={"ID":"607b467a-2cf4-42ab-a319-5d04c114b979","Type":"ContainerStarted","Data":"12d5398e4adefd506d48012e5c39c8db3ad57c54af1aa2a1de7eed914b7c6095"} Dec 03 07:45:59 crc kubenswrapper[4947]: I1203 07:45:59.178981 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lhjhx" podStartSLOduration=2.700987017 podStartE2EDuration="5.178886905s" podCreationTimestamp="2025-12-03 07:45:54 +0000 UTC" firstStartedPulling="2025-12-03 07:45:56.111322348 +0000 UTC m=+3417.372276824" lastFinishedPulling="2025-12-03 07:45:58.589222246 +0000 UTC m=+3419.850176712" observedRunningTime="2025-12-03 07:45:59.174935019 +0000 UTC m=+3420.435889455" watchObservedRunningTime="2025-12-03 07:45:59.178886905 +0000 UTC m=+3420.439841341" Dec 03 07:46:04 crc kubenswrapper[4947]: I1203 07:46:04.787527 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:46:04 crc kubenswrapper[4947]: I1203 
07:46:04.788076 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:46:04 crc kubenswrapper[4947]: I1203 07:46:04.827333 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:46:05 crc kubenswrapper[4947]: I1203 07:46:05.056910 4947 scope.go:117] "RemoveContainer" containerID="10cf80903e41763a6942116d3c10552ad3a1f46e555a333db975487d6c490fbb" Dec 03 07:46:05 crc kubenswrapper[4947]: I1203 07:46:05.229613 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:46:08 crc kubenswrapper[4947]: I1203 07:46:08.427088 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lhjhx"] Dec 03 07:46:08 crc kubenswrapper[4947]: I1203 07:46:08.427623 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lhjhx" podUID="607b467a-2cf4-42ab-a319-5d04c114b979" containerName="registry-server" containerID="cri-o://12d5398e4adefd506d48012e5c39c8db3ad57c54af1aa2a1de7eed914b7c6095" gracePeriod=2 Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.228052 4947 generic.go:334] "Generic (PLEG): container finished" podID="607b467a-2cf4-42ab-a319-5d04c114b979" containerID="12d5398e4adefd506d48012e5c39c8db3ad57c54af1aa2a1de7eed914b7c6095" exitCode=0 Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.228321 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhjhx" event={"ID":"607b467a-2cf4-42ab-a319-5d04c114b979","Type":"ContainerDied","Data":"12d5398e4adefd506d48012e5c39c8db3ad57c54af1aa2a1de7eed914b7c6095"} Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.433666 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.532193 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/607b467a-2cf4-42ab-a319-5d04c114b979-kube-api-access-g6np4\") pod \"607b467a-2cf4-42ab-a319-5d04c114b979\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.533098 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-utilities\") pod \"607b467a-2cf4-42ab-a319-5d04c114b979\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.533123 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-catalog-content\") pod \"607b467a-2cf4-42ab-a319-5d04c114b979\" (UID: \"607b467a-2cf4-42ab-a319-5d04c114b979\") " Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.533916 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-utilities" (OuterVolumeSpecName: "utilities") pod "607b467a-2cf4-42ab-a319-5d04c114b979" (UID: "607b467a-2cf4-42ab-a319-5d04c114b979"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.547817 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/607b467a-2cf4-42ab-a319-5d04c114b979-kube-api-access-g6np4" (OuterVolumeSpecName: "kube-api-access-g6np4") pod "607b467a-2cf4-42ab-a319-5d04c114b979" (UID: "607b467a-2cf4-42ab-a319-5d04c114b979"). InnerVolumeSpecName "kube-api-access-g6np4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.609503 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "607b467a-2cf4-42ab-a319-5d04c114b979" (UID: "607b467a-2cf4-42ab-a319-5d04c114b979"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.634591 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6np4\" (UniqueName: \"kubernetes.io/projected/607b467a-2cf4-42ab-a319-5d04c114b979-kube-api-access-g6np4\") on node \"crc\" DevicePath \"\"" Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.634621 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:46:09 crc kubenswrapper[4947]: I1203 07:46:09.634631 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/607b467a-2cf4-42ab-a319-5d04c114b979-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:46:10 crc kubenswrapper[4947]: I1203 07:46:10.236538 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lhjhx" event={"ID":"607b467a-2cf4-42ab-a319-5d04c114b979","Type":"ContainerDied","Data":"546bbb245f1ce90ee261161c5ffc4d241e4de3f1f48ecf354518dfb136921053"} Dec 03 07:46:10 crc kubenswrapper[4947]: I1203 07:46:10.236592 4947 scope.go:117] "RemoveContainer" containerID="12d5398e4adefd506d48012e5c39c8db3ad57c54af1aa2a1de7eed914b7c6095" Dec 03 07:46:10 crc kubenswrapper[4947]: I1203 07:46:10.236680 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lhjhx" Dec 03 07:46:10 crc kubenswrapper[4947]: I1203 07:46:10.255937 4947 scope.go:117] "RemoveContainer" containerID="8a90d878c8e3ae466103d672cded04b6be9ce854c05ba1ac79a66eea33509a29" Dec 03 07:46:10 crc kubenswrapper[4947]: I1203 07:46:10.275480 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lhjhx"] Dec 03 07:46:10 crc kubenswrapper[4947]: I1203 07:46:10.278736 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lhjhx"] Dec 03 07:46:10 crc kubenswrapper[4947]: I1203 07:46:10.291720 4947 scope.go:117] "RemoveContainer" containerID="abdf10e14ecb7da0754240d0ffc12c08072f52b19725ecfd4c61ef7a69b52f8b" Dec 03 07:46:11 crc kubenswrapper[4947]: I1203 07:46:11.094418 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="607b467a-2cf4-42ab-a319-5d04c114b979" path="/var/lib/kubelet/pods/607b467a-2cf4-42ab-a319-5d04c114b979/volumes" Dec 03 07:47:30 crc kubenswrapper[4947]: I1203 07:47:30.087109 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:47:30 crc kubenswrapper[4947]: I1203 07:47:30.087712 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:48:00 crc kubenswrapper[4947]: I1203 07:48:00.086374 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:48:00 crc kubenswrapper[4947]: I1203 07:48:00.086985 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:48:30 crc kubenswrapper[4947]: I1203 07:48:30.087002 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:48:30 crc kubenswrapper[4947]: I1203 07:48:30.087606 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:48:30 crc kubenswrapper[4947]: I1203 07:48:30.087671 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:48:30 crc kubenswrapper[4947]: I1203 07:48:30.088630 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:48:30 crc kubenswrapper[4947]: I1203 07:48:30.088743 4947 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" gracePeriod=600 Dec 03 07:48:30 crc kubenswrapper[4947]: E1203 07:48:30.218181 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:48:30 crc kubenswrapper[4947]: I1203 07:48:30.699728 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" exitCode=0 Dec 03 07:48:30 crc kubenswrapper[4947]: I1203 07:48:30.699796 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198"} Dec 03 07:48:30 crc kubenswrapper[4947]: I1203 07:48:30.699853 4947 scope.go:117] "RemoveContainer" containerID="01574b7dd8e441a5ba3f2e2f165f8d74a2e6b97643cfc91c64ee5044abf50a98" Dec 03 07:48:30 crc kubenswrapper[4947]: I1203 07:48:30.700738 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:48:30 crc kubenswrapper[4947]: E1203 07:48:30.701170 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:48:46 crc kubenswrapper[4947]: I1203 07:48:46.082735 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:48:46 crc kubenswrapper[4947]: E1203 07:48:46.083670 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:48:59 crc kubenswrapper[4947]: I1203 07:48:59.092334 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:48:59 crc kubenswrapper[4947]: E1203 07:48:59.093367 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:49:11 crc kubenswrapper[4947]: I1203 07:49:11.084398 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:49:11 crc kubenswrapper[4947]: E1203 07:49:11.085551 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:49:26 crc kubenswrapper[4947]: I1203 07:49:26.083213 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:49:26 crc kubenswrapper[4947]: E1203 07:49:26.084721 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:49:39 crc kubenswrapper[4947]: I1203 07:49:39.095422 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:49:39 crc kubenswrapper[4947]: E1203 07:49:39.098264 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:49:50 crc kubenswrapper[4947]: I1203 07:49:50.083700 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:49:50 crc kubenswrapper[4947]: E1203 07:49:50.084359 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:50:05 crc kubenswrapper[4947]: I1203 07:50:05.082713 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:50:05 crc kubenswrapper[4947]: E1203 07:50:05.083462 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:50:18 crc kubenswrapper[4947]: I1203 07:50:18.082872 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:50:18 crc kubenswrapper[4947]: E1203 07:50:18.083389 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:50:31 crc kubenswrapper[4947]: I1203 07:50:31.083783 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:50:31 crc kubenswrapper[4947]: E1203 07:50:31.084993 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.375893 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r2ngr"] Dec 03 07:50:40 crc kubenswrapper[4947]: E1203 07:50:40.376814 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607b467a-2cf4-42ab-a319-5d04c114b979" containerName="registry-server" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.376828 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="607b467a-2cf4-42ab-a319-5d04c114b979" containerName="registry-server" Dec 03 07:50:40 crc kubenswrapper[4947]: E1203 07:50:40.376840 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607b467a-2cf4-42ab-a319-5d04c114b979" containerName="extract-content" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.376847 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="607b467a-2cf4-42ab-a319-5d04c114b979" containerName="extract-content" Dec 03 07:50:40 crc kubenswrapper[4947]: E1203 07:50:40.376870 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="607b467a-2cf4-42ab-a319-5d04c114b979" containerName="extract-utilities" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.376878 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="607b467a-2cf4-42ab-a319-5d04c114b979" containerName="extract-utilities" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.377023 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="607b467a-2cf4-42ab-a319-5d04c114b979" containerName="registry-server" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.378204 4947 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.397767 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r2ngr"] Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.435059 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sz6cm\" (UniqueName: \"kubernetes.io/projected/44c84a7f-c038-452d-a593-0f5c1f12d27a-kube-api-access-sz6cm\") pod \"redhat-operators-r2ngr\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.435119 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-utilities\") pod \"redhat-operators-r2ngr\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.435179 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-catalog-content\") pod \"redhat-operators-r2ngr\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.535746 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-catalog-content\") pod \"redhat-operators-r2ngr\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.535832 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sz6cm\" (UniqueName: \"kubernetes.io/projected/44c84a7f-c038-452d-a593-0f5c1f12d27a-kube-api-access-sz6cm\") pod \"redhat-operators-r2ngr\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.535863 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-utilities\") pod \"redhat-operators-r2ngr\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.536339 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-utilities\") pod \"redhat-operators-r2ngr\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.536656 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-catalog-content\") pod \"redhat-operators-r2ngr\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.570336 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz6cm\" (UniqueName: \"kubernetes.io/projected/44c84a7f-c038-452d-a593-0f5c1f12d27a-kube-api-access-sz6cm\") pod \"redhat-operators-r2ngr\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.710406 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:40 crc kubenswrapper[4947]: I1203 07:50:40.926905 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r2ngr"] Dec 03 07:50:41 crc kubenswrapper[4947]: I1203 07:50:41.896820 4947 generic.go:334] "Generic (PLEG): container finished" podID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerID="dc940152955849c4be0a546e9205b0172e291333e2abd808102eaf3fafe3a2d0" exitCode=0 Dec 03 07:50:41 crc kubenswrapper[4947]: I1203 07:50:41.897170 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2ngr" event={"ID":"44c84a7f-c038-452d-a593-0f5c1f12d27a","Type":"ContainerDied","Data":"dc940152955849c4be0a546e9205b0172e291333e2abd808102eaf3fafe3a2d0"} Dec 03 07:50:41 crc kubenswrapper[4947]: I1203 07:50:41.897212 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2ngr" event={"ID":"44c84a7f-c038-452d-a593-0f5c1f12d27a","Type":"ContainerStarted","Data":"19a7f770492fd052a47ebd46106a62b3f1cee7157d6a8f3293a17cc2e8afee10"} Dec 03 07:50:43 crc kubenswrapper[4947]: I1203 07:50:43.928193 4947 generic.go:334] "Generic (PLEG): container finished" podID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerID="f5ce58099fd07529e1bd1e710140e92376207d714d4f4bf16d96c33e147c1be5" exitCode=0 Dec 03 07:50:43 crc kubenswrapper[4947]: I1203 07:50:43.928306 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2ngr" event={"ID":"44c84a7f-c038-452d-a593-0f5c1f12d27a","Type":"ContainerDied","Data":"f5ce58099fd07529e1bd1e710140e92376207d714d4f4bf16d96c33e147c1be5"} Dec 03 07:50:44 crc kubenswrapper[4947]: I1203 07:50:44.944139 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2ngr" 
event={"ID":"44c84a7f-c038-452d-a593-0f5c1f12d27a","Type":"ContainerStarted","Data":"1372ea8fe1eb13c5fad8b0685646d10638f8103717f9124750e338073d855f79"} Dec 03 07:50:44 crc kubenswrapper[4947]: I1203 07:50:44.975567 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r2ngr" podStartSLOduration=2.425481187 podStartE2EDuration="4.975550867s" podCreationTimestamp="2025-12-03 07:50:40 +0000 UTC" firstStartedPulling="2025-12-03 07:50:41.900211453 +0000 UTC m=+3703.161165919" lastFinishedPulling="2025-12-03 07:50:44.450281133 +0000 UTC m=+3705.711235599" observedRunningTime="2025-12-03 07:50:44.970394156 +0000 UTC m=+3706.231348582" watchObservedRunningTime="2025-12-03 07:50:44.975550867 +0000 UTC m=+3706.236505293" Dec 03 07:50:45 crc kubenswrapper[4947]: I1203 07:50:45.083979 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:50:45 crc kubenswrapper[4947]: E1203 07:50:45.084602 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:50:50 crc kubenswrapper[4947]: I1203 07:50:50.710932 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:50 crc kubenswrapper[4947]: I1203 07:50:50.711622 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:50:51 crc kubenswrapper[4947]: I1203 07:50:51.772413 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r2ngr" 
podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerName="registry-server" probeResult="failure" output=< Dec 03 07:50:51 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 07:50:51 crc kubenswrapper[4947]: > Dec 03 07:50:56 crc kubenswrapper[4947]: I1203 07:50:56.082876 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:50:56 crc kubenswrapper[4947]: E1203 07:50:56.083322 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:51:00 crc kubenswrapper[4947]: I1203 07:51:00.775054 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:51:00 crc kubenswrapper[4947]: I1203 07:51:00.837416 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:51:01 crc kubenswrapper[4947]: I1203 07:51:01.024904 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r2ngr"] Dec 03 07:51:02 crc kubenswrapper[4947]: I1203 07:51:02.089023 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r2ngr" podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerName="registry-server" containerID="cri-o://1372ea8fe1eb13c5fad8b0685646d10638f8103717f9124750e338073d855f79" gracePeriod=2 Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.104919 4947 generic.go:334] "Generic (PLEG): container finished" podID="44c84a7f-c038-452d-a593-0f5c1f12d27a" 
containerID="1372ea8fe1eb13c5fad8b0685646d10638f8103717f9124750e338073d855f79" exitCode=0 Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.104969 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2ngr" event={"ID":"44c84a7f-c038-452d-a593-0f5c1f12d27a","Type":"ContainerDied","Data":"1372ea8fe1eb13c5fad8b0685646d10638f8103717f9124750e338073d855f79"} Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.342157 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.541767 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz6cm\" (UniqueName: \"kubernetes.io/projected/44c84a7f-c038-452d-a593-0f5c1f12d27a-kube-api-access-sz6cm\") pod \"44c84a7f-c038-452d-a593-0f5c1f12d27a\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.541833 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-utilities\") pod \"44c84a7f-c038-452d-a593-0f5c1f12d27a\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.541923 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-catalog-content\") pod \"44c84a7f-c038-452d-a593-0f5c1f12d27a\" (UID: \"44c84a7f-c038-452d-a593-0f5c1f12d27a\") " Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.544424 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-utilities" (OuterVolumeSpecName: "utilities") pod "44c84a7f-c038-452d-a593-0f5c1f12d27a" (UID: 
"44c84a7f-c038-452d-a593-0f5c1f12d27a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.563664 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c84a7f-c038-452d-a593-0f5c1f12d27a-kube-api-access-sz6cm" (OuterVolumeSpecName: "kube-api-access-sz6cm") pod "44c84a7f-c038-452d-a593-0f5c1f12d27a" (UID: "44c84a7f-c038-452d-a593-0f5c1f12d27a"). InnerVolumeSpecName "kube-api-access-sz6cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.644109 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sz6cm\" (UniqueName: \"kubernetes.io/projected/44c84a7f-c038-452d-a593-0f5c1f12d27a-kube-api-access-sz6cm\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.644159 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.661387 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44c84a7f-c038-452d-a593-0f5c1f12d27a" (UID: "44c84a7f-c038-452d-a593-0f5c1f12d27a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:51:04 crc kubenswrapper[4947]: I1203 07:51:04.745770 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44c84a7f-c038-452d-a593-0f5c1f12d27a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:05 crc kubenswrapper[4947]: I1203 07:51:05.115315 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2ngr" event={"ID":"44c84a7f-c038-452d-a593-0f5c1f12d27a","Type":"ContainerDied","Data":"19a7f770492fd052a47ebd46106a62b3f1cee7157d6a8f3293a17cc2e8afee10"} Dec 03 07:51:05 crc kubenswrapper[4947]: I1203 07:51:05.115369 4947 scope.go:117] "RemoveContainer" containerID="1372ea8fe1eb13c5fad8b0685646d10638f8103717f9124750e338073d855f79" Dec 03 07:51:05 crc kubenswrapper[4947]: I1203 07:51:05.115372 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r2ngr" Dec 03 07:51:05 crc kubenswrapper[4947]: I1203 07:51:05.141820 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r2ngr"] Dec 03 07:51:05 crc kubenswrapper[4947]: I1203 07:51:05.142606 4947 scope.go:117] "RemoveContainer" containerID="f5ce58099fd07529e1bd1e710140e92376207d714d4f4bf16d96c33e147c1be5" Dec 03 07:51:05 crc kubenswrapper[4947]: I1203 07:51:05.155646 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r2ngr"] Dec 03 07:51:05 crc kubenswrapper[4947]: I1203 07:51:05.169627 4947 scope.go:117] "RemoveContainer" containerID="dc940152955849c4be0a546e9205b0172e291333e2abd808102eaf3fafe3a2d0" Dec 03 07:51:07 crc kubenswrapper[4947]: I1203 07:51:07.095809 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" path="/var/lib/kubelet/pods/44c84a7f-c038-452d-a593-0f5c1f12d27a/volumes" Dec 03 07:51:10 crc 
kubenswrapper[4947]: I1203 07:51:10.082985 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:51:10 crc kubenswrapper[4947]: E1203 07:51:10.083447 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:51:21 crc kubenswrapper[4947]: I1203 07:51:21.083429 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:51:21 crc kubenswrapper[4947]: E1203 07:51:21.087360 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:51:32 crc kubenswrapper[4947]: I1203 07:51:32.083113 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:51:32 crc kubenswrapper[4947]: E1203 07:51:32.084053 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 
03 07:51:44 crc kubenswrapper[4947]: I1203 07:51:44.083278 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:51:44 crc kubenswrapper[4947]: E1203 07:51:44.083792 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:51:58 crc kubenswrapper[4947]: I1203 07:51:58.084402 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:51:58 crc kubenswrapper[4947]: E1203 07:51:58.085866 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:52:09 crc kubenswrapper[4947]: I1203 07:52:09.091818 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:52:09 crc kubenswrapper[4947]: E1203 07:52:09.093282 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:52:21 crc kubenswrapper[4947]: I1203 07:52:21.083764 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:52:21 crc kubenswrapper[4947]: E1203 07:52:21.084735 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:52:32 crc kubenswrapper[4947]: I1203 07:52:32.083528 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:52:32 crc kubenswrapper[4947]: E1203 07:52:32.084542 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:52:45 crc kubenswrapper[4947]: I1203 07:52:45.083238 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:52:45 crc kubenswrapper[4947]: E1203 07:52:45.083973 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:52:57 crc kubenswrapper[4947]: I1203 07:52:57.083094 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:52:57 crc kubenswrapper[4947]: E1203 07:52:57.084911 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:53:11 crc kubenswrapper[4947]: I1203 07:53:11.082455 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:53:11 crc kubenswrapper[4947]: E1203 07:53:11.083173 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:53:25 crc kubenswrapper[4947]: I1203 07:53:25.083639 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:53:25 crc kubenswrapper[4947]: E1203 07:53:25.084592 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 07:53:37 crc kubenswrapper[4947]: I1203 07:53:37.083251 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:53:37 crc kubenswrapper[4947]: I1203 07:53:37.481704 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"dd8022caf794576a5d53eebed6928631e795070f5bf854a75a9328ce82ad5174"} Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.479962 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5mr8g"] Dec 03 07:55:13 crc kubenswrapper[4947]: E1203 07:55:13.480945 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerName="registry-server" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.480961 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerName="registry-server" Dec 03 07:55:13 crc kubenswrapper[4947]: E1203 07:55:13.480979 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerName="extract-utilities" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.480987 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerName="extract-utilities" Dec 03 07:55:13 crc kubenswrapper[4947]: E1203 07:55:13.481020 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerName="extract-content" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.481028 4947 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerName="extract-content" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.481229 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c84a7f-c038-452d-a593-0f5c1f12d27a" containerName="registry-server" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.482795 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.504735 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mr8g"] Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.637773 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwmqr\" (UniqueName: \"kubernetes.io/projected/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-kube-api-access-gwmqr\") pod \"redhat-marketplace-5mr8g\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.637887 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-utilities\") pod \"redhat-marketplace-5mr8g\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.637906 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-catalog-content\") pod \"redhat-marketplace-5mr8g\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.738991 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-utilities\") pod \"redhat-marketplace-5mr8g\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.739039 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-catalog-content\") pod \"redhat-marketplace-5mr8g\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.739073 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwmqr\" (UniqueName: \"kubernetes.io/projected/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-kube-api-access-gwmqr\") pod \"redhat-marketplace-5mr8g\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.739572 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-utilities\") pod \"redhat-marketplace-5mr8g\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.739599 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-catalog-content\") pod \"redhat-marketplace-5mr8g\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.762003 4947 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gwmqr\" (UniqueName: \"kubernetes.io/projected/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-kube-api-access-gwmqr\") pod \"redhat-marketplace-5mr8g\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:13 crc kubenswrapper[4947]: I1203 07:55:13.816607 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:14 crc kubenswrapper[4947]: I1203 07:55:14.270664 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mr8g"] Dec 03 07:55:14 crc kubenswrapper[4947]: I1203 07:55:14.303161 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mr8g" event={"ID":"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4","Type":"ContainerStarted","Data":"bc683bf13a44058f5023635ce4a682989d8ffb4f1b29757f34b2cdefb4709fda"} Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.280442 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-65wjk"] Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.284432 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.293287 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-65wjk"] Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.333141 4947 generic.go:334] "Generic (PLEG): container finished" podID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerID="eb1c00ae8a08007715bcdd88c7a83b14be949cb5237945f3e37dc7cfaf6922b4" exitCode=0 Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.333208 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mr8g" event={"ID":"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4","Type":"ContainerDied","Data":"eb1c00ae8a08007715bcdd88c7a83b14be949cb5237945f3e37dc7cfaf6922b4"} Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.335181 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.464282 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-utilities\") pod \"community-operators-65wjk\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.464351 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-catalog-content\") pod \"community-operators-65wjk\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.464401 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-26mk7\" (UniqueName: \"kubernetes.io/projected/73450081-dfe9-459f-bafa-f6cdf42ae499-kube-api-access-26mk7\") pod \"community-operators-65wjk\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.566107 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-utilities\") pod \"community-operators-65wjk\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.566150 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-catalog-content\") pod \"community-operators-65wjk\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.566181 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26mk7\" (UniqueName: \"kubernetes.io/projected/73450081-dfe9-459f-bafa-f6cdf42ae499-kube-api-access-26mk7\") pod \"community-operators-65wjk\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.566955 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-utilities\") pod \"community-operators-65wjk\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.567009 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-catalog-content\") pod \"community-operators-65wjk\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.589583 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26mk7\" (UniqueName: \"kubernetes.io/projected/73450081-dfe9-459f-bafa-f6cdf42ae499-kube-api-access-26mk7\") pod \"community-operators-65wjk\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:15 crc kubenswrapper[4947]: I1203 07:55:15.624199 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:16 crc kubenswrapper[4947]: I1203 07:55:16.134847 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-65wjk"] Dec 03 07:55:16 crc kubenswrapper[4947]: W1203 07:55:16.136425 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73450081_dfe9_459f_bafa_f6cdf42ae499.slice/crio-f0341a353e334c19d5f825008f94a5265c12e6879a33dcab575fd51e7d4952ac WatchSource:0}: Error finding container f0341a353e334c19d5f825008f94a5265c12e6879a33dcab575fd51e7d4952ac: Status 404 returned error can't find the container with id f0341a353e334c19d5f825008f94a5265c12e6879a33dcab575fd51e7d4952ac Dec 03 07:55:16 crc kubenswrapper[4947]: I1203 07:55:16.341938 4947 generic.go:334] "Generic (PLEG): container finished" podID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerID="74aaa7d06bf924894e497f804c1f718453d3cf1465c624af786dbc94016d64bf" exitCode=0 Dec 03 07:55:16 crc kubenswrapper[4947]: I1203 07:55:16.341994 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65wjk" 
event={"ID":"73450081-dfe9-459f-bafa-f6cdf42ae499","Type":"ContainerDied","Data":"74aaa7d06bf924894e497f804c1f718453d3cf1465c624af786dbc94016d64bf"} Dec 03 07:55:16 crc kubenswrapper[4947]: I1203 07:55:16.342020 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65wjk" event={"ID":"73450081-dfe9-459f-bafa-f6cdf42ae499","Type":"ContainerStarted","Data":"f0341a353e334c19d5f825008f94a5265c12e6879a33dcab575fd51e7d4952ac"} Dec 03 07:55:16 crc kubenswrapper[4947]: I1203 07:55:16.345701 4947 generic.go:334] "Generic (PLEG): container finished" podID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerID="786947a5347ae4899dd0d4fe80a49ca384fc77cbf8fa0f5fcd438dd91fb8a3ce" exitCode=0 Dec 03 07:55:16 crc kubenswrapper[4947]: I1203 07:55:16.345765 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mr8g" event={"ID":"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4","Type":"ContainerDied","Data":"786947a5347ae4899dd0d4fe80a49ca384fc77cbf8fa0f5fcd438dd91fb8a3ce"} Dec 03 07:55:17 crc kubenswrapper[4947]: I1203 07:55:17.355662 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mr8g" event={"ID":"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4","Type":"ContainerStarted","Data":"e65fb250c3ea49b0f13da9340f78035cac9bc6796a24c66bd2bc2d455321f96f"} Dec 03 07:55:17 crc kubenswrapper[4947]: I1203 07:55:17.377311 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5mr8g" podStartSLOduration=2.962246227 podStartE2EDuration="4.377291037s" podCreationTimestamp="2025-12-03 07:55:13 +0000 UTC" firstStartedPulling="2025-12-03 07:55:15.334984918 +0000 UTC m=+3976.595939334" lastFinishedPulling="2025-12-03 07:55:16.750029718 +0000 UTC m=+3978.010984144" observedRunningTime="2025-12-03 07:55:17.373059213 +0000 UTC m=+3978.634013629" watchObservedRunningTime="2025-12-03 07:55:17.377291037 +0000 UTC 
m=+3978.638245453" Dec 03 07:55:18 crc kubenswrapper[4947]: I1203 07:55:18.367845 4947 generic.go:334] "Generic (PLEG): container finished" podID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerID="9181e277b593abc42469b5d9faa6fda23f44ca79908512f913ac8a660d86c642" exitCode=0 Dec 03 07:55:18 crc kubenswrapper[4947]: I1203 07:55:18.367905 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65wjk" event={"ID":"73450081-dfe9-459f-bafa-f6cdf42ae499","Type":"ContainerDied","Data":"9181e277b593abc42469b5d9faa6fda23f44ca79908512f913ac8a660d86c642"} Dec 03 07:55:19 crc kubenswrapper[4947]: I1203 07:55:19.377132 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65wjk" event={"ID":"73450081-dfe9-459f-bafa-f6cdf42ae499","Type":"ContainerStarted","Data":"fd11509ac32891ed6fcaf292151364eb80a69c774aea00357b9ffca7996abcc4"} Dec 03 07:55:19 crc kubenswrapper[4947]: I1203 07:55:19.397612 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-65wjk" podStartSLOduration=1.844985764 podStartE2EDuration="4.397588241s" podCreationTimestamp="2025-12-03 07:55:15 +0000 UTC" firstStartedPulling="2025-12-03 07:55:16.343105585 +0000 UTC m=+3977.604060011" lastFinishedPulling="2025-12-03 07:55:18.895708062 +0000 UTC m=+3980.156662488" observedRunningTime="2025-12-03 07:55:19.393802269 +0000 UTC m=+3980.654756695" watchObservedRunningTime="2025-12-03 07:55:19.397588241 +0000 UTC m=+3980.658542667" Dec 03 07:55:23 crc kubenswrapper[4947]: I1203 07:55:23.817848 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:23 crc kubenswrapper[4947]: I1203 07:55:23.818215 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:23 crc kubenswrapper[4947]: I1203 07:55:23.866095 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:24 crc kubenswrapper[4947]: I1203 07:55:24.461817 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:24 crc kubenswrapper[4947]: I1203 07:55:24.506836 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mr8g"] Dec 03 07:55:25 crc kubenswrapper[4947]: I1203 07:55:25.625128 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:25 crc kubenswrapper[4947]: I1203 07:55:25.625177 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:25 crc kubenswrapper[4947]: I1203 07:55:25.664235 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:26 crc kubenswrapper[4947]: I1203 07:55:26.434343 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5mr8g" podUID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerName="registry-server" containerID="cri-o://e65fb250c3ea49b0f13da9340f78035cac9bc6796a24c66bd2bc2d455321f96f" gracePeriod=2 Dec 03 07:55:26 crc kubenswrapper[4947]: I1203 07:55:26.489382 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:27 crc kubenswrapper[4947]: I1203 07:55:27.445253 4947 generic.go:334] "Generic (PLEG): container finished" podID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerID="e65fb250c3ea49b0f13da9340f78035cac9bc6796a24c66bd2bc2d455321f96f" exitCode=0 Dec 03 07:55:27 crc kubenswrapper[4947]: I1203 07:55:27.445313 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-5mr8g" event={"ID":"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4","Type":"ContainerDied","Data":"e65fb250c3ea49b0f13da9340f78035cac9bc6796a24c66bd2bc2d455321f96f"} Dec 03 07:55:27 crc kubenswrapper[4947]: I1203 07:55:27.500057 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-65wjk"] Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.078232 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.143556 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-catalog-content\") pod \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.143692 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-utilities\") pod \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.143725 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwmqr\" (UniqueName: \"kubernetes.io/projected/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-kube-api-access-gwmqr\") pod \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\" (UID: \"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4\") " Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.144732 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-utilities" (OuterVolumeSpecName: "utilities") pod "d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" (UID: "d6a65d9c-b32e-4b42-9a31-79ea5d4704e4"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.154435 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-kube-api-access-gwmqr" (OuterVolumeSpecName: "kube-api-access-gwmqr") pod "d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" (UID: "d6a65d9c-b32e-4b42-9a31-79ea5d4704e4"). InnerVolumeSpecName "kube-api-access-gwmqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.161933 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" (UID: "d6a65d9c-b32e-4b42-9a31-79ea5d4704e4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.244722 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.244767 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.244782 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwmqr\" (UniqueName: \"kubernetes.io/projected/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4-kube-api-access-gwmqr\") on node \"crc\" DevicePath \"\"" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.455304 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5mr8g" 
event={"ID":"d6a65d9c-b32e-4b42-9a31-79ea5d4704e4","Type":"ContainerDied","Data":"bc683bf13a44058f5023635ce4a682989d8ffb4f1b29757f34b2cdefb4709fda"} Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.455365 4947 scope.go:117] "RemoveContainer" containerID="e65fb250c3ea49b0f13da9340f78035cac9bc6796a24c66bd2bc2d455321f96f" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.455383 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5mr8g" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.455447 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-65wjk" podUID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerName="registry-server" containerID="cri-o://fd11509ac32891ed6fcaf292151364eb80a69c774aea00357b9ffca7996abcc4" gracePeriod=2 Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.474018 4947 scope.go:117] "RemoveContainer" containerID="786947a5347ae4899dd0d4fe80a49ca384fc77cbf8fa0f5fcd438dd91fb8a3ce" Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.496380 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mr8g"] Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.502590 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5mr8g"] Dec 03 07:55:28 crc kubenswrapper[4947]: I1203 07:55:28.511216 4947 scope.go:117] "RemoveContainer" containerID="eb1c00ae8a08007715bcdd88c7a83b14be949cb5237945f3e37dc7cfaf6922b4" Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.103077 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" path="/var/lib/kubelet/pods/d6a65d9c-b32e-4b42-9a31-79ea5d4704e4/volumes" Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.467610 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerID="fd11509ac32891ed6fcaf292151364eb80a69c774aea00357b9ffca7996abcc4" exitCode=0 Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.467678 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65wjk" event={"ID":"73450081-dfe9-459f-bafa-f6cdf42ae499","Type":"ContainerDied","Data":"fd11509ac32891ed6fcaf292151364eb80a69c774aea00357b9ffca7996abcc4"} Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.522377 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.665671 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-catalog-content\") pod \"73450081-dfe9-459f-bafa-f6cdf42ae499\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.665829 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26mk7\" (UniqueName: \"kubernetes.io/projected/73450081-dfe9-459f-bafa-f6cdf42ae499-kube-api-access-26mk7\") pod \"73450081-dfe9-459f-bafa-f6cdf42ae499\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.665873 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-utilities\") pod \"73450081-dfe9-459f-bafa-f6cdf42ae499\" (UID: \"73450081-dfe9-459f-bafa-f6cdf42ae499\") " Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.668194 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-utilities" (OuterVolumeSpecName: "utilities") pod 
"73450081-dfe9-459f-bafa-f6cdf42ae499" (UID: "73450081-dfe9-459f-bafa-f6cdf42ae499"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.671004 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73450081-dfe9-459f-bafa-f6cdf42ae499-kube-api-access-26mk7" (OuterVolumeSpecName: "kube-api-access-26mk7") pod "73450081-dfe9-459f-bafa-f6cdf42ae499" (UID: "73450081-dfe9-459f-bafa-f6cdf42ae499"). InnerVolumeSpecName "kube-api-access-26mk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.741143 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73450081-dfe9-459f-bafa-f6cdf42ae499" (UID: "73450081-dfe9-459f-bafa-f6cdf42ae499"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.768416 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.768471 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73450081-dfe9-459f-bafa-f6cdf42ae499-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:55:29 crc kubenswrapper[4947]: I1203 07:55:29.768510 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26mk7\" (UniqueName: \"kubernetes.io/projected/73450081-dfe9-459f-bafa-f6cdf42ae499-kube-api-access-26mk7\") on node \"crc\" DevicePath \"\"" Dec 03 07:55:30 crc kubenswrapper[4947]: I1203 07:55:30.482526 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-65wjk" event={"ID":"73450081-dfe9-459f-bafa-f6cdf42ae499","Type":"ContainerDied","Data":"f0341a353e334c19d5f825008f94a5265c12e6879a33dcab575fd51e7d4952ac"} Dec 03 07:55:30 crc kubenswrapper[4947]: I1203 07:55:30.482735 4947 scope.go:117] "RemoveContainer" containerID="fd11509ac32891ed6fcaf292151364eb80a69c774aea00357b9ffca7996abcc4" Dec 03 07:55:30 crc kubenswrapper[4947]: I1203 07:55:30.482648 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-65wjk" Dec 03 07:55:30 crc kubenswrapper[4947]: I1203 07:55:30.503715 4947 scope.go:117] "RemoveContainer" containerID="9181e277b593abc42469b5d9faa6fda23f44ca79908512f913ac8a660d86c642" Dec 03 07:55:30 crc kubenswrapper[4947]: I1203 07:55:30.520064 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-65wjk"] Dec 03 07:55:30 crc kubenswrapper[4947]: I1203 07:55:30.526968 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-65wjk"] Dec 03 07:55:30 crc kubenswrapper[4947]: I1203 07:55:30.849396 4947 scope.go:117] "RemoveContainer" containerID="74aaa7d06bf924894e497f804c1f718453d3cf1465c624af786dbc94016d64bf" Dec 03 07:55:31 crc kubenswrapper[4947]: I1203 07:55:31.093998 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73450081-dfe9-459f-bafa-f6cdf42ae499" path="/var/lib/kubelet/pods/73450081-dfe9-459f-bafa-f6cdf42ae499/volumes" Dec 03 07:56:00 crc kubenswrapper[4947]: I1203 07:56:00.086203 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:56:00 crc kubenswrapper[4947]: I1203 07:56:00.086698 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:56:30 crc kubenswrapper[4947]: I1203 07:56:30.086882 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:56:30 crc kubenswrapper[4947]: I1203 07:56:30.087457 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:57:00 crc kubenswrapper[4947]: I1203 07:57:00.086406 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:57:00 crc kubenswrapper[4947]: I1203 07:57:00.087262 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:57:00 crc kubenswrapper[4947]: I1203 07:57:00.087322 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 07:57:00 crc kubenswrapper[4947]: I1203 07:57:00.088037 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dd8022caf794576a5d53eebed6928631e795070f5bf854a75a9328ce82ad5174"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:57:00 crc kubenswrapper[4947]: I1203 07:57:00.088137 4947 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://dd8022caf794576a5d53eebed6928631e795070f5bf854a75a9328ce82ad5174" gracePeriod=600 Dec 03 07:57:00 crc kubenswrapper[4947]: I1203 07:57:00.378161 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="dd8022caf794576a5d53eebed6928631e795070f5bf854a75a9328ce82ad5174" exitCode=0 Dec 03 07:57:00 crc kubenswrapper[4947]: I1203 07:57:00.378235 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"dd8022caf794576a5d53eebed6928631e795070f5bf854a75a9328ce82ad5174"} Dec 03 07:57:00 crc kubenswrapper[4947]: I1203 07:57:00.378632 4947 scope.go:117] "RemoveContainer" containerID="bdb1efeec877401519d4332daec5857cd1dffd2bf04c51c52cd04ef118521198" Dec 03 07:57:01 crc kubenswrapper[4947]: I1203 07:57:01.391024 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d"} Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.744004 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-njbpj"] Dec 03 07:58:42 crc kubenswrapper[4947]: E1203 07:58:42.745101 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerName="registry-server" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.745122 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerName="registry-server" 
Dec 03 07:58:42 crc kubenswrapper[4947]: E1203 07:58:42.745158 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerName="extract-content" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.745171 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerName="extract-content" Dec 03 07:58:42 crc kubenswrapper[4947]: E1203 07:58:42.745186 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerName="extract-content" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.745197 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerName="extract-content" Dec 03 07:58:42 crc kubenswrapper[4947]: E1203 07:58:42.745218 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerName="extract-utilities" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.745228 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerName="extract-utilities" Dec 03 07:58:42 crc kubenswrapper[4947]: E1203 07:58:42.745248 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerName="extract-utilities" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.745258 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerName="extract-utilities" Dec 03 07:58:42 crc kubenswrapper[4947]: E1203 07:58:42.745271 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerName="registry-server" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.745281 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerName="registry-server" Dec 03 
07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.745537 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a65d9c-b32e-4b42-9a31-79ea5d4704e4" containerName="registry-server" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.745572 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="73450081-dfe9-459f-bafa-f6cdf42ae499" containerName="registry-server" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.747437 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.769306 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-njbpj"] Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.902360 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-utilities\") pod \"certified-operators-njbpj\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.902720 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-catalog-content\") pod \"certified-operators-njbpj\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:42 crc kubenswrapper[4947]: I1203 07:58:42.902853 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h895h\" (UniqueName: \"kubernetes.io/projected/44957a4e-7c93-40c4-b32d-bd0439ba60ca-kube-api-access-h895h\") pod \"certified-operators-njbpj\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " 
pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:43 crc kubenswrapper[4947]: I1203 07:58:43.004350 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-utilities\") pod \"certified-operators-njbpj\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:43 crc kubenswrapper[4947]: I1203 07:58:43.004686 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-catalog-content\") pod \"certified-operators-njbpj\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:43 crc kubenswrapper[4947]: I1203 07:58:43.004801 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h895h\" (UniqueName: \"kubernetes.io/projected/44957a4e-7c93-40c4-b32d-bd0439ba60ca-kube-api-access-h895h\") pod \"certified-operators-njbpj\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:43 crc kubenswrapper[4947]: I1203 07:58:43.005342 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-utilities\") pod \"certified-operators-njbpj\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:43 crc kubenswrapper[4947]: I1203 07:58:43.005453 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-catalog-content\") pod \"certified-operators-njbpj\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " 
pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:43 crc kubenswrapper[4947]: I1203 07:58:43.031364 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h895h\" (UniqueName: \"kubernetes.io/projected/44957a4e-7c93-40c4-b32d-bd0439ba60ca-kube-api-access-h895h\") pod \"certified-operators-njbpj\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:43 crc kubenswrapper[4947]: I1203 07:58:43.075969 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:43 crc kubenswrapper[4947]: I1203 07:58:43.639233 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-njbpj"] Dec 03 07:58:44 crc kubenswrapper[4947]: I1203 07:58:44.410782 4947 generic.go:334] "Generic (PLEG): container finished" podID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerID="255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da" exitCode=0 Dec 03 07:58:44 crc kubenswrapper[4947]: I1203 07:58:44.410888 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njbpj" event={"ID":"44957a4e-7c93-40c4-b32d-bd0439ba60ca","Type":"ContainerDied","Data":"255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da"} Dec 03 07:58:44 crc kubenswrapper[4947]: I1203 07:58:44.411142 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njbpj" event={"ID":"44957a4e-7c93-40c4-b32d-bd0439ba60ca","Type":"ContainerStarted","Data":"9c2ee8212e4183fba40611289a97bab3252daa84ec8f8200c8f5f11878341f00"} Dec 03 07:58:48 crc kubenswrapper[4947]: I1203 07:58:48.451123 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njbpj" 
event={"ID":"44957a4e-7c93-40c4-b32d-bd0439ba60ca","Type":"ContainerStarted","Data":"e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e"} Dec 03 07:58:49 crc kubenswrapper[4947]: I1203 07:58:49.466777 4947 generic.go:334] "Generic (PLEG): container finished" podID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerID="e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e" exitCode=0 Dec 03 07:58:49 crc kubenswrapper[4947]: I1203 07:58:49.467158 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njbpj" event={"ID":"44957a4e-7c93-40c4-b32d-bd0439ba60ca","Type":"ContainerDied","Data":"e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e"} Dec 03 07:58:50 crc kubenswrapper[4947]: I1203 07:58:50.477195 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njbpj" event={"ID":"44957a4e-7c93-40c4-b32d-bd0439ba60ca","Type":"ContainerStarted","Data":"76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c"} Dec 03 07:58:50 crc kubenswrapper[4947]: I1203 07:58:50.512185 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-njbpj" podStartSLOduration=2.9940579290000002 podStartE2EDuration="8.512163223s" podCreationTimestamp="2025-12-03 07:58:42 +0000 UTC" firstStartedPulling="2025-12-03 07:58:44.412268456 +0000 UTC m=+4185.673222882" lastFinishedPulling="2025-12-03 07:58:49.93037375 +0000 UTC m=+4191.191328176" observedRunningTime="2025-12-03 07:58:50.507775059 +0000 UTC m=+4191.768729505" watchObservedRunningTime="2025-12-03 07:58:50.512163223 +0000 UTC m=+4191.773117649" Dec 03 07:58:53 crc kubenswrapper[4947]: I1203 07:58:53.076768 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:53 crc kubenswrapper[4947]: I1203 07:58:53.077086 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:58:53 crc kubenswrapper[4947]: I1203 07:58:53.137117 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:59:00 crc kubenswrapper[4947]: I1203 07:59:00.086467 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:59:00 crc kubenswrapper[4947]: I1203 07:59:00.086866 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:59:03 crc kubenswrapper[4947]: I1203 07:59:03.159364 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-njbpj" Dec 03 07:59:03 crc kubenswrapper[4947]: I1203 07:59:03.247917 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-njbpj"] Dec 03 07:59:03 crc kubenswrapper[4947]: I1203 07:59:03.306201 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqplp"] Dec 03 07:59:03 crc kubenswrapper[4947]: I1203 07:59:03.306521 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hqplp" podUID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerName="registry-server" containerID="cri-o://e38bd3f1fe6049d3622ace19445979e3ef2446e14f2908ecd8f836a736bb8536" gracePeriod=2 Dec 03 07:59:03 crc kubenswrapper[4947]: I1203 07:59:03.589957 4947 
generic.go:334] "Generic (PLEG): container finished" podID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerID="e38bd3f1fe6049d3622ace19445979e3ef2446e14f2908ecd8f836a736bb8536" exitCode=0 Dec 03 07:59:03 crc kubenswrapper[4947]: I1203 07:59:03.590038 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqplp" event={"ID":"be1f365b-f0e8-413f-ac93-6be2fc6282a8","Type":"ContainerDied","Data":"e38bd3f1fe6049d3622ace19445979e3ef2446e14f2908ecd8f836a736bb8536"} Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.160059 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqplp" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.341397 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-utilities\") pod \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.341469 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-catalog-content\") pod \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.341525 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2dpj\" (UniqueName: \"kubernetes.io/projected/be1f365b-f0e8-413f-ac93-6be2fc6282a8-kube-api-access-l2dpj\") pod \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\" (UID: \"be1f365b-f0e8-413f-ac93-6be2fc6282a8\") " Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.341950 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-utilities" 
(OuterVolumeSpecName: "utilities") pod "be1f365b-f0e8-413f-ac93-6be2fc6282a8" (UID: "be1f365b-f0e8-413f-ac93-6be2fc6282a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.350000 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1f365b-f0e8-413f-ac93-6be2fc6282a8-kube-api-access-l2dpj" (OuterVolumeSpecName: "kube-api-access-l2dpj") pod "be1f365b-f0e8-413f-ac93-6be2fc6282a8" (UID: "be1f365b-f0e8-413f-ac93-6be2fc6282a8"). InnerVolumeSpecName "kube-api-access-l2dpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.377196 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be1f365b-f0e8-413f-ac93-6be2fc6282a8" (UID: "be1f365b-f0e8-413f-ac93-6be2fc6282a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.416837 4947 scope.go:117] "RemoveContainer" containerID="9ad641e180c0e3ed8e3aefdf96bc3e35b050ab9b17d6a3df2fb20f7d71cbb0b8" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.433520 4947 scope.go:117] "RemoveContainer" containerID="e38bd3f1fe6049d3622ace19445979e3ef2446e14f2908ecd8f836a736bb8536" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.442898 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.442925 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be1f365b-f0e8-413f-ac93-6be2fc6282a8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.442936 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2dpj\" (UniqueName: \"kubernetes.io/projected/be1f365b-f0e8-413f-ac93-6be2fc6282a8-kube-api-access-l2dpj\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.454383 4947 scope.go:117] "RemoveContainer" containerID="e2e4c79d4ca3cc76c2d236e2edebe8f791942eb804809abc84594084e7b54694" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.612373 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hqplp" Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.612395 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqplp" event={"ID":"be1f365b-f0e8-413f-ac93-6be2fc6282a8","Type":"ContainerDied","Data":"0d93e14fd615618355de39579c2f58a684bf55fd468252ac5da351e26d0a6ca3"} Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.657659 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqplp"] Dec 03 07:59:05 crc kubenswrapper[4947]: I1203 07:59:05.662017 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hqplp"] Dec 03 07:59:07 crc kubenswrapper[4947]: I1203 07:59:07.092743 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" path="/var/lib/kubelet/pods/be1f365b-f0e8-413f-ac93-6be2fc6282a8/volumes" Dec 03 07:59:30 crc kubenswrapper[4947]: I1203 07:59:30.086742 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:59:30 crc kubenswrapper[4947]: I1203 07:59:30.087199 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.086430 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.087649 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.087733 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.088941 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.089062 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" gracePeriod=600 Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.200321 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg"] Dec 03 08:00:00 crc kubenswrapper[4947]: E1203 08:00:00.200661 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerName="registry-server" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.200682 4947 
state_mem.go:107] "Deleted CPUSet assignment" podUID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerName="registry-server" Dec 03 08:00:00 crc kubenswrapper[4947]: E1203 08:00:00.200713 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerName="extract-utilities" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.200723 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerName="extract-utilities" Dec 03 08:00:00 crc kubenswrapper[4947]: E1203 08:00:00.200751 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerName="extract-content" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.200759 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerName="extract-content" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.200952 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="be1f365b-f0e8-413f-ac93-6be2fc6282a8" containerName="registry-server" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.201396 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.205872 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.206119 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.215468 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg"] Dec 03 08:00:00 crc kubenswrapper[4947]: E1203 08:00:00.231869 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.301799 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be921a3d-da61-477b-a169-851809fdbad4-config-volume\") pod \"collect-profiles-29412480-clkgg\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.301847 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be921a3d-da61-477b-a169-851809fdbad4-secret-volume\") pod \"collect-profiles-29412480-clkgg\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.301885 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56fw6\" (UniqueName: \"kubernetes.io/projected/be921a3d-da61-477b-a169-851809fdbad4-kube-api-access-56fw6\") pod \"collect-profiles-29412480-clkgg\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.402863 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be921a3d-da61-477b-a169-851809fdbad4-secret-volume\") pod \"collect-profiles-29412480-clkgg\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.402942 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56fw6\" (UniqueName: \"kubernetes.io/projected/be921a3d-da61-477b-a169-851809fdbad4-kube-api-access-56fw6\") pod \"collect-profiles-29412480-clkgg\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.403017 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be921a3d-da61-477b-a169-851809fdbad4-config-volume\") pod \"collect-profiles-29412480-clkgg\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.403859 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/be921a3d-da61-477b-a169-851809fdbad4-config-volume\") pod \"collect-profiles-29412480-clkgg\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.409402 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be921a3d-da61-477b-a169-851809fdbad4-secret-volume\") pod \"collect-profiles-29412480-clkgg\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.420070 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56fw6\" (UniqueName: \"kubernetes.io/projected/be921a3d-da61-477b-a169-851809fdbad4-kube-api-access-56fw6\") pod \"collect-profiles-29412480-clkgg\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.551348 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:00 crc kubenswrapper[4947]: I1203 08:00:00.985058 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg"] Dec 03 08:00:01 crc kubenswrapper[4947]: I1203 08:00:01.143714 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" event={"ID":"be921a3d-da61-477b-a169-851809fdbad4","Type":"ContainerStarted","Data":"5717a640a91499273771d60862a12f153f763accb357cf1901560527ac424920"} Dec 03 08:00:01 crc kubenswrapper[4947]: I1203 08:00:01.147451 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d"} Dec 03 08:00:01 crc kubenswrapper[4947]: I1203 08:00:01.147535 4947 scope.go:117] "RemoveContainer" containerID="dd8022caf794576a5d53eebed6928631e795070f5bf854a75a9328ce82ad5174" Dec 03 08:00:01 crc kubenswrapper[4947]: I1203 08:00:01.147468 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" exitCode=0 Dec 03 08:00:01 crc kubenswrapper[4947]: I1203 08:00:01.148852 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:00:01 crc kubenswrapper[4947]: E1203 08:00:01.149343 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:00:02 crc kubenswrapper[4947]: I1203 08:00:02.161455 4947 generic.go:334] "Generic (PLEG): container finished" podID="be921a3d-da61-477b-a169-851809fdbad4" containerID="0a0341d26c855aee8f795e55171f786ff29d990ba4f86028d6e4d327450248e7" exitCode=0 Dec 03 08:00:02 crc kubenswrapper[4947]: I1203 08:00:02.161670 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" event={"ID":"be921a3d-da61-477b-a169-851809fdbad4","Type":"ContainerDied","Data":"0a0341d26c855aee8f795e55171f786ff29d990ba4f86028d6e4d327450248e7"} Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.497476 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.547692 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56fw6\" (UniqueName: \"kubernetes.io/projected/be921a3d-da61-477b-a169-851809fdbad4-kube-api-access-56fw6\") pod \"be921a3d-da61-477b-a169-851809fdbad4\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.547893 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be921a3d-da61-477b-a169-851809fdbad4-config-volume\") pod \"be921a3d-da61-477b-a169-851809fdbad4\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") " Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.548155 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be921a3d-da61-477b-a169-851809fdbad4-secret-volume\") pod \"be921a3d-da61-477b-a169-851809fdbad4\" (UID: \"be921a3d-da61-477b-a169-851809fdbad4\") 
" Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.548455 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be921a3d-da61-477b-a169-851809fdbad4-config-volume" (OuterVolumeSpecName: "config-volume") pod "be921a3d-da61-477b-a169-851809fdbad4" (UID: "be921a3d-da61-477b-a169-851809fdbad4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.548936 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/be921a3d-da61-477b-a169-851809fdbad4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.562253 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be921a3d-da61-477b-a169-851809fdbad4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "be921a3d-da61-477b-a169-851809fdbad4" (UID: "be921a3d-da61-477b-a169-851809fdbad4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.563980 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be921a3d-da61-477b-a169-851809fdbad4-kube-api-access-56fw6" (OuterVolumeSpecName: "kube-api-access-56fw6") pod "be921a3d-da61-477b-a169-851809fdbad4" (UID: "be921a3d-da61-477b-a169-851809fdbad4"). InnerVolumeSpecName "kube-api-access-56fw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.650360 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/be921a3d-da61-477b-a169-851809fdbad4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:03 crc kubenswrapper[4947]: I1203 08:00:03.650410 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56fw6\" (UniqueName: \"kubernetes.io/projected/be921a3d-da61-477b-a169-851809fdbad4-kube-api-access-56fw6\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:04 crc kubenswrapper[4947]: I1203 08:00:04.186178 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" event={"ID":"be921a3d-da61-477b-a169-851809fdbad4","Type":"ContainerDied","Data":"5717a640a91499273771d60862a12f153f763accb357cf1901560527ac424920"} Dec 03 08:00:04 crc kubenswrapper[4947]: I1203 08:00:04.186237 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5717a640a91499273771d60862a12f153f763accb357cf1901560527ac424920" Dec 03 08:00:04 crc kubenswrapper[4947]: I1203 08:00:04.186253 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg" Dec 03 08:00:04 crc kubenswrapper[4947]: I1203 08:00:04.595648 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb"] Dec 03 08:00:04 crc kubenswrapper[4947]: I1203 08:00:04.603721 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-kgtgb"] Dec 03 08:00:05 crc kubenswrapper[4947]: I1203 08:00:05.101981 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d438d4-2007-4ef3-b529-e67d1c77af75" path="/var/lib/kubelet/pods/10d438d4-2007-4ef3-b529-e67d1c77af75/volumes" Dec 03 08:00:05 crc kubenswrapper[4947]: I1203 08:00:05.501342 4947 scope.go:117] "RemoveContainer" containerID="04e6239d3ae53b8c5a72c23338a1ac80fabbe821a7724fbac28190afdfdd2074" Dec 03 08:00:15 crc kubenswrapper[4947]: I1203 08:00:15.083721 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:00:15 crc kubenswrapper[4947]: E1203 08:00:15.084688 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:00:28 crc kubenswrapper[4947]: I1203 08:00:28.084937 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:00:28 crc kubenswrapper[4947]: E1203 08:00:28.087204 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:00:43 crc kubenswrapper[4947]: I1203 08:00:43.082829 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:00:43 crc kubenswrapper[4947]: E1203 08:00:43.083454 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:00:58 crc kubenswrapper[4947]: I1203 08:00:58.082747 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:00:58 crc kubenswrapper[4947]: E1203 08:00:58.086587 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.015811 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v9p5p"] Dec 03 08:01:05 crc kubenswrapper[4947]: E1203 08:01:05.016509 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be921a3d-da61-477b-a169-851809fdbad4" containerName="collect-profiles" Dec 03 
08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.016520 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="be921a3d-da61-477b-a169-851809fdbad4" containerName="collect-profiles" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.016684 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="be921a3d-da61-477b-a169-851809fdbad4" containerName="collect-profiles" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.017601 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.035618 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v9p5p"] Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.115477 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-catalog-content\") pod \"redhat-operators-v9p5p\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.115562 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7vsv\" (UniqueName: \"kubernetes.io/projected/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-kube-api-access-r7vsv\") pod \"redhat-operators-v9p5p\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.115607 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-utilities\") pod \"redhat-operators-v9p5p\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc 
kubenswrapper[4947]: I1203 08:01:05.217140 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-catalog-content\") pod \"redhat-operators-v9p5p\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.217215 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7vsv\" (UniqueName: \"kubernetes.io/projected/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-kube-api-access-r7vsv\") pod \"redhat-operators-v9p5p\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.217272 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-utilities\") pod \"redhat-operators-v9p5p\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.217836 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-catalog-content\") pod \"redhat-operators-v9p5p\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.217890 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-utilities\") pod \"redhat-operators-v9p5p\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.236168 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7vsv\" (UniqueName: \"kubernetes.io/projected/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-kube-api-access-r7vsv\") pod \"redhat-operators-v9p5p\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.337717 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:05 crc kubenswrapper[4947]: I1203 08:01:05.752476 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v9p5p"] Dec 03 08:01:06 crc kubenswrapper[4947]: I1203 08:01:06.727742 4947 generic.go:334] "Generic (PLEG): container finished" podID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerID="de9b83f0a2acf22aa7b8d007fa395192f95864a8b042af19cdd7bb9e8d40ed73" exitCode=0 Dec 03 08:01:06 crc kubenswrapper[4947]: I1203 08:01:06.728026 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9p5p" event={"ID":"aa9b9734-c7b3-4ed1-8a24-bf713da63d39","Type":"ContainerDied","Data":"de9b83f0a2acf22aa7b8d007fa395192f95864a8b042af19cdd7bb9e8d40ed73"} Dec 03 08:01:06 crc kubenswrapper[4947]: I1203 08:01:06.728052 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9p5p" event={"ID":"aa9b9734-c7b3-4ed1-8a24-bf713da63d39","Type":"ContainerStarted","Data":"03a9342f624e8b1715d7dd284490bdf00fe650a849eff850f1c9a19070841d7a"} Dec 03 08:01:06 crc kubenswrapper[4947]: I1203 08:01:06.731017 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:01:07 crc kubenswrapper[4947]: I1203 08:01:07.737181 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9p5p" 
event={"ID":"aa9b9734-c7b3-4ed1-8a24-bf713da63d39","Type":"ContainerStarted","Data":"07828e367162c64c9f8d33dd5a329f68689440aa337a744c3d09a2c6c520c567"} Dec 03 08:01:08 crc kubenswrapper[4947]: I1203 08:01:08.748003 4947 generic.go:334] "Generic (PLEG): container finished" podID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerID="07828e367162c64c9f8d33dd5a329f68689440aa337a744c3d09a2c6c520c567" exitCode=0 Dec 03 08:01:08 crc kubenswrapper[4947]: I1203 08:01:08.748070 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9p5p" event={"ID":"aa9b9734-c7b3-4ed1-8a24-bf713da63d39","Type":"ContainerDied","Data":"07828e367162c64c9f8d33dd5a329f68689440aa337a744c3d09a2c6c520c567"} Dec 03 08:01:09 crc kubenswrapper[4947]: I1203 08:01:09.760157 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9p5p" event={"ID":"aa9b9734-c7b3-4ed1-8a24-bf713da63d39","Type":"ContainerStarted","Data":"1f3386b69f4e102ac02e7d456c04e150164ec90f0385dc0e0243be0cf0d86af8"} Dec 03 08:01:09 crc kubenswrapper[4947]: I1203 08:01:09.795641 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v9p5p" podStartSLOduration=3.280847153 podStartE2EDuration="5.795609589s" podCreationTimestamp="2025-12-03 08:01:04 +0000 UTC" firstStartedPulling="2025-12-03 08:01:06.730749816 +0000 UTC m=+4327.991704252" lastFinishedPulling="2025-12-03 08:01:09.245512272 +0000 UTC m=+4330.506466688" observedRunningTime="2025-12-03 08:01:09.787282384 +0000 UTC m=+4331.048236810" watchObservedRunningTime="2025-12-03 08:01:09.795609589 +0000 UTC m=+4331.056564085" Dec 03 08:01:12 crc kubenswrapper[4947]: I1203 08:01:12.083334 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:01:12 crc kubenswrapper[4947]: E1203 08:01:12.084203 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:01:15 crc kubenswrapper[4947]: I1203 08:01:15.338132 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:15 crc kubenswrapper[4947]: I1203 08:01:15.338405 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:15 crc kubenswrapper[4947]: I1203 08:01:15.374054 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:15 crc kubenswrapper[4947]: I1203 08:01:15.863653 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:15 crc kubenswrapper[4947]: I1203 08:01:15.907914 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v9p5p"] Dec 03 08:01:17 crc kubenswrapper[4947]: I1203 08:01:17.819291 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v9p5p" podUID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerName="registry-server" containerID="cri-o://1f3386b69f4e102ac02e7d456c04e150164ec90f0385dc0e0243be0cf0d86af8" gracePeriod=2 Dec 03 08:01:19 crc kubenswrapper[4947]: I1203 08:01:19.836209 4947 generic.go:334] "Generic (PLEG): container finished" podID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerID="1f3386b69f4e102ac02e7d456c04e150164ec90f0385dc0e0243be0cf0d86af8" exitCode=0 Dec 03 08:01:19 crc kubenswrapper[4947]: I1203 08:01:19.836472 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9p5p" event={"ID":"aa9b9734-c7b3-4ed1-8a24-bf713da63d39","Type":"ContainerDied","Data":"1f3386b69f4e102ac02e7d456c04e150164ec90f0385dc0e0243be0cf0d86af8"} Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.076436 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.219799 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-catalog-content\") pod \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.220175 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-utilities\") pod \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.220216 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7vsv\" (UniqueName: \"kubernetes.io/projected/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-kube-api-access-r7vsv\") pod \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\" (UID: \"aa9b9734-c7b3-4ed1-8a24-bf713da63d39\") " Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.221425 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-utilities" (OuterVolumeSpecName: "utilities") pod "aa9b9734-c7b3-4ed1-8a24-bf713da63d39" (UID: "aa9b9734-c7b3-4ed1-8a24-bf713da63d39"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.232002 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-kube-api-access-r7vsv" (OuterVolumeSpecName: "kube-api-access-r7vsv") pod "aa9b9734-c7b3-4ed1-8a24-bf713da63d39" (UID: "aa9b9734-c7b3-4ed1-8a24-bf713da63d39"). InnerVolumeSpecName "kube-api-access-r7vsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.322174 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.322206 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7vsv\" (UniqueName: \"kubernetes.io/projected/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-kube-api-access-r7vsv\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.353591 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa9b9734-c7b3-4ed1-8a24-bf713da63d39" (UID: "aa9b9734-c7b3-4ed1-8a24-bf713da63d39"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.424050 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa9b9734-c7b3-4ed1-8a24-bf713da63d39-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.848474 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9p5p" event={"ID":"aa9b9734-c7b3-4ed1-8a24-bf713da63d39","Type":"ContainerDied","Data":"03a9342f624e8b1715d7dd284490bdf00fe650a849eff850f1c9a19070841d7a"} Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.848545 4947 scope.go:117] "RemoveContainer" containerID="1f3386b69f4e102ac02e7d456c04e150164ec90f0385dc0e0243be0cf0d86af8" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.848678 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v9p5p" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.881108 4947 scope.go:117] "RemoveContainer" containerID="07828e367162c64c9f8d33dd5a329f68689440aa337a744c3d09a2c6c520c567" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.907624 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v9p5p"] Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.916416 4947 scope.go:117] "RemoveContainer" containerID="de9b83f0a2acf22aa7b8d007fa395192f95864a8b042af19cdd7bb9e8d40ed73" Dec 03 08:01:20 crc kubenswrapper[4947]: I1203 08:01:20.917487 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v9p5p"] Dec 03 08:01:21 crc kubenswrapper[4947]: I1203 08:01:21.098625 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" path="/var/lib/kubelet/pods/aa9b9734-c7b3-4ed1-8a24-bf713da63d39/volumes" Dec 03 08:01:25 crc 
kubenswrapper[4947]: I1203 08:01:25.083375 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:01:25 crc kubenswrapper[4947]: E1203 08:01:25.084059 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:01:38 crc kubenswrapper[4947]: I1203 08:01:38.082544 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:01:38 crc kubenswrapper[4947]: E1203 08:01:38.083562 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:01:53 crc kubenswrapper[4947]: I1203 08:01:53.083055 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:01:53 crc kubenswrapper[4947]: E1203 08:01:53.084032 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 
03 08:02:04 crc kubenswrapper[4947]: I1203 08:02:04.084400 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:02:04 crc kubenswrapper[4947]: E1203 08:02:04.085893 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:02:17 crc kubenswrapper[4947]: I1203 08:02:17.084482 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:02:17 crc kubenswrapper[4947]: E1203 08:02:17.085353 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:02:28 crc kubenswrapper[4947]: I1203 08:02:28.082758 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:02:28 crc kubenswrapper[4947]: E1203 08:02:28.083828 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:02:42 crc kubenswrapper[4947]: I1203 08:02:42.083417 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:02:42 crc kubenswrapper[4947]: E1203 08:02:42.084275 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:02:57 crc kubenswrapper[4947]: I1203 08:02:57.083389 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:02:57 crc kubenswrapper[4947]: E1203 08:02:57.084110 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:03:09 crc kubenswrapper[4947]: I1203 08:03:09.103628 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:03:09 crc kubenswrapper[4947]: E1203 08:03:09.108592 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:03:24 crc kubenswrapper[4947]: I1203 08:03:24.091570 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:03:24 crc kubenswrapper[4947]: E1203 08:03:24.092800 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:03:39 crc kubenswrapper[4947]: I1203 08:03:39.091579 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:03:39 crc kubenswrapper[4947]: E1203 08:03:39.092819 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:03:51 crc kubenswrapper[4947]: I1203 08:03:51.083847 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:03:51 crc kubenswrapper[4947]: E1203 08:03:51.084738 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:04:02 crc kubenswrapper[4947]: I1203 08:04:02.083172 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:04:02 crc kubenswrapper[4947]: E1203 08:04:02.084272 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:04:13 crc kubenswrapper[4947]: I1203 08:04:13.083082 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:04:13 crc kubenswrapper[4947]: E1203 08:04:13.084079 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:04:28 crc kubenswrapper[4947]: I1203 08:04:28.082574 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:04:28 crc kubenswrapper[4947]: E1203 08:04:28.083387 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:04:41 crc kubenswrapper[4947]: I1203 08:04:41.083823 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:04:41 crc kubenswrapper[4947]: E1203 08:04:41.084816 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:04:52 crc kubenswrapper[4947]: I1203 08:04:52.083634 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:04:52 crc kubenswrapper[4947]: E1203 08:04:52.086649 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:05:04 crc kubenswrapper[4947]: I1203 08:05:04.083185 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:05:04 crc kubenswrapper[4947]: I1203 08:05:04.884143 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"b45f5c21fc3863183c9c64b93246eb0944e6f7d5c46a66e54f0b81e163422e83"} Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.142598 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-csznx"] Dec 03 08:07:23 crc kubenswrapper[4947]: E1203 08:07:23.143532 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerName="registry-server" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.143554 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerName="registry-server" Dec 03 08:07:23 crc kubenswrapper[4947]: E1203 08:07:23.143580 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerName="extract-content" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.143588 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerName="extract-content" Dec 03 08:07:23 crc kubenswrapper[4947]: E1203 08:07:23.143604 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerName="extract-utilities" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.143612 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerName="extract-utilities" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.143858 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa9b9734-c7b3-4ed1-8a24-bf713da63d39" containerName="registry-server" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.145328 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.151024 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-csznx"] Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.348858 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm9gk\" (UniqueName: \"kubernetes.io/projected/e814445b-2177-4305-9bfe-bbe2c32daf07-kube-api-access-rm9gk\") pod \"community-operators-csznx\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.349168 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-utilities\") pod \"community-operators-csznx\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.349188 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-catalog-content\") pod \"community-operators-csznx\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.450642 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm9gk\" (UniqueName: \"kubernetes.io/projected/e814445b-2177-4305-9bfe-bbe2c32daf07-kube-api-access-rm9gk\") pod \"community-operators-csznx\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.450689 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-utilities\") pod \"community-operators-csznx\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.450708 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-catalog-content\") pod \"community-operators-csznx\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.451177 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-catalog-content\") pod \"community-operators-csznx\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.451415 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-utilities\") pod \"community-operators-csznx\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.476169 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm9gk\" (UniqueName: \"kubernetes.io/projected/e814445b-2177-4305-9bfe-bbe2c32daf07-kube-api-access-rm9gk\") pod \"community-operators-csznx\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.502642 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:23 crc kubenswrapper[4947]: I1203 08:07:23.985051 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-csznx"] Dec 03 08:07:24 crc kubenswrapper[4947]: I1203 08:07:24.300610 4947 generic.go:334] "Generic (PLEG): container finished" podID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerID="01f35ccb4eae57c8c08948112aa5766f11eabbb2f0d5cae3a0ec6ae8c6af2101" exitCode=0 Dec 03 08:07:24 crc kubenswrapper[4947]: I1203 08:07:24.300668 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csznx" event={"ID":"e814445b-2177-4305-9bfe-bbe2c32daf07","Type":"ContainerDied","Data":"01f35ccb4eae57c8c08948112aa5766f11eabbb2f0d5cae3a0ec6ae8c6af2101"} Dec 03 08:07:24 crc kubenswrapper[4947]: I1203 08:07:24.300921 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csznx" event={"ID":"e814445b-2177-4305-9bfe-bbe2c32daf07","Type":"ContainerStarted","Data":"27bd83d73a464cb5174dc62e9c4e3efbb131b083c27dc3532d2ffe115d72b5c5"} Dec 03 08:07:24 crc kubenswrapper[4947]: I1203 08:07:24.302305 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:07:25 crc kubenswrapper[4947]: I1203 08:07:25.313770 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csznx" event={"ID":"e814445b-2177-4305-9bfe-bbe2c32daf07","Type":"ContainerStarted","Data":"c34ad973c51a0309c5ff0cbd9971a537c85add414146b16444923f43a139e062"} Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.131070 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w4mk5"] Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.132864 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.145563 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4mk5"] Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.217236 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4tf\" (UniqueName: \"kubernetes.io/projected/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-kube-api-access-sb4tf\") pod \"redhat-marketplace-w4mk5\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.217555 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-catalog-content\") pod \"redhat-marketplace-w4mk5\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.217587 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-utilities\") pod \"redhat-marketplace-w4mk5\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.319325 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb4tf\" (UniqueName: \"kubernetes.io/projected/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-kube-api-access-sb4tf\") pod \"redhat-marketplace-w4mk5\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.319415 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-catalog-content\") pod \"redhat-marketplace-w4mk5\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.319459 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-utilities\") pod \"redhat-marketplace-w4mk5\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.320159 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-utilities\") pod \"redhat-marketplace-w4mk5\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.320621 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-catalog-content\") pod \"redhat-marketplace-w4mk5\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.323289 4947 generic.go:334] "Generic (PLEG): container finished" podID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerID="c34ad973c51a0309c5ff0cbd9971a537c85add414146b16444923f43a139e062" exitCode=0 Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.323316 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csznx" 
event={"ID":"e814445b-2177-4305-9bfe-bbe2c32daf07","Type":"ContainerDied","Data":"c34ad973c51a0309c5ff0cbd9971a537c85add414146b16444923f43a139e062"} Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.323357 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csznx" event={"ID":"e814445b-2177-4305-9bfe-bbe2c32daf07","Type":"ContainerStarted","Data":"b37299f9f0fb0d4daf96d5210b202b6d6758cf926f1e79fde21bd42f74040608"} Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.345544 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-csznx" podStartSLOduration=1.890944959 podStartE2EDuration="3.345526566s" podCreationTimestamp="2025-12-03 08:07:23 +0000 UTC" firstStartedPulling="2025-12-03 08:07:24.302060984 +0000 UTC m=+4705.563015410" lastFinishedPulling="2025-12-03 08:07:25.756642571 +0000 UTC m=+4707.017597017" observedRunningTime="2025-12-03 08:07:26.340195943 +0000 UTC m=+4707.601150369" watchObservedRunningTime="2025-12-03 08:07:26.345526566 +0000 UTC m=+4707.606480992" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.353539 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4tf\" (UniqueName: \"kubernetes.io/projected/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-kube-api-access-sb4tf\") pod \"redhat-marketplace-w4mk5\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.447585 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:26 crc kubenswrapper[4947]: W1203 08:07:26.911768 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c9f51ab_a379_4dd9_8e7d_4ad2f930510c.slice/crio-cc33b49d3b32591d4bbb507f78abf918480920496548c1c244829a3bfea89d13 WatchSource:0}: Error finding container cc33b49d3b32591d4bbb507f78abf918480920496548c1c244829a3bfea89d13: Status 404 returned error can't find the container with id cc33b49d3b32591d4bbb507f78abf918480920496548c1c244829a3bfea89d13 Dec 03 08:07:26 crc kubenswrapper[4947]: I1203 08:07:26.922376 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4mk5"] Dec 03 08:07:27 crc kubenswrapper[4947]: I1203 08:07:27.335424 4947 generic.go:334] "Generic (PLEG): container finished" podID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerID="9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94" exitCode=0 Dec 03 08:07:27 crc kubenswrapper[4947]: I1203 08:07:27.335559 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4mk5" event={"ID":"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c","Type":"ContainerDied","Data":"9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94"} Dec 03 08:07:27 crc kubenswrapper[4947]: I1203 08:07:27.338081 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4mk5" event={"ID":"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c","Type":"ContainerStarted","Data":"cc33b49d3b32591d4bbb507f78abf918480920496548c1c244829a3bfea89d13"} Dec 03 08:07:29 crc kubenswrapper[4947]: I1203 08:07:29.376072 4947 generic.go:334] "Generic (PLEG): container finished" podID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerID="94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9" exitCode=0 Dec 03 08:07:29 crc kubenswrapper[4947]: I1203 
08:07:29.376161 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4mk5" event={"ID":"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c","Type":"ContainerDied","Data":"94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9"} Dec 03 08:07:30 crc kubenswrapper[4947]: I1203 08:07:30.086762 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:07:30 crc kubenswrapper[4947]: I1203 08:07:30.086823 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:07:30 crc kubenswrapper[4947]: I1203 08:07:30.385362 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4mk5" event={"ID":"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c","Type":"ContainerStarted","Data":"c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48"} Dec 03 08:07:30 crc kubenswrapper[4947]: I1203 08:07:30.404362 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w4mk5" podStartSLOduration=1.7335006339999999 podStartE2EDuration="4.404342141s" podCreationTimestamp="2025-12-03 08:07:26 +0000 UTC" firstStartedPulling="2025-12-03 08:07:27.337584121 +0000 UTC m=+4708.598538567" lastFinishedPulling="2025-12-03 08:07:30.008425628 +0000 UTC m=+4711.269380074" observedRunningTime="2025-12-03 08:07:30.402670287 +0000 UTC m=+4711.663624713" watchObservedRunningTime="2025-12-03 08:07:30.404342141 +0000 UTC m=+4711.665296557" Dec 03 08:07:33 
crc kubenswrapper[4947]: I1203 08:07:33.503455 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:33 crc kubenswrapper[4947]: I1203 08:07:33.503851 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:33 crc kubenswrapper[4947]: I1203 08:07:33.578872 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:34 crc kubenswrapper[4947]: I1203 08:07:34.475839 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:34 crc kubenswrapper[4947]: I1203 08:07:34.528278 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-csznx"] Dec 03 08:07:36 crc kubenswrapper[4947]: I1203 08:07:36.435019 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-csznx" podUID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerName="registry-server" containerID="cri-o://b37299f9f0fb0d4daf96d5210b202b6d6758cf926f1e79fde21bd42f74040608" gracePeriod=2 Dec 03 08:07:36 crc kubenswrapper[4947]: I1203 08:07:36.448290 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:36 crc kubenswrapper[4947]: I1203 08:07:36.448377 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:36 crc kubenswrapper[4947]: I1203 08:07:36.872284 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.446038 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerID="b37299f9f0fb0d4daf96d5210b202b6d6758cf926f1e79fde21bd42f74040608" exitCode=0 Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.446106 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csznx" event={"ID":"e814445b-2177-4305-9bfe-bbe2c32daf07","Type":"ContainerDied","Data":"b37299f9f0fb0d4daf96d5210b202b6d6758cf926f1e79fde21bd42f74040608"} Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.446449 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-csznx" event={"ID":"e814445b-2177-4305-9bfe-bbe2c32daf07","Type":"ContainerDied","Data":"27bd83d73a464cb5174dc62e9c4e3efbb131b083c27dc3532d2ffe115d72b5c5"} Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.446466 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27bd83d73a464cb5174dc62e9c4e3efbb131b083c27dc3532d2ffe115d72b5c5" Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.451159 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.500557 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.502075 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm9gk\" (UniqueName: \"kubernetes.io/projected/e814445b-2177-4305-9bfe-bbe2c32daf07-kube-api-access-rm9gk\") pod \"e814445b-2177-4305-9bfe-bbe2c32daf07\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.502126 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-catalog-content\") pod \"e814445b-2177-4305-9bfe-bbe2c32daf07\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.502163 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-utilities\") pod \"e814445b-2177-4305-9bfe-bbe2c32daf07\" (UID: \"e814445b-2177-4305-9bfe-bbe2c32daf07\") " Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.504499 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-utilities" (OuterVolumeSpecName: "utilities") pod "e814445b-2177-4305-9bfe-bbe2c32daf07" (UID: "e814445b-2177-4305-9bfe-bbe2c32daf07"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.510528 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e814445b-2177-4305-9bfe-bbe2c32daf07-kube-api-access-rm9gk" (OuterVolumeSpecName: "kube-api-access-rm9gk") pod "e814445b-2177-4305-9bfe-bbe2c32daf07" (UID: "e814445b-2177-4305-9bfe-bbe2c32daf07"). InnerVolumeSpecName "kube-api-access-rm9gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.562868 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e814445b-2177-4305-9bfe-bbe2c32daf07" (UID: "e814445b-2177-4305-9bfe-bbe2c32daf07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.603774 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.603849 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm9gk\" (UniqueName: \"kubernetes.io/projected/e814445b-2177-4305-9bfe-bbe2c32daf07-kube-api-access-rm9gk\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:37 crc kubenswrapper[4947]: I1203 08:07:37.603864 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e814445b-2177-4305-9bfe-bbe2c32daf07-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:38 crc kubenswrapper[4947]: I1203 08:07:38.318927 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4mk5"] Dec 03 08:07:38 crc kubenswrapper[4947]: I1203 
08:07:38.458371 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-csznx" Dec 03 08:07:38 crc kubenswrapper[4947]: I1203 08:07:38.501728 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-csznx"] Dec 03 08:07:38 crc kubenswrapper[4947]: I1203 08:07:38.506996 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-csznx"] Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.098238 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e814445b-2177-4305-9bfe-bbe2c32daf07" path="/var/lib/kubelet/pods/e814445b-2177-4305-9bfe-bbe2c32daf07/volumes" Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.464881 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w4mk5" podUID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerName="registry-server" containerID="cri-o://c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48" gracePeriod=2 Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.930613 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.940727 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb4tf\" (UniqueName: \"kubernetes.io/projected/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-kube-api-access-sb4tf\") pod \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.940820 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-utilities\") pod \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.940858 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-catalog-content\") pod \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\" (UID: \"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c\") " Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.941743 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-utilities" (OuterVolumeSpecName: "utilities") pod "9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" (UID: "9c9f51ab-a379-4dd9-8e7d-4ad2f930510c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.942145 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.952909 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-kube-api-access-sb4tf" (OuterVolumeSpecName: "kube-api-access-sb4tf") pod "9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" (UID: "9c9f51ab-a379-4dd9-8e7d-4ad2f930510c"). InnerVolumeSpecName "kube-api-access-sb4tf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:07:39 crc kubenswrapper[4947]: I1203 08:07:39.961882 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" (UID: "9c9f51ab-a379-4dd9-8e7d-4ad2f930510c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.043246 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.043303 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb4tf\" (UniqueName: \"kubernetes.io/projected/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c-kube-api-access-sb4tf\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.475417 4947 generic.go:334] "Generic (PLEG): container finished" podID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerID="c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48" exitCode=0 Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.475483 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4mk5" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.475540 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4mk5" event={"ID":"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c","Type":"ContainerDied","Data":"c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48"} Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.475605 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4mk5" event={"ID":"9c9f51ab-a379-4dd9-8e7d-4ad2f930510c","Type":"ContainerDied","Data":"cc33b49d3b32591d4bbb507f78abf918480920496548c1c244829a3bfea89d13"} Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.475637 4947 scope.go:117] "RemoveContainer" containerID="c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.522216 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-w4mk5"] Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.530226 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4mk5"] Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.531867 4947 scope.go:117] "RemoveContainer" containerID="94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.552682 4947 scope.go:117] "RemoveContainer" containerID="9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.588172 4947 scope.go:117] "RemoveContainer" containerID="c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48" Dec 03 08:07:40 crc kubenswrapper[4947]: E1203 08:07:40.588898 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48\": container with ID starting with c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48 not found: ID does not exist" containerID="c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.588970 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48"} err="failed to get container status \"c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48\": rpc error: code = NotFound desc = could not find container \"c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48\": container with ID starting with c1f9ba4226172eb90a86e0ccaeb54ec37f4b6c461b922ef9087b075348e9fa48 not found: ID does not exist" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.589010 4947 scope.go:117] "RemoveContainer" 
containerID="94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9" Dec 03 08:07:40 crc kubenswrapper[4947]: E1203 08:07:40.589541 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9\": container with ID starting with 94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9 not found: ID does not exist" containerID="94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.589583 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9"} err="failed to get container status \"94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9\": rpc error: code = NotFound desc = could not find container \"94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9\": container with ID starting with 94b1dfc9bb66b26ec48f05acca91442885cc45b66db1b1f2ce8cab23d48745c9 not found: ID does not exist" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.589613 4947 scope.go:117] "RemoveContainer" containerID="9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94" Dec 03 08:07:40 crc kubenswrapper[4947]: E1203 08:07:40.590056 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94\": container with ID starting with 9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94 not found: ID does not exist" containerID="9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94" Dec 03 08:07:40 crc kubenswrapper[4947]: I1203 08:07:40.590092 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94"} err="failed to get container status \"9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94\": rpc error: code = NotFound desc = could not find container \"9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94\": container with ID starting with 9a5d5d8a32da08c265d9e3904a4535bd85f6adf8f87072505f81a440afe76c94 not found: ID does not exist" Dec 03 08:07:41 crc kubenswrapper[4947]: I1203 08:07:41.092404 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" path="/var/lib/kubelet/pods/9c9f51ab-a379-4dd9-8e7d-4ad2f930510c/volumes" Dec 03 08:08:00 crc kubenswrapper[4947]: I1203 08:08:00.085989 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:08:00 crc kubenswrapper[4947]: I1203 08:08:00.086559 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:08:30 crc kubenswrapper[4947]: I1203 08:08:30.086576 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:08:30 crc kubenswrapper[4947]: I1203 08:08:30.087156 4947 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:08:30 crc kubenswrapper[4947]: I1203 08:08:30.087234 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:08:30 crc kubenswrapper[4947]: I1203 08:08:30.088340 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b45f5c21fc3863183c9c64b93246eb0944e6f7d5c46a66e54f0b81e163422e83"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:08:30 crc kubenswrapper[4947]: I1203 08:08:30.088447 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://b45f5c21fc3863183c9c64b93246eb0944e6f7d5c46a66e54f0b81e163422e83" gracePeriod=600 Dec 03 08:08:30 crc kubenswrapper[4947]: I1203 08:08:30.964806 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="b45f5c21fc3863183c9c64b93246eb0944e6f7d5c46a66e54f0b81e163422e83" exitCode=0 Dec 03 08:08:30 crc kubenswrapper[4947]: I1203 08:08:30.964893 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"b45f5c21fc3863183c9c64b93246eb0944e6f7d5c46a66e54f0b81e163422e83"} Dec 03 08:08:30 crc kubenswrapper[4947]: I1203 08:08:30.965670 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517"} Dec 03 08:08:30 crc kubenswrapper[4947]: I1203 08:08:30.965712 4947 scope.go:117] "RemoveContainer" containerID="c93371c9ab8acd5bf73e47fd8428acb9c5b1950e55b99caaf702ca98d000224d" Dec 03 08:10:30 crc kubenswrapper[4947]: I1203 08:10:30.087020 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:10:30 crc kubenswrapper[4947]: I1203 08:10:30.087906 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:11:00 crc kubenswrapper[4947]: I1203 08:11:00.086573 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:11:00 crc kubenswrapper[4947]: I1203 08:11:00.088091 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:11:30 crc kubenswrapper[4947]: I1203 08:11:30.086256 4947 
patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:11:30 crc kubenswrapper[4947]: I1203 08:11:30.086728 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:11:30 crc kubenswrapper[4947]: I1203 08:11:30.086768 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:11:30 crc kubenswrapper[4947]: I1203 08:11:30.087269 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:11:30 crc kubenswrapper[4947]: I1203 08:11:30.087326 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" gracePeriod=600 Dec 03 08:11:30 crc kubenswrapper[4947]: I1203 08:11:30.536407 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" exitCode=0 Dec 03 08:11:30 crc 
kubenswrapper[4947]: I1203 08:11:30.536471 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517"} Dec 03 08:11:30 crc kubenswrapper[4947]: I1203 08:11:30.536556 4947 scope.go:117] "RemoveContainer" containerID="b45f5c21fc3863183c9c64b93246eb0944e6f7d5c46a66e54f0b81e163422e83" Dec 03 08:11:30 crc kubenswrapper[4947]: E1203 08:11:30.853693 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:11:31 crc kubenswrapper[4947]: I1203 08:11:31.547363 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:11:31 crc kubenswrapper[4947]: E1203 08:11:31.547801 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.243148 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-72dcb"] Dec 03 08:11:33 crc kubenswrapper[4947]: E1203 08:11:33.243513 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" 
containerName="extract-content" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.243529 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerName="extract-content" Dec 03 08:11:33 crc kubenswrapper[4947]: E1203 08:11:33.243560 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerName="registry-server" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.243567 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerName="registry-server" Dec 03 08:11:33 crc kubenswrapper[4947]: E1203 08:11:33.243582 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerName="extract-content" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.243590 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerName="extract-content" Dec 03 08:11:33 crc kubenswrapper[4947]: E1203 08:11:33.243605 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerName="extract-utilities" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.243613 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerName="extract-utilities" Dec 03 08:11:33 crc kubenswrapper[4947]: E1203 08:11:33.243633 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerName="extract-utilities" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.243641 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerName="extract-utilities" Dec 03 08:11:33 crc kubenswrapper[4947]: E1203 08:11:33.243651 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e814445b-2177-4305-9bfe-bbe2c32daf07" 
containerName="registry-server" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.243660 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerName="registry-server" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.243828 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e814445b-2177-4305-9bfe-bbe2c32daf07" containerName="registry-server" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.243849 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c9f51ab-a379-4dd9-8e7d-4ad2f930510c" containerName="registry-server" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.245053 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.271598 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-72dcb"] Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.327752 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-catalog-content\") pod \"redhat-operators-72dcb\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.327871 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zjt9\" (UniqueName: \"kubernetes.io/projected/678185dc-6826-4920-afe7-5cce281f244b-kube-api-access-9zjt9\") pod \"redhat-operators-72dcb\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.327911 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-utilities\") pod \"redhat-operators-72dcb\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.429604 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zjt9\" (UniqueName: \"kubernetes.io/projected/678185dc-6826-4920-afe7-5cce281f244b-kube-api-access-9zjt9\") pod \"redhat-operators-72dcb\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.429669 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-utilities\") pod \"redhat-operators-72dcb\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.429716 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-catalog-content\") pod \"redhat-operators-72dcb\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.430383 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-catalog-content\") pod \"redhat-operators-72dcb\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.430408 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-utilities\") pod \"redhat-operators-72dcb\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.450087 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zjt9\" (UniqueName: \"kubernetes.io/projected/678185dc-6826-4920-afe7-5cce281f244b-kube-api-access-9zjt9\") pod \"redhat-operators-72dcb\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:33 crc kubenswrapper[4947]: I1203 08:11:33.578355 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.083879 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-72dcb"] Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.574442 4947 generic.go:334] "Generic (PLEG): container finished" podID="678185dc-6826-4920-afe7-5cce281f244b" containerID="e83f9cfac43ed1b455fcedf9eccd073fdb6e2ad5ad4596d9bbd809edd220d587" exitCode=0 Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.574625 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72dcb" event={"ID":"678185dc-6826-4920-afe7-5cce281f244b","Type":"ContainerDied","Data":"e83f9cfac43ed1b455fcedf9eccd073fdb6e2ad5ad4596d9bbd809edd220d587"} Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.574772 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72dcb" event={"ID":"678185dc-6826-4920-afe7-5cce281f244b","Type":"ContainerStarted","Data":"5b5575bdf173f114e26ca7bb30456e28e924bd35b4e524d8a0b5762cc970fd61"} Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.631707 4947 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-pssvx"] Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.632996 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.644024 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pssvx"] Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.653624 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-catalog-content\") pod \"certified-operators-pssvx\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.653811 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2594\" (UniqueName: \"kubernetes.io/projected/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-kube-api-access-g2594\") pod \"certified-operators-pssvx\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.653987 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-utilities\") pod \"certified-operators-pssvx\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.754867 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-catalog-content\") pod \"certified-operators-pssvx\" (UID: 
\"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.754909 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2594\" (UniqueName: \"kubernetes.io/projected/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-kube-api-access-g2594\") pod \"certified-operators-pssvx\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.754973 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-utilities\") pod \"certified-operators-pssvx\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.755376 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-utilities\") pod \"certified-operators-pssvx\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.755382 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-catalog-content\") pod \"certified-operators-pssvx\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.783806 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2594\" (UniqueName: \"kubernetes.io/projected/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-kube-api-access-g2594\") pod \"certified-operators-pssvx\" (UID: 
\"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:34 crc kubenswrapper[4947]: I1203 08:11:34.949308 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:35 crc kubenswrapper[4947]: I1203 08:11:35.244931 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pssvx"] Dec 03 08:11:35 crc kubenswrapper[4947]: W1203 08:11:35.249363 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d6bb93_fcf1_423a_a7d0_5dfaf25c6638.slice/crio-bab3015f014401107aaf4801b8b050b6add6e2dee0d4fdd08bfd30bc1405a5a3 WatchSource:0}: Error finding container bab3015f014401107aaf4801b8b050b6add6e2dee0d4fdd08bfd30bc1405a5a3: Status 404 returned error can't find the container with id bab3015f014401107aaf4801b8b050b6add6e2dee0d4fdd08bfd30bc1405a5a3 Dec 03 08:11:35 crc kubenswrapper[4947]: I1203 08:11:35.582128 4947 generic.go:334] "Generic (PLEG): container finished" podID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerID="dbee225a563df37640f89cf62633ade280c689d850dddbb97e1ff66a2ca8013d" exitCode=0 Dec 03 08:11:35 crc kubenswrapper[4947]: I1203 08:11:35.582176 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssvx" event={"ID":"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638","Type":"ContainerDied","Data":"dbee225a563df37640f89cf62633ade280c689d850dddbb97e1ff66a2ca8013d"} Dec 03 08:11:35 crc kubenswrapper[4947]: I1203 08:11:35.582203 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssvx" event={"ID":"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638","Type":"ContainerStarted","Data":"bab3015f014401107aaf4801b8b050b6add6e2dee0d4fdd08bfd30bc1405a5a3"} Dec 03 08:11:36 crc kubenswrapper[4947]: I1203 08:11:36.593234 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssvx" event={"ID":"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638","Type":"ContainerStarted","Data":"edec8eba4a2c03c3eecfa612924a0dfcbd4b8dadbf6c2d57b2aefc95c0edec94"} Dec 03 08:11:36 crc kubenswrapper[4947]: I1203 08:11:36.595417 4947 generic.go:334] "Generic (PLEG): container finished" podID="678185dc-6826-4920-afe7-5cce281f244b" containerID="f34983e3abc1e3fd8e761c784917eb425f75b1442269d24bd4e1820c92695091" exitCode=0 Dec 03 08:11:36 crc kubenswrapper[4947]: I1203 08:11:36.595448 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72dcb" event={"ID":"678185dc-6826-4920-afe7-5cce281f244b","Type":"ContainerDied","Data":"f34983e3abc1e3fd8e761c784917eb425f75b1442269d24bd4e1820c92695091"} Dec 03 08:11:37 crc kubenswrapper[4947]: I1203 08:11:37.605779 4947 generic.go:334] "Generic (PLEG): container finished" podID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerID="edec8eba4a2c03c3eecfa612924a0dfcbd4b8dadbf6c2d57b2aefc95c0edec94" exitCode=0 Dec 03 08:11:37 crc kubenswrapper[4947]: I1203 08:11:37.605825 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssvx" event={"ID":"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638","Type":"ContainerDied","Data":"edec8eba4a2c03c3eecfa612924a0dfcbd4b8dadbf6c2d57b2aefc95c0edec94"} Dec 03 08:11:37 crc kubenswrapper[4947]: I1203 08:11:37.608944 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72dcb" event={"ID":"678185dc-6826-4920-afe7-5cce281f244b","Type":"ContainerStarted","Data":"097413b0983176459ee7c5bfbb42a5530b6c8fc20df9afb705ab9ed80b4ed6ed"} Dec 03 08:11:37 crc kubenswrapper[4947]: I1203 08:11:37.653024 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-72dcb" podStartSLOduration=2.207803075 podStartE2EDuration="4.653006759s" podCreationTimestamp="2025-12-03 08:11:33 
+0000 UTC" firstStartedPulling="2025-12-03 08:11:34.576820974 +0000 UTC m=+4955.837775400" lastFinishedPulling="2025-12-03 08:11:37.022024668 +0000 UTC m=+4958.282979084" observedRunningTime="2025-12-03 08:11:37.6481852 +0000 UTC m=+4958.909139626" watchObservedRunningTime="2025-12-03 08:11:37.653006759 +0000 UTC m=+4958.913961195" Dec 03 08:11:38 crc kubenswrapper[4947]: I1203 08:11:38.617716 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssvx" event={"ID":"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638","Type":"ContainerStarted","Data":"6d9fd2628c61259b034019888e1004511cb6645aac77d05a20f7276ca19f2f6d"} Dec 03 08:11:38 crc kubenswrapper[4947]: I1203 08:11:38.641694 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pssvx" podStartSLOduration=2.069037877 podStartE2EDuration="4.641669292s" podCreationTimestamp="2025-12-03 08:11:34 +0000 UTC" firstStartedPulling="2025-12-03 08:11:35.609174457 +0000 UTC m=+4956.870128883" lastFinishedPulling="2025-12-03 08:11:38.181805862 +0000 UTC m=+4959.442760298" observedRunningTime="2025-12-03 08:11:38.637393877 +0000 UTC m=+4959.898348313" watchObservedRunningTime="2025-12-03 08:11:38.641669292 +0000 UTC m=+4959.902623748" Dec 03 08:11:42 crc kubenswrapper[4947]: I1203 08:11:42.083072 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:11:42 crc kubenswrapper[4947]: E1203 08:11:42.083451 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:11:43 crc 
kubenswrapper[4947]: I1203 08:11:43.579153 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:43 crc kubenswrapper[4947]: I1203 08:11:43.579246 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:43 crc kubenswrapper[4947]: I1203 08:11:43.716175 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:43 crc kubenswrapper[4947]: I1203 08:11:43.787297 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:44 crc kubenswrapper[4947]: I1203 08:11:44.951148 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:44 crc kubenswrapper[4947]: I1203 08:11:44.951472 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:45 crc kubenswrapper[4947]: I1203 08:11:45.182166 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:45 crc kubenswrapper[4947]: I1203 08:11:45.230762 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-72dcb"] Dec 03 08:11:45 crc kubenswrapper[4947]: I1203 08:11:45.666935 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-72dcb" podUID="678185dc-6826-4920-afe7-5cce281f244b" containerName="registry-server" containerID="cri-o://097413b0983176459ee7c5bfbb42a5530b6c8fc20df9afb705ab9ed80b4ed6ed" gracePeriod=2 Dec 03 08:11:45 crc kubenswrapper[4947]: I1203 08:11:45.709662 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:47 crc kubenswrapper[4947]: I1203 08:11:47.232662 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pssvx"] Dec 03 08:11:47 crc kubenswrapper[4947]: I1203 08:11:47.686703 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pssvx" podUID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerName="registry-server" containerID="cri-o://6d9fd2628c61259b034019888e1004511cb6645aac77d05a20f7276ca19f2f6d" gracePeriod=2 Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.702077 4947 generic.go:334] "Generic (PLEG): container finished" podID="678185dc-6826-4920-afe7-5cce281f244b" containerID="097413b0983176459ee7c5bfbb42a5530b6c8fc20df9afb705ab9ed80b4ed6ed" exitCode=0 Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.702279 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72dcb" event={"ID":"678185dc-6826-4920-afe7-5cce281f244b","Type":"ContainerDied","Data":"097413b0983176459ee7c5bfbb42a5530b6c8fc20df9afb705ab9ed80b4ed6ed"} Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.705192 4947 generic.go:334] "Generic (PLEG): container finished" podID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerID="6d9fd2628c61259b034019888e1004511cb6645aac77d05a20f7276ca19f2f6d" exitCode=0 Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.705242 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssvx" event={"ID":"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638","Type":"ContainerDied","Data":"6d9fd2628c61259b034019888e1004511cb6645aac77d05a20f7276ca19f2f6d"} Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.836237 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.960358 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-utilities\") pod \"678185dc-6826-4920-afe7-5cce281f244b\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.960431 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zjt9\" (UniqueName: \"kubernetes.io/projected/678185dc-6826-4920-afe7-5cce281f244b-kube-api-access-9zjt9\") pod \"678185dc-6826-4920-afe7-5cce281f244b\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.960523 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-catalog-content\") pod \"678185dc-6826-4920-afe7-5cce281f244b\" (UID: \"678185dc-6826-4920-afe7-5cce281f244b\") " Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.961347 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-utilities" (OuterVolumeSpecName: "utilities") pod "678185dc-6826-4920-afe7-5cce281f244b" (UID: "678185dc-6826-4920-afe7-5cce281f244b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:11:48 crc kubenswrapper[4947]: I1203 08:11:48.968543 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678185dc-6826-4920-afe7-5cce281f244b-kube-api-access-9zjt9" (OuterVolumeSpecName: "kube-api-access-9zjt9") pod "678185dc-6826-4920-afe7-5cce281f244b" (UID: "678185dc-6826-4920-afe7-5cce281f244b"). InnerVolumeSpecName "kube-api-access-9zjt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.019864 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.061770 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.061804 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zjt9\" (UniqueName: \"kubernetes.io/projected/678185dc-6826-4920-afe7-5cce281f244b-kube-api-access-9zjt9\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.078369 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "678185dc-6826-4920-afe7-5cce281f244b" (UID: "678185dc-6826-4920-afe7-5cce281f244b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.162831 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2594\" (UniqueName: \"kubernetes.io/projected/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-kube-api-access-g2594\") pod \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.163087 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-utilities\") pod \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.163232 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-catalog-content\") pod \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\" (UID: \"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638\") " Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.163580 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/678185dc-6826-4920-afe7-5cce281f244b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.163912 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-utilities" (OuterVolumeSpecName: "utilities") pod "73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" (UID: "73d6bb93-fcf1-423a-a7d0-5dfaf25c6638"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.167333 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-kube-api-access-g2594" (OuterVolumeSpecName: "kube-api-access-g2594") pod "73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" (UID: "73d6bb93-fcf1-423a-a7d0-5dfaf25c6638"). InnerVolumeSpecName "kube-api-access-g2594". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.222857 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" (UID: "73d6bb93-fcf1-423a-a7d0-5dfaf25c6638"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.264420 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.264507 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2594\" (UniqueName: \"kubernetes.io/projected/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-kube-api-access-g2594\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.264525 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.718394 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-72dcb" 
event={"ID":"678185dc-6826-4920-afe7-5cce281f244b","Type":"ContainerDied","Data":"5b5575bdf173f114e26ca7bb30456e28e924bd35b4e524d8a0b5762cc970fd61"} Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.718425 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-72dcb" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.718691 4947 scope.go:117] "RemoveContainer" containerID="097413b0983176459ee7c5bfbb42a5530b6c8fc20df9afb705ab9ed80b4ed6ed" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.726297 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pssvx" event={"ID":"73d6bb93-fcf1-423a-a7d0-5dfaf25c6638","Type":"ContainerDied","Data":"bab3015f014401107aaf4801b8b050b6add6e2dee0d4fdd08bfd30bc1405a5a3"} Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.726522 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pssvx" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.763136 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-72dcb"] Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.767683 4947 scope.go:117] "RemoveContainer" containerID="f34983e3abc1e3fd8e761c784917eb425f75b1442269d24bd4e1820c92695091" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.777471 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-72dcb"] Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.794073 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pssvx"] Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.799107 4947 scope.go:117] "RemoveContainer" containerID="e83f9cfac43ed1b455fcedf9eccd073fdb6e2ad5ad4596d9bbd809edd220d587" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.800828 4947 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pssvx"] Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.830864 4947 scope.go:117] "RemoveContainer" containerID="6d9fd2628c61259b034019888e1004511cb6645aac77d05a20f7276ca19f2f6d" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.860627 4947 scope.go:117] "RemoveContainer" containerID="edec8eba4a2c03c3eecfa612924a0dfcbd4b8dadbf6c2d57b2aefc95c0edec94" Dec 03 08:11:49 crc kubenswrapper[4947]: I1203 08:11:49.877190 4947 scope.go:117] "RemoveContainer" containerID="dbee225a563df37640f89cf62633ade280c689d850dddbb97e1ff66a2ca8013d" Dec 03 08:11:51 crc kubenswrapper[4947]: I1203 08:11:51.100080 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678185dc-6826-4920-afe7-5cce281f244b" path="/var/lib/kubelet/pods/678185dc-6826-4920-afe7-5cce281f244b/volumes" Dec 03 08:11:51 crc kubenswrapper[4947]: I1203 08:11:51.101332 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" path="/var/lib/kubelet/pods/73d6bb93-fcf1-423a-a7d0-5dfaf25c6638/volumes" Dec 03 08:11:54 crc kubenswrapper[4947]: I1203 08:11:54.083347 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:11:54 crc kubenswrapper[4947]: E1203 08:11:54.085692 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:12:07 crc kubenswrapper[4947]: I1203 08:12:07.083708 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:12:07 crc 
kubenswrapper[4947]: E1203 08:12:07.084475 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:12:20 crc kubenswrapper[4947]: I1203 08:12:20.083722 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:12:20 crc kubenswrapper[4947]: E1203 08:12:20.084726 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:12:34 crc kubenswrapper[4947]: I1203 08:12:34.083918 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:12:34 crc kubenswrapper[4947]: E1203 08:12:34.086095 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:12:35 crc kubenswrapper[4947]: I1203 08:12:35.919063 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-j45sc" 
podUID="780d43c8-0dea-47ad-95cc-26801572c76d" containerName="registry-server" probeResult="failure" output=< Dec 03 08:12:35 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 08:12:35 crc kubenswrapper[4947]: > Dec 03 08:12:48 crc kubenswrapper[4947]: I1203 08:12:48.083208 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:12:48 crc kubenswrapper[4947]: E1203 08:12:48.085063 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:13:00 crc kubenswrapper[4947]: I1203 08:13:00.083116 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:13:00 crc kubenswrapper[4947]: E1203 08:13:00.084022 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:13:13 crc kubenswrapper[4947]: I1203 08:13:13.083939 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:13:13 crc kubenswrapper[4947]: E1203 08:13:13.085239 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:13:28 crc kubenswrapper[4947]: I1203 08:13:28.084712 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:13:28 crc kubenswrapper[4947]: E1203 08:13:28.087191 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:13:41 crc kubenswrapper[4947]: I1203 08:13:41.083823 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:13:41 crc kubenswrapper[4947]: E1203 08:13:41.084930 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:13:52 crc kubenswrapper[4947]: I1203 08:13:52.083659 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:13:52 crc kubenswrapper[4947]: E1203 08:13:52.084566 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:14:05 crc kubenswrapper[4947]: I1203 08:14:05.874837 4947 scope.go:117] "RemoveContainer" containerID="c34ad973c51a0309c5ff0cbd9971a537c85add414146b16444923f43a139e062" Dec 03 08:14:05 crc kubenswrapper[4947]: I1203 08:14:05.927453 4947 scope.go:117] "RemoveContainer" containerID="b37299f9f0fb0d4daf96d5210b202b6d6758cf926f1e79fde21bd42f74040608" Dec 03 08:14:05 crc kubenswrapper[4947]: I1203 08:14:05.953010 4947 scope.go:117] "RemoveContainer" containerID="01f35ccb4eae57c8c08948112aa5766f11eabbb2f0d5cae3a0ec6ae8c6af2101" Dec 03 08:14:07 crc kubenswrapper[4947]: I1203 08:14:07.083926 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:14:07 crc kubenswrapper[4947]: E1203 08:14:07.084582 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:14:22 crc kubenswrapper[4947]: I1203 08:14:22.083516 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:14:22 crc kubenswrapper[4947]: E1203 08:14:22.084463 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:14:33 crc kubenswrapper[4947]: I1203 08:14:33.086876 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:14:33 crc kubenswrapper[4947]: E1203 08:14:33.087690 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:14:44 crc kubenswrapper[4947]: I1203 08:14:44.083006 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:14:44 crc kubenswrapper[4947]: E1203 08:14:44.083878 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:14:57 crc kubenswrapper[4947]: I1203 08:14:57.083627 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:14:57 crc kubenswrapper[4947]: E1203 08:14:57.084423 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.166789 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9"] Dec 03 08:15:00 crc kubenswrapper[4947]: E1203 08:15:00.168168 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerName="extract-utilities" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.168206 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerName="extract-utilities" Dec 03 08:15:00 crc kubenswrapper[4947]: E1203 08:15:00.168242 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678185dc-6826-4920-afe7-5cce281f244b" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.168262 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="678185dc-6826-4920-afe7-5cce281f244b" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4947]: E1203 08:15:00.168315 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678185dc-6826-4920-afe7-5cce281f244b" containerName="extract-content" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.168335 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="678185dc-6826-4920-afe7-5cce281f244b" containerName="extract-content" Dec 03 08:15:00 crc kubenswrapper[4947]: E1203 08:15:00.168357 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678185dc-6826-4920-afe7-5cce281f244b" containerName="extract-utilities" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.168375 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="678185dc-6826-4920-afe7-5cce281f244b" containerName="extract-utilities" Dec 03 08:15:00 crc kubenswrapper[4947]: E1203 08:15:00.168414 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerName="extract-content" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.168433 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerName="extract-content" Dec 03 08:15:00 crc kubenswrapper[4947]: E1203 08:15:00.168476 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.168529 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.168840 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="678185dc-6826-4920-afe7-5cce281f244b" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.168913 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d6bb93-fcf1-423a-a7d0-5dfaf25c6638" containerName="registry-server" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.169894 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.172573 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.172594 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.177264 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9"] Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.288105 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/914fca72-8b32-450d-bcff-d9d3c6e72cf1-secret-volume\") pod \"collect-profiles-29412495-qhws9\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.288228 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6pmm\" (UniqueName: \"kubernetes.io/projected/914fca72-8b32-450d-bcff-d9d3c6e72cf1-kube-api-access-n6pmm\") pod \"collect-profiles-29412495-qhws9\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.288278 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fca72-8b32-450d-bcff-d9d3c6e72cf1-config-volume\") pod \"collect-profiles-29412495-qhws9\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.389203 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/914fca72-8b32-450d-bcff-d9d3c6e72cf1-secret-volume\") pod \"collect-profiles-29412495-qhws9\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.389332 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6pmm\" (UniqueName: \"kubernetes.io/projected/914fca72-8b32-450d-bcff-d9d3c6e72cf1-kube-api-access-n6pmm\") pod \"collect-profiles-29412495-qhws9\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.389381 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fca72-8b32-450d-bcff-d9d3c6e72cf1-config-volume\") pod \"collect-profiles-29412495-qhws9\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.390403 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fca72-8b32-450d-bcff-d9d3c6e72cf1-config-volume\") pod \"collect-profiles-29412495-qhws9\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.396174 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/914fca72-8b32-450d-bcff-d9d3c6e72cf1-secret-volume\") pod \"collect-profiles-29412495-qhws9\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.414912 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6pmm\" (UniqueName: \"kubernetes.io/projected/914fca72-8b32-450d-bcff-d9d3c6e72cf1-kube-api-access-n6pmm\") pod \"collect-profiles-29412495-qhws9\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.495742 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:00 crc kubenswrapper[4947]: I1203 08:15:00.984564 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9"] Dec 03 08:15:01 crc kubenswrapper[4947]: I1203 08:15:01.497789 4947 generic.go:334] "Generic (PLEG): container finished" podID="914fca72-8b32-450d-bcff-d9d3c6e72cf1" containerID="3adb80a992017975c3495f8b0899f1a09c1de0d2be37de6255432ccdd9e9b3e5" exitCode=0 Dec 03 08:15:01 crc kubenswrapper[4947]: I1203 08:15:01.497861 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" event={"ID":"914fca72-8b32-450d-bcff-d9d3c6e72cf1","Type":"ContainerDied","Data":"3adb80a992017975c3495f8b0899f1a09c1de0d2be37de6255432ccdd9e9b3e5"} Dec 03 08:15:01 crc kubenswrapper[4947]: I1203 08:15:01.498104 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" 
event={"ID":"914fca72-8b32-450d-bcff-d9d3c6e72cf1","Type":"ContainerStarted","Data":"53714783ee8edb483249a65c05dfb12ef52752e068d5fad5cfe1481bd06cf8fb"} Dec 03 08:15:02 crc kubenswrapper[4947]: I1203 08:15:02.938073 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.070949 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/914fca72-8b32-450d-bcff-d9d3c6e72cf1-secret-volume\") pod \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.071097 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6pmm\" (UniqueName: \"kubernetes.io/projected/914fca72-8b32-450d-bcff-d9d3c6e72cf1-kube-api-access-n6pmm\") pod \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.071168 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fca72-8b32-450d-bcff-d9d3c6e72cf1-config-volume\") pod \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\" (UID: \"914fca72-8b32-450d-bcff-d9d3c6e72cf1\") " Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.072379 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914fca72-8b32-450d-bcff-d9d3c6e72cf1-config-volume" (OuterVolumeSpecName: "config-volume") pod "914fca72-8b32-450d-bcff-d9d3c6e72cf1" (UID: "914fca72-8b32-450d-bcff-d9d3c6e72cf1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.076724 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/914fca72-8b32-450d-bcff-d9d3c6e72cf1-kube-api-access-n6pmm" (OuterVolumeSpecName: "kube-api-access-n6pmm") pod "914fca72-8b32-450d-bcff-d9d3c6e72cf1" (UID: "914fca72-8b32-450d-bcff-d9d3c6e72cf1"). InnerVolumeSpecName "kube-api-access-n6pmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.085704 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914fca72-8b32-450d-bcff-d9d3c6e72cf1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "914fca72-8b32-450d-bcff-d9d3c6e72cf1" (UID: "914fca72-8b32-450d-bcff-d9d3c6e72cf1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.172736 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/914fca72-8b32-450d-bcff-d9d3c6e72cf1-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.172771 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6pmm\" (UniqueName: \"kubernetes.io/projected/914fca72-8b32-450d-bcff-d9d3c6e72cf1-kube-api-access-n6pmm\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.172785 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/914fca72-8b32-450d-bcff-d9d3c6e72cf1-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.517481 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" 
event={"ID":"914fca72-8b32-450d-bcff-d9d3c6e72cf1","Type":"ContainerDied","Data":"53714783ee8edb483249a65c05dfb12ef52752e068d5fad5cfe1481bd06cf8fb"} Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.517879 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53714783ee8edb483249a65c05dfb12ef52752e068d5fad5cfe1481bd06cf8fb" Dec 03 08:15:03 crc kubenswrapper[4947]: I1203 08:15:03.517608 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9" Dec 03 08:15:04 crc kubenswrapper[4947]: I1203 08:15:04.016542 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf"] Dec 03 08:15:04 crc kubenswrapper[4947]: I1203 08:15:04.024688 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-t6xgf"] Dec 03 08:15:05 crc kubenswrapper[4947]: I1203 08:15:05.098980 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647811b5-9ad1-4ead-a5a8-f9c3ee8790c2" path="/var/lib/kubelet/pods/647811b5-9ad1-4ead-a5a8-f9c3ee8790c2/volumes" Dec 03 08:15:06 crc kubenswrapper[4947]: I1203 08:15:06.000461 4947 scope.go:117] "RemoveContainer" containerID="1bd55e1d2ebef97b1663e41bd29d22afa379b89e4f1293601aaf0ac371c77016" Dec 03 08:15:09 crc kubenswrapper[4947]: I1203 08:15:09.087030 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:15:09 crc kubenswrapper[4947]: E1203 08:15:09.087788 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:15:21 crc kubenswrapper[4947]: I1203 08:15:21.082900 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:15:21 crc kubenswrapper[4947]: E1203 08:15:21.083690 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:15:35 crc kubenswrapper[4947]: I1203 08:15:35.084134 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:15:35 crc kubenswrapper[4947]: E1203 08:15:35.085752 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:15:49 crc kubenswrapper[4947]: I1203 08:15:49.095000 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:15:49 crc kubenswrapper[4947]: E1203 08:15:49.096098 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:16:02 crc kubenswrapper[4947]: I1203 08:16:02.083295 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:16:02 crc kubenswrapper[4947]: E1203 08:16:02.083943 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:16:14 crc kubenswrapper[4947]: I1203 08:16:14.083666 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:16:14 crc kubenswrapper[4947]: E1203 08:16:14.084342 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:16:27 crc kubenswrapper[4947]: I1203 08:16:27.083481 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:16:27 crc kubenswrapper[4947]: E1203 08:16:27.084350 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:16:38 crc kubenswrapper[4947]: I1203 08:16:38.083410 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:16:38 crc kubenswrapper[4947]: I1203 08:16:38.342931 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"fab3869a6a339861653758174ff6a69391a8ba3c2cecefc3d5b211cbc5751299"} Dec 03 08:17:36 crc kubenswrapper[4947]: I1203 08:17:36.788744 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lt4hf"] Dec 03 08:17:36 crc kubenswrapper[4947]: E1203 08:17:36.791740 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914fca72-8b32-450d-bcff-d9d3c6e72cf1" containerName="collect-profiles" Dec 03 08:17:36 crc kubenswrapper[4947]: I1203 08:17:36.791812 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="914fca72-8b32-450d-bcff-d9d3c6e72cf1" containerName="collect-profiles" Dec 03 08:17:36 crc kubenswrapper[4947]: I1203 08:17:36.792090 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="914fca72-8b32-450d-bcff-d9d3c6e72cf1" containerName="collect-profiles" Dec 03 08:17:36 crc kubenswrapper[4947]: I1203 08:17:36.799097 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:36 crc kubenswrapper[4947]: I1203 08:17:36.803378 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lt4hf"] Dec 03 08:17:36 crc kubenswrapper[4947]: I1203 08:17:36.899222 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdm8s\" (UniqueName: \"kubernetes.io/projected/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-kube-api-access-qdm8s\") pod \"community-operators-lt4hf\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:36 crc kubenswrapper[4947]: I1203 08:17:36.899288 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-catalog-content\") pod \"community-operators-lt4hf\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:36 crc kubenswrapper[4947]: I1203 08:17:36.899332 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-utilities\") pod \"community-operators-lt4hf\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:37 crc kubenswrapper[4947]: I1203 08:17:37.000885 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-utilities\") pod \"community-operators-lt4hf\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:37 crc kubenswrapper[4947]: I1203 08:17:37.001211 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qdm8s\" (UniqueName: \"kubernetes.io/projected/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-kube-api-access-qdm8s\") pod \"community-operators-lt4hf\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:37 crc kubenswrapper[4947]: I1203 08:17:37.001311 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-catalog-content\") pod \"community-operators-lt4hf\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:37 crc kubenswrapper[4947]: I1203 08:17:37.001486 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-utilities\") pod \"community-operators-lt4hf\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:37 crc kubenswrapper[4947]: I1203 08:17:37.001882 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-catalog-content\") pod \"community-operators-lt4hf\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:37 crc kubenswrapper[4947]: I1203 08:17:37.028184 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdm8s\" (UniqueName: \"kubernetes.io/projected/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-kube-api-access-qdm8s\") pod \"community-operators-lt4hf\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:37 crc kubenswrapper[4947]: I1203 08:17:37.120733 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:37 crc kubenswrapper[4947]: I1203 08:17:37.682157 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lt4hf"] Dec 03 08:17:37 crc kubenswrapper[4947]: I1203 08:17:37.899022 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt4hf" event={"ID":"1e344d7f-dd0a-40f2-b6de-a74bd9f68528","Type":"ContainerStarted","Data":"a1937c78a2e3d77470b1e9506e35690cfa04330ed361ca420bd4f5cd3e36cb49"} Dec 03 08:17:38 crc kubenswrapper[4947]: I1203 08:17:38.912518 4947 generic.go:334] "Generic (PLEG): container finished" podID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerID="78f9dec7173054c59cd7f52ccb98a02a061ecd16f83ab074419f016512d6d3d4" exitCode=0 Dec 03 08:17:38 crc kubenswrapper[4947]: I1203 08:17:38.912566 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt4hf" event={"ID":"1e344d7f-dd0a-40f2-b6de-a74bd9f68528","Type":"ContainerDied","Data":"78f9dec7173054c59cd7f52ccb98a02a061ecd16f83ab074419f016512d6d3d4"} Dec 03 08:17:38 crc kubenswrapper[4947]: I1203 08:17:38.914781 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:17:40 crc kubenswrapper[4947]: I1203 08:17:40.933678 4947 generic.go:334] "Generic (PLEG): container finished" podID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerID="d6f48009d2d08389da899e20cfa62c00373ed91ca3194565b0491f97d911689a" exitCode=0 Dec 03 08:17:40 crc kubenswrapper[4947]: I1203 08:17:40.933797 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt4hf" event={"ID":"1e344d7f-dd0a-40f2-b6de-a74bd9f68528","Type":"ContainerDied","Data":"d6f48009d2d08389da899e20cfa62c00373ed91ca3194565b0491f97d911689a"} Dec 03 08:17:41 crc kubenswrapper[4947]: I1203 08:17:41.946566 4947 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-lt4hf" event={"ID":"1e344d7f-dd0a-40f2-b6de-a74bd9f68528","Type":"ContainerStarted","Data":"6c4ba788e7fae0fa16efdaf08cde86ff1e84658a423d448b423af4d0f5e1d631"} Dec 03 08:17:41 crc kubenswrapper[4947]: I1203 08:17:41.971106 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lt4hf" podStartSLOduration=3.505784876 podStartE2EDuration="5.971083493s" podCreationTimestamp="2025-12-03 08:17:36 +0000 UTC" firstStartedPulling="2025-12-03 08:17:38.914264889 +0000 UTC m=+5320.175219345" lastFinishedPulling="2025-12-03 08:17:41.379563526 +0000 UTC m=+5322.640517962" observedRunningTime="2025-12-03 08:17:41.968885943 +0000 UTC m=+5323.229840479" watchObservedRunningTime="2025-12-03 08:17:41.971083493 +0000 UTC m=+5323.232037939" Dec 03 08:17:47 crc kubenswrapper[4947]: I1203 08:17:47.121313 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:47 crc kubenswrapper[4947]: I1203 08:17:47.121853 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:47 crc kubenswrapper[4947]: I1203 08:17:47.186579 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:48 crc kubenswrapper[4947]: I1203 08:17:48.070150 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:48 crc kubenswrapper[4947]: I1203 08:17:48.133998 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lt4hf"] Dec 03 08:17:50 crc kubenswrapper[4947]: I1203 08:17:50.017955 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lt4hf" 
podUID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerName="registry-server" containerID="cri-o://6c4ba788e7fae0fa16efdaf08cde86ff1e84658a423d448b423af4d0f5e1d631" gracePeriod=2 Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.030645 4947 generic.go:334] "Generic (PLEG): container finished" podID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerID="6c4ba788e7fae0fa16efdaf08cde86ff1e84658a423d448b423af4d0f5e1d631" exitCode=0 Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.030705 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt4hf" event={"ID":"1e344d7f-dd0a-40f2-b6de-a74bd9f68528","Type":"ContainerDied","Data":"6c4ba788e7fae0fa16efdaf08cde86ff1e84658a423d448b423af4d0f5e1d631"} Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.132311 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.237332 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-catalog-content\") pod \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.237380 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdm8s\" (UniqueName: \"kubernetes.io/projected/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-kube-api-access-qdm8s\") pod \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\" (UID: \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.237408 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-utilities\") pod \"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\" (UID: 
\"1e344d7f-dd0a-40f2-b6de-a74bd9f68528\") " Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.238754 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-utilities" (OuterVolumeSpecName: "utilities") pod "1e344d7f-dd0a-40f2-b6de-a74bd9f68528" (UID: "1e344d7f-dd0a-40f2-b6de-a74bd9f68528"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.245661 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-kube-api-access-qdm8s" (OuterVolumeSpecName: "kube-api-access-qdm8s") pod "1e344d7f-dd0a-40f2-b6de-a74bd9f68528" (UID: "1e344d7f-dd0a-40f2-b6de-a74bd9f68528"). InnerVolumeSpecName "kube-api-access-qdm8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.287584 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e344d7f-dd0a-40f2-b6de-a74bd9f68528" (UID: "1e344d7f-dd0a-40f2-b6de-a74bd9f68528"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.339062 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.339103 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdm8s\" (UniqueName: \"kubernetes.io/projected/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-kube-api-access-qdm8s\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:51 crc kubenswrapper[4947]: I1203 08:17:51.339119 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e344d7f-dd0a-40f2-b6de-a74bd9f68528-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:52 crc kubenswrapper[4947]: I1203 08:17:52.040922 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt4hf" event={"ID":"1e344d7f-dd0a-40f2-b6de-a74bd9f68528","Type":"ContainerDied","Data":"a1937c78a2e3d77470b1e9506e35690cfa04330ed361ca420bd4f5cd3e36cb49"} Dec 03 08:17:52 crc kubenswrapper[4947]: I1203 08:17:52.040990 4947 scope.go:117] "RemoveContainer" containerID="6c4ba788e7fae0fa16efdaf08cde86ff1e84658a423d448b423af4d0f5e1d631" Dec 03 08:17:52 crc kubenswrapper[4947]: I1203 08:17:52.041044 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lt4hf" Dec 03 08:17:52 crc kubenswrapper[4947]: I1203 08:17:52.061081 4947 scope.go:117] "RemoveContainer" containerID="d6f48009d2d08389da899e20cfa62c00373ed91ca3194565b0491f97d911689a" Dec 03 08:17:52 crc kubenswrapper[4947]: I1203 08:17:52.077758 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lt4hf"] Dec 03 08:17:52 crc kubenswrapper[4947]: I1203 08:17:52.087684 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lt4hf"] Dec 03 08:17:52 crc kubenswrapper[4947]: I1203 08:17:52.111606 4947 scope.go:117] "RemoveContainer" containerID="78f9dec7173054c59cd7f52ccb98a02a061ecd16f83ab074419f016512d6d3d4" Dec 03 08:17:53 crc kubenswrapper[4947]: I1203 08:17:53.100268 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" path="/var/lib/kubelet/pods/1e344d7f-dd0a-40f2-b6de-a74bd9f68528/volumes" Dec 03 08:19:00 crc kubenswrapper[4947]: I1203 08:19:00.086986 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:19:00 crc kubenswrapper[4947]: I1203 08:19:00.087810 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:19:30 crc kubenswrapper[4947]: I1203 08:19:30.086111 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:19:30 crc kubenswrapper[4947]: I1203 08:19:30.086915 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.180710 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z5g7f"] Dec 03 08:19:33 crc kubenswrapper[4947]: E1203 08:19:33.181341 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerName="extract-content" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.181359 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerName="extract-content" Dec 03 08:19:33 crc kubenswrapper[4947]: E1203 08:19:33.181398 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerName="registry-server" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.181409 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerName="registry-server" Dec 03 08:19:33 crc kubenswrapper[4947]: E1203 08:19:33.181438 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerName="extract-utilities" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.181447 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerName="extract-utilities" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.181684 4947 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1e344d7f-dd0a-40f2-b6de-a74bd9f68528" containerName="registry-server" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.182955 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.198512 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5g7f"] Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.334842 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-catalog-content\") pod \"redhat-marketplace-z5g7f\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.334905 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsh2k\" (UniqueName: \"kubernetes.io/projected/9c14ed39-5287-4964-9f6e-1261578c8011-kube-api-access-wsh2k\") pod \"redhat-marketplace-z5g7f\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.334949 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-utilities\") pod \"redhat-marketplace-z5g7f\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.436066 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-catalog-content\") pod 
\"redhat-marketplace-z5g7f\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.436124 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsh2k\" (UniqueName: \"kubernetes.io/projected/9c14ed39-5287-4964-9f6e-1261578c8011-kube-api-access-wsh2k\") pod \"redhat-marketplace-z5g7f\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.436167 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-utilities\") pod \"redhat-marketplace-z5g7f\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.436768 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-utilities\") pod \"redhat-marketplace-z5g7f\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.436837 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-catalog-content\") pod \"redhat-marketplace-z5g7f\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.456054 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsh2k\" (UniqueName: \"kubernetes.io/projected/9c14ed39-5287-4964-9f6e-1261578c8011-kube-api-access-wsh2k\") pod \"redhat-marketplace-z5g7f\" (UID: 
\"9c14ed39-5287-4964-9f6e-1261578c8011\") " pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.629268 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.935573 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5g7f"] Dec 03 08:19:33 crc kubenswrapper[4947]: W1203 08:19:33.939511 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c14ed39_5287_4964_9f6e_1261578c8011.slice/crio-c85cd4e6f68e24208206aef8c93e8e37797ac8c5fd7e7de0e11b7127cfe72c2c WatchSource:0}: Error finding container c85cd4e6f68e24208206aef8c93e8e37797ac8c5fd7e7de0e11b7127cfe72c2c: Status 404 returned error can't find the container with id c85cd4e6f68e24208206aef8c93e8e37797ac8c5fd7e7de0e11b7127cfe72c2c Dec 03 08:19:33 crc kubenswrapper[4947]: I1203 08:19:33.948060 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5g7f" event={"ID":"9c14ed39-5287-4964-9f6e-1261578c8011","Type":"ContainerStarted","Data":"c85cd4e6f68e24208206aef8c93e8e37797ac8c5fd7e7de0e11b7127cfe72c2c"} Dec 03 08:19:34 crc kubenswrapper[4947]: I1203 08:19:34.959807 4947 generic.go:334] "Generic (PLEG): container finished" podID="9c14ed39-5287-4964-9f6e-1261578c8011" containerID="1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99" exitCode=0 Dec 03 08:19:34 crc kubenswrapper[4947]: I1203 08:19:34.959923 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5g7f" event={"ID":"9c14ed39-5287-4964-9f6e-1261578c8011","Type":"ContainerDied","Data":"1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99"} Dec 03 08:19:35 crc kubenswrapper[4947]: I1203 08:19:35.970128 4947 generic.go:334] "Generic (PLEG): 
container finished" podID="9c14ed39-5287-4964-9f6e-1261578c8011" containerID="df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9" exitCode=0 Dec 03 08:19:35 crc kubenswrapper[4947]: I1203 08:19:35.970196 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5g7f" event={"ID":"9c14ed39-5287-4964-9f6e-1261578c8011","Type":"ContainerDied","Data":"df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9"} Dec 03 08:19:36 crc kubenswrapper[4947]: I1203 08:19:36.980334 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5g7f" event={"ID":"9c14ed39-5287-4964-9f6e-1261578c8011","Type":"ContainerStarted","Data":"fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec"} Dec 03 08:19:37 crc kubenswrapper[4947]: I1203 08:19:37.018143 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z5g7f" podStartSLOduration=2.5143817200000003 podStartE2EDuration="4.018103093s" podCreationTimestamp="2025-12-03 08:19:33 +0000 UTC" firstStartedPulling="2025-12-03 08:19:34.962893306 +0000 UTC m=+5436.223847772" lastFinishedPulling="2025-12-03 08:19:36.466614689 +0000 UTC m=+5437.727569145" observedRunningTime="2025-12-03 08:19:37.005450261 +0000 UTC m=+5438.266404677" watchObservedRunningTime="2025-12-03 08:19:37.018103093 +0000 UTC m=+5438.279057549" Dec 03 08:19:43 crc kubenswrapper[4947]: I1203 08:19:43.630184 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:43 crc kubenswrapper[4947]: I1203 08:19:43.630754 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:43 crc kubenswrapper[4947]: I1203 08:19:43.691567 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:44 crc kubenswrapper[4947]: I1203 08:19:44.094596 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:44 crc kubenswrapper[4947]: I1203 08:19:44.137342 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5g7f"] Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.057344 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z5g7f" podUID="9c14ed39-5287-4964-9f6e-1261578c8011" containerName="registry-server" containerID="cri-o://fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec" gracePeriod=2 Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.446374 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.588303 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsh2k\" (UniqueName: \"kubernetes.io/projected/9c14ed39-5287-4964-9f6e-1261578c8011-kube-api-access-wsh2k\") pod \"9c14ed39-5287-4964-9f6e-1261578c8011\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.588484 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-utilities\") pod \"9c14ed39-5287-4964-9f6e-1261578c8011\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.588584 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-catalog-content\") pod 
\"9c14ed39-5287-4964-9f6e-1261578c8011\" (UID: \"9c14ed39-5287-4964-9f6e-1261578c8011\") " Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.590371 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-utilities" (OuterVolumeSpecName: "utilities") pod "9c14ed39-5287-4964-9f6e-1261578c8011" (UID: "9c14ed39-5287-4964-9f6e-1261578c8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.596538 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c14ed39-5287-4964-9f6e-1261578c8011-kube-api-access-wsh2k" (OuterVolumeSpecName: "kube-api-access-wsh2k") pod "9c14ed39-5287-4964-9f6e-1261578c8011" (UID: "9c14ed39-5287-4964-9f6e-1261578c8011"). InnerVolumeSpecName "kube-api-access-wsh2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.615116 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c14ed39-5287-4964-9f6e-1261578c8011" (UID: "9c14ed39-5287-4964-9f6e-1261578c8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.691392 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.691460 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c14ed39-5287-4964-9f6e-1261578c8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:19:46 crc kubenswrapper[4947]: I1203 08:19:46.691482 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsh2k\" (UniqueName: \"kubernetes.io/projected/9c14ed39-5287-4964-9f6e-1261578c8011-kube-api-access-wsh2k\") on node \"crc\" DevicePath \"\"" Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.073625 4947 generic.go:334] "Generic (PLEG): container finished" podID="9c14ed39-5287-4964-9f6e-1261578c8011" containerID="fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec" exitCode=0 Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.073673 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5g7f" event={"ID":"9c14ed39-5287-4964-9f6e-1261578c8011","Type":"ContainerDied","Data":"fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec"} Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.074681 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z5g7f" event={"ID":"9c14ed39-5287-4964-9f6e-1261578c8011","Type":"ContainerDied","Data":"c85cd4e6f68e24208206aef8c93e8e37797ac8c5fd7e7de0e11b7127cfe72c2c"} Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.074726 4947 scope.go:117] "RemoveContainer" containerID="fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec" Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 
08:19:47.073684 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z5g7f" Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.110200 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5g7f"] Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.111245 4947 scope.go:117] "RemoveContainer" containerID="df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9" Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.116054 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z5g7f"] Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.137852 4947 scope.go:117] "RemoveContainer" containerID="1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99" Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.156232 4947 scope.go:117] "RemoveContainer" containerID="fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec" Dec 03 08:19:47 crc kubenswrapper[4947]: E1203 08:19:47.156714 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec\": container with ID starting with fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec not found: ID does not exist" containerID="fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec" Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.156766 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec"} err="failed to get container status \"fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec\": rpc error: code = NotFound desc = could not find container \"fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec\": container with ID starting with 
fd7acb98b15a2b19c1f4e1997dd9ecb997f7486752105c5dbb94a7c97ed5b9ec not found: ID does not exist" Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.156799 4947 scope.go:117] "RemoveContainer" containerID="df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9" Dec 03 08:19:47 crc kubenswrapper[4947]: E1203 08:19:47.157243 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9\": container with ID starting with df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9 not found: ID does not exist" containerID="df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9" Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.157287 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9"} err="failed to get container status \"df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9\": rpc error: code = NotFound desc = could not find container \"df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9\": container with ID starting with df483199c6bed2931602cb4319fc19aa4deb4f125d3152dddc2a8a8739ef38f9 not found: ID does not exist" Dec 03 08:19:47 crc kubenswrapper[4947]: I1203 08:19:47.157317 4947 scope.go:117] "RemoveContainer" containerID="1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99" Dec 03 08:19:47 crc kubenswrapper[4947]: E1203 08:19:47.157655 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99\": container with ID starting with 1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99 not found: ID does not exist" containerID="1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99" Dec 03 08:19:47 crc 
kubenswrapper[4947]: I1203 08:19:47.157682 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99"} err="failed to get container status \"1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99\": rpc error: code = NotFound desc = could not find container \"1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99\": container with ID starting with 1a80527dcbe6d40f3d62b2ea910b3156a7f6e0b996a14a4a2827b01b32924f99 not found: ID does not exist" Dec 03 08:19:49 crc kubenswrapper[4947]: I1203 08:19:49.121119 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c14ed39-5287-4964-9f6e-1261578c8011" path="/var/lib/kubelet/pods/9c14ed39-5287-4964-9f6e-1261578c8011/volumes" Dec 03 08:20:00 crc kubenswrapper[4947]: I1203 08:20:00.086853 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:20:00 crc kubenswrapper[4947]: I1203 08:20:00.087605 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:20:00 crc kubenswrapper[4947]: I1203 08:20:00.087678 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:20:00 crc kubenswrapper[4947]: I1203 08:20:00.088434 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"fab3869a6a339861653758174ff6a69391a8ba3c2cecefc3d5b211cbc5751299"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:20:00 crc kubenswrapper[4947]: I1203 08:20:00.088583 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://fab3869a6a339861653758174ff6a69391a8ba3c2cecefc3d5b211cbc5751299" gracePeriod=600 Dec 03 08:20:01 crc kubenswrapper[4947]: I1203 08:20:01.231327 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="fab3869a6a339861653758174ff6a69391a8ba3c2cecefc3d5b211cbc5751299" exitCode=0 Dec 03 08:20:01 crc kubenswrapper[4947]: I1203 08:20:01.231434 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"fab3869a6a339861653758174ff6a69391a8ba3c2cecefc3d5b211cbc5751299"} Dec 03 08:20:01 crc kubenswrapper[4947]: I1203 08:20:01.231833 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025"} Dec 03 08:20:01 crc kubenswrapper[4947]: I1203 08:20:01.231879 4947 scope.go:117] "RemoveContainer" containerID="3b644970ee86a1014175f6cf8e7b4d01e681f45fbaf7ad1590ecb9f25e9ea517" Dec 03 08:22:00 crc kubenswrapper[4947]: I1203 08:22:00.086322 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:22:00 crc kubenswrapper[4947]: I1203 08:22:00.087537 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:22:30 crc kubenswrapper[4947]: I1203 08:22:30.086825 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:22:30 crc kubenswrapper[4947]: I1203 08:22:30.087372 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:23:00 crc kubenswrapper[4947]: I1203 08:23:00.086192 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:23:00 crc kubenswrapper[4947]: I1203 08:23:00.086719 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Dec 03 08:23:00 crc kubenswrapper[4947]: I1203 08:23:00.086761 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:23:00 crc kubenswrapper[4947]: I1203 08:23:00.087145 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:23:00 crc kubenswrapper[4947]: I1203 08:23:00.087195 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" gracePeriod=600 Dec 03 08:23:00 crc kubenswrapper[4947]: E1203 08:23:00.214286 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:23:00 crc kubenswrapper[4947]: I1203 08:23:00.222795 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" exitCode=0 Dec 03 08:23:00 crc kubenswrapper[4947]: I1203 08:23:00.222889 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025"} Dec 03 08:23:00 crc kubenswrapper[4947]: I1203 08:23:00.223402 4947 scope.go:117] "RemoveContainer" containerID="fab3869a6a339861653758174ff6a69391a8ba3c2cecefc3d5b211cbc5751299" Dec 03 08:23:00 crc kubenswrapper[4947]: I1203 08:23:00.224063 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:23:00 crc kubenswrapper[4947]: E1203 08:23:00.224443 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:23:13 crc kubenswrapper[4947]: I1203 08:23:13.084018 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:23:13 crc kubenswrapper[4947]: E1203 08:23:13.084812 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:23:24 crc kubenswrapper[4947]: I1203 08:23:24.082591 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:23:24 crc kubenswrapper[4947]: E1203 08:23:24.083280 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:23:35 crc kubenswrapper[4947]: I1203 08:23:35.088485 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:23:35 crc kubenswrapper[4947]: E1203 08:23:35.089687 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:23:46 crc kubenswrapper[4947]: I1203 08:23:46.083735 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:23:46 crc kubenswrapper[4947]: E1203 08:23:46.084619 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:24:01 crc kubenswrapper[4947]: I1203 08:24:01.083218 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:24:01 crc kubenswrapper[4947]: E1203 08:24:01.084232 4947 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:24:12 crc kubenswrapper[4947]: I1203 08:24:12.082915 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:24:12 crc kubenswrapper[4947]: E1203 08:24:12.083624 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:24:18 crc kubenswrapper[4947]: I1203 08:24:18.948152 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xvnk5"] Dec 03 08:24:18 crc kubenswrapper[4947]: E1203 08:24:18.949142 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c14ed39-5287-4964-9f6e-1261578c8011" containerName="extract-content" Dec 03 08:24:18 crc kubenswrapper[4947]: I1203 08:24:18.949162 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c14ed39-5287-4964-9f6e-1261578c8011" containerName="extract-content" Dec 03 08:24:18 crc kubenswrapper[4947]: E1203 08:24:18.949191 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c14ed39-5287-4964-9f6e-1261578c8011" containerName="registry-server" Dec 03 08:24:18 crc kubenswrapper[4947]: I1203 08:24:18.949205 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c14ed39-5287-4964-9f6e-1261578c8011" 
containerName="registry-server" Dec 03 08:24:18 crc kubenswrapper[4947]: E1203 08:24:18.949226 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c14ed39-5287-4964-9f6e-1261578c8011" containerName="extract-utilities" Dec 03 08:24:18 crc kubenswrapper[4947]: I1203 08:24:18.949241 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c14ed39-5287-4964-9f6e-1261578c8011" containerName="extract-utilities" Dec 03 08:24:18 crc kubenswrapper[4947]: I1203 08:24:18.949806 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c14ed39-5287-4964-9f6e-1261578c8011" containerName="registry-server" Dec 03 08:24:18 crc kubenswrapper[4947]: I1203 08:24:18.952206 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:18 crc kubenswrapper[4947]: I1203 08:24:18.969641 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvnk5"] Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.128131 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-catalog-content\") pod \"certified-operators-xvnk5\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.128303 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-utilities\") pod \"certified-operators-xvnk5\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.128560 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tzptx\" (UniqueName: \"kubernetes.io/projected/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-kube-api-access-tzptx\") pod \"certified-operators-xvnk5\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.229743 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzptx\" (UniqueName: \"kubernetes.io/projected/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-kube-api-access-tzptx\") pod \"certified-operators-xvnk5\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.229930 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-catalog-content\") pod \"certified-operators-xvnk5\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.230078 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-utilities\") pod \"certified-operators-xvnk5\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.230852 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-catalog-content\") pod \"certified-operators-xvnk5\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.230861 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-utilities\") pod \"certified-operators-xvnk5\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.267571 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzptx\" (UniqueName: \"kubernetes.io/projected/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-kube-api-access-tzptx\") pod \"certified-operators-xvnk5\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.274969 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.772932 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvnk5"] Dec 03 08:24:19 crc kubenswrapper[4947]: W1203 08:24:19.774943 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9e1c6cb_1ba2_4e6f_89a6_35a5ad3f2ee9.slice/crio-98e6695caf8857d3052ed9e4075726f4b01335fd9220c078c500fa1e2de614ef WatchSource:0}: Error finding container 98e6695caf8857d3052ed9e4075726f4b01335fd9220c078c500fa1e2de614ef: Status 404 returned error can't find the container with id 98e6695caf8857d3052ed9e4075726f4b01335fd9220c078c500fa1e2de614ef Dec 03 08:24:19 crc kubenswrapper[4947]: I1203 08:24:19.909814 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvnk5" event={"ID":"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9","Type":"ContainerStarted","Data":"98e6695caf8857d3052ed9e4075726f4b01335fd9220c078c500fa1e2de614ef"} Dec 03 08:24:20 crc kubenswrapper[4947]: I1203 08:24:20.923020 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerID="48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113" exitCode=0 Dec 03 08:24:20 crc kubenswrapper[4947]: I1203 08:24:20.923224 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvnk5" event={"ID":"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9","Type":"ContainerDied","Data":"48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113"} Dec 03 08:24:20 crc kubenswrapper[4947]: I1203 08:24:20.931461 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:24:21 crc kubenswrapper[4947]: I1203 08:24:21.933275 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvnk5" event={"ID":"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9","Type":"ContainerStarted","Data":"d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c"} Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.147590 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n9mgb"] Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.153222 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.176369 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9mgb"] Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.279006 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-utilities\") pod \"redhat-operators-n9mgb\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.279270 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stzjj\" (UniqueName: \"kubernetes.io/projected/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-kube-api-access-stzjj\") pod \"redhat-operators-n9mgb\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.279356 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-catalog-content\") pod \"redhat-operators-n9mgb\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.381220 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stzjj\" (UniqueName: \"kubernetes.io/projected/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-kube-api-access-stzjj\") pod \"redhat-operators-n9mgb\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.381303 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-catalog-content\") pod \"redhat-operators-n9mgb\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.381794 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-catalog-content\") pod \"redhat-operators-n9mgb\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.381878 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-utilities\") pod \"redhat-operators-n9mgb\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.382183 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-utilities\") pod \"redhat-operators-n9mgb\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.409869 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stzjj\" (UniqueName: \"kubernetes.io/projected/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-kube-api-access-stzjj\") pod \"redhat-operators-n9mgb\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.486619 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.932632 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n9mgb"] Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.941778 4947 generic.go:334] "Generic (PLEG): container finished" podID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerID="d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c" exitCode=0 Dec 03 08:24:22 crc kubenswrapper[4947]: I1203 08:24:22.941817 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvnk5" event={"ID":"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9","Type":"ContainerDied","Data":"d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c"} Dec 03 08:24:23 crc kubenswrapper[4947]: I1203 08:24:23.949983 4947 generic.go:334] "Generic (PLEG): container finished" podID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerID="b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c" exitCode=0 Dec 03 08:24:23 crc kubenswrapper[4947]: I1203 08:24:23.950056 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9mgb" event={"ID":"b4bee9bc-f269-403f-bf3a-93ef017aeb1b","Type":"ContainerDied","Data":"b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c"} Dec 03 08:24:23 crc kubenswrapper[4947]: I1203 08:24:23.950413 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9mgb" event={"ID":"b4bee9bc-f269-403f-bf3a-93ef017aeb1b","Type":"ContainerStarted","Data":"c29fcd03467f7aafc870a62f07e0d4d7e7d80a50316b8d900287a467aade1ba1"} Dec 03 08:24:23 crc kubenswrapper[4947]: I1203 08:24:23.954746 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvnk5" 
event={"ID":"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9","Type":"ContainerStarted","Data":"95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd"} Dec 03 08:24:23 crc kubenswrapper[4947]: I1203 08:24:23.987661 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xvnk5" podStartSLOduration=3.570167754 podStartE2EDuration="5.987641139s" podCreationTimestamp="2025-12-03 08:24:18 +0000 UTC" firstStartedPulling="2025-12-03 08:24:20.930869057 +0000 UTC m=+5722.191823533" lastFinishedPulling="2025-12-03 08:24:23.348342492 +0000 UTC m=+5724.609296918" observedRunningTime="2025-12-03 08:24:23.983549248 +0000 UTC m=+5725.244503674" watchObservedRunningTime="2025-12-03 08:24:23.987641139 +0000 UTC m=+5725.248595565" Dec 03 08:24:24 crc kubenswrapper[4947]: I1203 08:24:24.083700 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:24:24 crc kubenswrapper[4947]: E1203 08:24:24.084204 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:24:25 crc kubenswrapper[4947]: I1203 08:24:25.972383 4947 generic.go:334] "Generic (PLEG): container finished" podID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerID="6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a" exitCode=0 Dec 03 08:24:25 crc kubenswrapper[4947]: I1203 08:24:25.972512 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9mgb" 
event={"ID":"b4bee9bc-f269-403f-bf3a-93ef017aeb1b","Type":"ContainerDied","Data":"6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a"} Dec 03 08:24:26 crc kubenswrapper[4947]: I1203 08:24:26.986992 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9mgb" event={"ID":"b4bee9bc-f269-403f-bf3a-93ef017aeb1b","Type":"ContainerStarted","Data":"fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515"} Dec 03 08:24:27 crc kubenswrapper[4947]: I1203 08:24:27.012655 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n9mgb" podStartSLOduration=2.601110396 podStartE2EDuration="5.012629351s" podCreationTimestamp="2025-12-03 08:24:22 +0000 UTC" firstStartedPulling="2025-12-03 08:24:23.952140499 +0000 UTC m=+5725.213094935" lastFinishedPulling="2025-12-03 08:24:26.363659464 +0000 UTC m=+5727.624613890" observedRunningTime="2025-12-03 08:24:27.001277775 +0000 UTC m=+5728.262232231" watchObservedRunningTime="2025-12-03 08:24:27.012629351 +0000 UTC m=+5728.273583777" Dec 03 08:24:29 crc kubenswrapper[4947]: I1203 08:24:29.276057 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:29 crc kubenswrapper[4947]: I1203 08:24:29.276730 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:29 crc kubenswrapper[4947]: I1203 08:24:29.336888 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:30 crc kubenswrapper[4947]: I1203 08:24:30.102341 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:32 crc kubenswrapper[4947]: I1203 08:24:32.487579 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:32 crc kubenswrapper[4947]: I1203 08:24:32.487667 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:32 crc kubenswrapper[4947]: I1203 08:24:32.602025 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:33 crc kubenswrapper[4947]: I1203 08:24:33.149505 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.133107 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvnk5"] Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.133514 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xvnk5" podUID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerName="registry-server" containerID="cri-o://95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd" gracePeriod=2 Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.641598 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.661203 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-catalog-content\") pod \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.661357 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzptx\" (UniqueName: \"kubernetes.io/projected/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-kube-api-access-tzptx\") pod \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.661426 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-utilities\") pod \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\" (UID: \"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9\") " Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.662721 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-utilities" (OuterVolumeSpecName: "utilities") pod "f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" (UID: "f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.671754 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-kube-api-access-tzptx" (OuterVolumeSpecName: "kube-api-access-tzptx") pod "f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" (UID: "f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9"). InnerVolumeSpecName "kube-api-access-tzptx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.707278 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" (UID: "f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.730435 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9mgb"] Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.762458 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzptx\" (UniqueName: \"kubernetes.io/projected/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-kube-api-access-tzptx\") on node \"crc\" DevicePath \"\"" Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.762550 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:24:34 crc kubenswrapper[4947]: I1203 08:24:34.762576 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.105251 4947 generic.go:334] "Generic (PLEG): container finished" podID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerID="95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd" exitCode=0 Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.105352 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xvnk5" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.105343 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvnk5" event={"ID":"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9","Type":"ContainerDied","Data":"95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd"} Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.105473 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvnk5" event={"ID":"f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9","Type":"ContainerDied","Data":"98e6695caf8857d3052ed9e4075726f4b01335fd9220c078c500fa1e2de614ef"} Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.105591 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n9mgb" podUID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerName="registry-server" containerID="cri-o://fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515" gracePeriod=2 Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.105585 4947 scope.go:117] "RemoveContainer" containerID="95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.135654 4947 scope.go:117] "RemoveContainer" containerID="d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.150193 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvnk5"] Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.156152 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xvnk5"] Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.168203 4947 scope.go:117] "RemoveContainer" containerID="48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113" Dec 03 08:24:35 crc 
kubenswrapper[4947]: I1203 08:24:35.188914 4947 scope.go:117] "RemoveContainer" containerID="95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd" Dec 03 08:24:35 crc kubenswrapper[4947]: E1203 08:24:35.189417 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd\": container with ID starting with 95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd not found: ID does not exist" containerID="95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.189443 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd"} err="failed to get container status \"95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd\": rpc error: code = NotFound desc = could not find container \"95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd\": container with ID starting with 95ea610160ac9c807b343fd08276d7539cf44aacb00f1731cf50195c5e1e93bd not found: ID does not exist" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.189464 4947 scope.go:117] "RemoveContainer" containerID="d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c" Dec 03 08:24:35 crc kubenswrapper[4947]: E1203 08:24:35.189839 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c\": container with ID starting with d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c not found: ID does not exist" containerID="d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.189862 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c"} err="failed to get container status \"d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c\": rpc error: code = NotFound desc = could not find container \"d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c\": container with ID starting with d84ffcdeb330dd1395c4d4f1d3d3290d46282043bee789ca31633ba90a34ca5c not found: ID does not exist" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.189874 4947 scope.go:117] "RemoveContainer" containerID="48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113" Dec 03 08:24:35 crc kubenswrapper[4947]: E1203 08:24:35.190452 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113\": container with ID starting with 48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113 not found: ID does not exist" containerID="48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.190473 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113"} err="failed to get container status \"48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113\": rpc error: code = NotFound desc = could not find container \"48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113\": container with ID starting with 48582ce5c2b1f972fd45565eaaff7432c76a888ff3cd8efe7ae4a83737229113 not found: ID does not exist" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.602341 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.673422 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-utilities\") pod \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.673584 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-catalog-content\") pod \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.673626 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stzjj\" (UniqueName: \"kubernetes.io/projected/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-kube-api-access-stzjj\") pod \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\" (UID: \"b4bee9bc-f269-403f-bf3a-93ef017aeb1b\") " Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.675865 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-utilities" (OuterVolumeSpecName: "utilities") pod "b4bee9bc-f269-403f-bf3a-93ef017aeb1b" (UID: "b4bee9bc-f269-403f-bf3a-93ef017aeb1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.680668 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-kube-api-access-stzjj" (OuterVolumeSpecName: "kube-api-access-stzjj") pod "b4bee9bc-f269-403f-bf3a-93ef017aeb1b" (UID: "b4bee9bc-f269-403f-bf3a-93ef017aeb1b"). InnerVolumeSpecName "kube-api-access-stzjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.775278 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:24:35 crc kubenswrapper[4947]: I1203 08:24:35.775312 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stzjj\" (UniqueName: \"kubernetes.io/projected/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-kube-api-access-stzjj\") on node \"crc\" DevicePath \"\"" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.120128 4947 generic.go:334] "Generic (PLEG): container finished" podID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerID="fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515" exitCode=0 Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.120174 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9mgb" event={"ID":"b4bee9bc-f269-403f-bf3a-93ef017aeb1b","Type":"ContainerDied","Data":"fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515"} Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.120203 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n9mgb" event={"ID":"b4bee9bc-f269-403f-bf3a-93ef017aeb1b","Type":"ContainerDied","Data":"c29fcd03467f7aafc870a62f07e0d4d7e7d80a50316b8d900287a467aade1ba1"} Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.120209 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n9mgb" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.120235 4947 scope.go:117] "RemoveContainer" containerID="fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.143845 4947 scope.go:117] "RemoveContainer" containerID="6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.164208 4947 scope.go:117] "RemoveContainer" containerID="b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.184960 4947 scope.go:117] "RemoveContainer" containerID="fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515" Dec 03 08:24:36 crc kubenswrapper[4947]: E1203 08:24:36.185524 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515\": container with ID starting with fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515 not found: ID does not exist" containerID="fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.185574 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515"} err="failed to get container status \"fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515\": rpc error: code = NotFound desc = could not find container \"fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515\": container with ID starting with fa58abe7b6f98f33453817bda7367aa2e379c19267270af14d648bc88c008515 not found: ID does not exist" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.185608 4947 scope.go:117] "RemoveContainer" 
containerID="6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a" Dec 03 08:24:36 crc kubenswrapper[4947]: E1203 08:24:36.185908 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a\": container with ID starting with 6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a not found: ID does not exist" containerID="6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.185948 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a"} err="failed to get container status \"6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a\": rpc error: code = NotFound desc = could not find container \"6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a\": container with ID starting with 6414d708e43ce28ff73bee26d76d53a402adf18356aa5710322759ff49dec73a not found: ID does not exist" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.185974 4947 scope.go:117] "RemoveContainer" containerID="b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c" Dec 03 08:24:36 crc kubenswrapper[4947]: E1203 08:24:36.186268 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c\": container with ID starting with b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c not found: ID does not exist" containerID="b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.186304 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c"} err="failed to get container status \"b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c\": rpc error: code = NotFound desc = could not find container \"b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c\": container with ID starting with b579a059de1a4be80bc95ffd3ebf9a389ec1a05a6246640b4b8535bbd6a22f6c not found: ID does not exist" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.864241 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4bee9bc-f269-403f-bf3a-93ef017aeb1b" (UID: "b4bee9bc-f269-403f-bf3a-93ef017aeb1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:24:36 crc kubenswrapper[4947]: I1203 08:24:36.894788 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4bee9bc-f269-403f-bf3a-93ef017aeb1b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:24:37 crc kubenswrapper[4947]: I1203 08:24:37.049457 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n9mgb"] Dec 03 08:24:37 crc kubenswrapper[4947]: I1203 08:24:37.054780 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n9mgb"] Dec 03 08:24:37 crc kubenswrapper[4947]: I1203 08:24:37.097975 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" path="/var/lib/kubelet/pods/b4bee9bc-f269-403f-bf3a-93ef017aeb1b/volumes" Dec 03 08:24:37 crc kubenswrapper[4947]: I1203 08:24:37.099578 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" 
path="/var/lib/kubelet/pods/f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9/volumes" Dec 03 08:24:39 crc kubenswrapper[4947]: I1203 08:24:39.088304 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:24:39 crc kubenswrapper[4947]: E1203 08:24:39.088709 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:24:50 crc kubenswrapper[4947]: I1203 08:24:50.083023 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:24:50 crc kubenswrapper[4947]: E1203 08:24:50.084695 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:25:01 crc kubenswrapper[4947]: I1203 08:25:01.083954 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:25:01 crc kubenswrapper[4947]: E1203 08:25:01.084945 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:25:14 crc kubenswrapper[4947]: I1203 08:25:14.082894 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:25:14 crc kubenswrapper[4947]: E1203 08:25:14.083719 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:25:25 crc kubenswrapper[4947]: I1203 08:25:25.084748 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:25:25 crc kubenswrapper[4947]: E1203 08:25:25.086016 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:25:39 crc kubenswrapper[4947]: I1203 08:25:39.102037 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:25:39 crc kubenswrapper[4947]: E1203 08:25:39.103024 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:25:52 crc kubenswrapper[4947]: I1203 08:25:52.083916 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:25:52 crc kubenswrapper[4947]: E1203 08:25:52.084514 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:26:06 crc kubenswrapper[4947]: I1203 08:26:06.083253 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:26:06 crc kubenswrapper[4947]: E1203 08:26:06.084223 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:26:20 crc kubenswrapper[4947]: I1203 08:26:20.083975 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:26:20 crc kubenswrapper[4947]: E1203 08:26:20.084716 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:26:35 crc kubenswrapper[4947]: I1203 08:26:35.083710 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:26:35 crc kubenswrapper[4947]: E1203 08:26:35.084939 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:26:46 crc kubenswrapper[4947]: I1203 08:26:46.083046 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:26:46 crc kubenswrapper[4947]: E1203 08:26:46.083681 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:26:57 crc kubenswrapper[4947]: I1203 08:26:57.083521 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:26:57 crc kubenswrapper[4947]: E1203 08:26:57.084280 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:27:12 crc kubenswrapper[4947]: I1203 08:27:12.082763 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:27:12 crc kubenswrapper[4947]: E1203 08:27:12.083825 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:27:23 crc kubenswrapper[4947]: I1203 08:27:23.085029 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:27:23 crc kubenswrapper[4947]: E1203 08:27:23.086098 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:27:37 crc kubenswrapper[4947]: I1203 08:27:37.084072 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:27:37 crc kubenswrapper[4947]: E1203 08:27:37.085165 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:27:52 crc kubenswrapper[4947]: I1203 08:27:52.083356 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:27:52 crc kubenswrapper[4947]: E1203 08:27:52.085672 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.176787 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6fpll"] Dec 03 08:27:58 crc kubenswrapper[4947]: E1203 08:27:58.177696 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerName="extract-utilities" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.177714 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerName="extract-utilities" Dec 03 08:27:58 crc kubenswrapper[4947]: E1203 08:27:58.177732 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerName="registry-server" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.177740 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerName="registry-server" Dec 03 08:27:58 crc kubenswrapper[4947]: E1203 08:27:58.177768 4947 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerName="extract-content" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.177776 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerName="extract-content" Dec 03 08:27:58 crc kubenswrapper[4947]: E1203 08:27:58.177787 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerName="extract-utilities" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.177795 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerName="extract-utilities" Dec 03 08:27:58 crc kubenswrapper[4947]: E1203 08:27:58.177814 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerName="registry-server" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.177822 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerName="registry-server" Dec 03 08:27:58 crc kubenswrapper[4947]: E1203 08:27:58.177836 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerName="extract-content" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.177845 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerName="extract-content" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.178026 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4bee9bc-f269-403f-bf3a-93ef017aeb1b" containerName="registry-server" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.178045 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e1c6cb-1ba2-4e6f-89a6-35a5ad3f2ee9" containerName="registry-server" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.179332 4947 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.203432 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-utilities\") pod \"community-operators-6fpll\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.203523 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-catalog-content\") pod \"community-operators-6fpll\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.203627 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgvkg\" (UniqueName: \"kubernetes.io/projected/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-kube-api-access-fgvkg\") pod \"community-operators-6fpll\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.218705 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6fpll"] Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.304412 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-utilities\") pod \"community-operators-6fpll\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.304626 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-catalog-content\") pod \"community-operators-6fpll\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.304691 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgvkg\" (UniqueName: \"kubernetes.io/projected/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-kube-api-access-fgvkg\") pod \"community-operators-6fpll\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.305053 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-utilities\") pod \"community-operators-6fpll\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.305077 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-catalog-content\") pod \"community-operators-6fpll\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.335388 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgvkg\" (UniqueName: \"kubernetes.io/projected/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-kube-api-access-fgvkg\") pod \"community-operators-6fpll\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:58 crc kubenswrapper[4947]: I1203 08:27:58.539602 4947 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:27:59 crc kubenswrapper[4947]: I1203 08:27:59.036651 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6fpll"] Dec 03 08:28:00 crc kubenswrapper[4947]: I1203 08:28:00.016076 4947 generic.go:334] "Generic (PLEG): container finished" podID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerID="35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024" exitCode=0 Dec 03 08:28:00 crc kubenswrapper[4947]: I1203 08:28:00.016174 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fpll" event={"ID":"78eca507-6ac7-4cf8-b4cd-d14a31b68df6","Type":"ContainerDied","Data":"35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024"} Dec 03 08:28:00 crc kubenswrapper[4947]: I1203 08:28:00.017893 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fpll" event={"ID":"78eca507-6ac7-4cf8-b4cd-d14a31b68df6","Type":"ContainerStarted","Data":"d9851899df0a808926f2e3c7fbcb7f0240909e68c36468499095c80913517487"} Dec 03 08:28:01 crc kubenswrapper[4947]: I1203 08:28:01.030341 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fpll" event={"ID":"78eca507-6ac7-4cf8-b4cd-d14a31b68df6","Type":"ContainerStarted","Data":"c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a"} Dec 03 08:28:02 crc kubenswrapper[4947]: I1203 08:28:02.042032 4947 generic.go:334] "Generic (PLEG): container finished" podID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerID="c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a" exitCode=0 Dec 03 08:28:02 crc kubenswrapper[4947]: I1203 08:28:02.042118 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fpll" 
event={"ID":"78eca507-6ac7-4cf8-b4cd-d14a31b68df6","Type":"ContainerDied","Data":"c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a"} Dec 03 08:28:03 crc kubenswrapper[4947]: I1203 08:28:03.052302 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fpll" event={"ID":"78eca507-6ac7-4cf8-b4cd-d14a31b68df6","Type":"ContainerStarted","Data":"4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4"} Dec 03 08:28:03 crc kubenswrapper[4947]: I1203 08:28:03.076989 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6fpll" podStartSLOduration=2.584574916 podStartE2EDuration="5.076964993s" podCreationTimestamp="2025-12-03 08:27:58 +0000 UTC" firstStartedPulling="2025-12-03 08:28:00.018909847 +0000 UTC m=+5941.279864293" lastFinishedPulling="2025-12-03 08:28:02.511299944 +0000 UTC m=+5943.772254370" observedRunningTime="2025-12-03 08:28:03.070111897 +0000 UTC m=+5944.331066343" watchObservedRunningTime="2025-12-03 08:28:03.076964993 +0000 UTC m=+5944.337919429" Dec 03 08:28:04 crc kubenswrapper[4947]: I1203 08:28:04.082967 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:28:05 crc kubenswrapper[4947]: I1203 08:28:05.068236 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"fc9eb1ca517896c39d83e96abd15de9c48114b61718564d2d1f7fbf10fce2730"} Dec 03 08:28:08 crc kubenswrapper[4947]: I1203 08:28:08.539969 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:28:08 crc kubenswrapper[4947]: I1203 08:28:08.540311 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:28:08 crc kubenswrapper[4947]: I1203 08:28:08.594158 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:28:09 crc kubenswrapper[4947]: I1203 08:28:09.157145 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:28:09 crc kubenswrapper[4947]: I1203 08:28:09.209832 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6fpll"] Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.125848 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6fpll" podUID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerName="registry-server" containerID="cri-o://4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4" gracePeriod=2 Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.520660 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.617996 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgvkg\" (UniqueName: \"kubernetes.io/projected/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-kube-api-access-fgvkg\") pod \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.618103 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-catalog-content\") pod \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.618187 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-utilities\") pod \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\" (UID: \"78eca507-6ac7-4cf8-b4cd-d14a31b68df6\") " Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.619262 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-utilities" (OuterVolumeSpecName: "utilities") pod "78eca507-6ac7-4cf8-b4cd-d14a31b68df6" (UID: "78eca507-6ac7-4cf8-b4cd-d14a31b68df6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.631980 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-kube-api-access-fgvkg" (OuterVolumeSpecName: "kube-api-access-fgvkg") pod "78eca507-6ac7-4cf8-b4cd-d14a31b68df6" (UID: "78eca507-6ac7-4cf8-b4cd-d14a31b68df6"). InnerVolumeSpecName "kube-api-access-fgvkg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.717805 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78eca507-6ac7-4cf8-b4cd-d14a31b68df6" (UID: "78eca507-6ac7-4cf8-b4cd-d14a31b68df6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.719912 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.719956 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:28:11 crc kubenswrapper[4947]: I1203 08:28:11.719968 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgvkg\" (UniqueName: \"kubernetes.io/projected/78eca507-6ac7-4cf8-b4cd-d14a31b68df6-kube-api-access-fgvkg\") on node \"crc\" DevicePath \"\"" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.139204 4947 generic.go:334] "Generic (PLEG): container finished" podID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerID="4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4" exitCode=0 Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.139281 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fpll" event={"ID":"78eca507-6ac7-4cf8-b4cd-d14a31b68df6","Type":"ContainerDied","Data":"4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4"} Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.139314 4947 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-6fpll" event={"ID":"78eca507-6ac7-4cf8-b4cd-d14a31b68df6","Type":"ContainerDied","Data":"d9851899df0a808926f2e3c7fbcb7f0240909e68c36468499095c80913517487"} Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.139363 4947 scope.go:117] "RemoveContainer" containerID="4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.139619 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6fpll" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.171378 4947 scope.go:117] "RemoveContainer" containerID="c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.203706 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6fpll"] Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.220867 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6fpll"] Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.229434 4947 scope.go:117] "RemoveContainer" containerID="35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.266331 4947 scope.go:117] "RemoveContainer" containerID="4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4" Dec 03 08:28:12 crc kubenswrapper[4947]: E1203 08:28:12.266880 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4\": container with ID starting with 4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4 not found: ID does not exist" containerID="4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 
08:28:12.266925 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4"} err="failed to get container status \"4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4\": rpc error: code = NotFound desc = could not find container \"4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4\": container with ID starting with 4cb9e21ccbd248d9fe5ecec0e466d4348919aa7fb8f119e77e4903f3060da1b4 not found: ID does not exist" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.266951 4947 scope.go:117] "RemoveContainer" containerID="c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a" Dec 03 08:28:12 crc kubenswrapper[4947]: E1203 08:28:12.267218 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a\": container with ID starting with c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a not found: ID does not exist" containerID="c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.267260 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a"} err="failed to get container status \"c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a\": rpc error: code = NotFound desc = could not find container \"c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a\": container with ID starting with c46ca51463f20f612d77ac776723f8f9a34baeadaad087ef6a1445800c5c321a not found: ID does not exist" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.267284 4947 scope.go:117] "RemoveContainer" containerID="35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024" Dec 03 08:28:12 crc 
kubenswrapper[4947]: E1203 08:28:12.267580 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024\": container with ID starting with 35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024 not found: ID does not exist" containerID="35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024" Dec 03 08:28:12 crc kubenswrapper[4947]: I1203 08:28:12.267605 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024"} err="failed to get container status \"35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024\": rpc error: code = NotFound desc = could not find container \"35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024\": container with ID starting with 35273a70be946f67ade00cfcb687bcb6214e7bf44e4ae58f8500790ae1efc024 not found: ID does not exist" Dec 03 08:28:13 crc kubenswrapper[4947]: I1203 08:28:13.095876 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" path="/var/lib/kubelet/pods/78eca507-6ac7-4cf8-b4cd-d14a31b68df6/volumes" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.735011 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hzcpb"] Dec 03 08:29:47 crc kubenswrapper[4947]: E1203 08:29:47.735806 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerName="extract-utilities" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.735818 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerName="extract-utilities" Dec 03 08:29:47 crc kubenswrapper[4947]: E1203 08:29:47.735842 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerName="extract-content" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.735851 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerName="extract-content" Dec 03 08:29:47 crc kubenswrapper[4947]: E1203 08:29:47.735870 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerName="registry-server" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.735878 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerName="registry-server" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.736038 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eca507-6ac7-4cf8-b4cd-d14a31b68df6" containerName="registry-server" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.737125 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.747323 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzcpb"] Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.794336 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9nk9\" (UniqueName: \"kubernetes.io/projected/928cd755-5c43-4aa3-ae83-af16b45abeaf-kube-api-access-v9nk9\") pod \"redhat-marketplace-hzcpb\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.794430 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-utilities\") pod \"redhat-marketplace-hzcpb\" (UID: 
\"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.794512 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-catalog-content\") pod \"redhat-marketplace-hzcpb\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.896141 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9nk9\" (UniqueName: \"kubernetes.io/projected/928cd755-5c43-4aa3-ae83-af16b45abeaf-kube-api-access-v9nk9\") pod \"redhat-marketplace-hzcpb\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.896288 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-utilities\") pod \"redhat-marketplace-hzcpb\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.896356 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-catalog-content\") pod \"redhat-marketplace-hzcpb\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.897402 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-catalog-content\") pod \"redhat-marketplace-hzcpb\" (UID: 
\"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.897411 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-utilities\") pod \"redhat-marketplace-hzcpb\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:47 crc kubenswrapper[4947]: I1203 08:29:47.923980 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9nk9\" (UniqueName: \"kubernetes.io/projected/928cd755-5c43-4aa3-ae83-af16b45abeaf-kube-api-access-v9nk9\") pod \"redhat-marketplace-hzcpb\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:48 crc kubenswrapper[4947]: I1203 08:29:48.067429 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:48 crc kubenswrapper[4947]: I1203 08:29:48.507316 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzcpb"] Dec 03 08:29:48 crc kubenswrapper[4947]: W1203 08:29:48.511725 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928cd755_5c43_4aa3_ae83_af16b45abeaf.slice/crio-8823c14180c0089f40c19039b3f310ef095a9410fef908b97f6128a4c792830c WatchSource:0}: Error finding container 8823c14180c0089f40c19039b3f310ef095a9410fef908b97f6128a4c792830c: Status 404 returned error can't find the container with id 8823c14180c0089f40c19039b3f310ef095a9410fef908b97f6128a4c792830c Dec 03 08:29:48 crc kubenswrapper[4947]: I1203 08:29:48.926197 4947 generic.go:334] "Generic (PLEG): container finished" podID="928cd755-5c43-4aa3-ae83-af16b45abeaf" 
containerID="05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42" exitCode=0 Dec 03 08:29:48 crc kubenswrapper[4947]: I1203 08:29:48.926251 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzcpb" event={"ID":"928cd755-5c43-4aa3-ae83-af16b45abeaf","Type":"ContainerDied","Data":"05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42"} Dec 03 08:29:48 crc kubenswrapper[4947]: I1203 08:29:48.926280 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzcpb" event={"ID":"928cd755-5c43-4aa3-ae83-af16b45abeaf","Type":"ContainerStarted","Data":"8823c14180c0089f40c19039b3f310ef095a9410fef908b97f6128a4c792830c"} Dec 03 08:29:48 crc kubenswrapper[4947]: I1203 08:29:48.928721 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:29:50 crc kubenswrapper[4947]: I1203 08:29:50.945656 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzcpb" event={"ID":"928cd755-5c43-4aa3-ae83-af16b45abeaf","Type":"ContainerStarted","Data":"e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d"} Dec 03 08:29:51 crc kubenswrapper[4947]: I1203 08:29:51.959291 4947 generic.go:334] "Generic (PLEG): container finished" podID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerID="e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d" exitCode=0 Dec 03 08:29:51 crc kubenswrapper[4947]: I1203 08:29:51.959392 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzcpb" event={"ID":"928cd755-5c43-4aa3-ae83-af16b45abeaf","Type":"ContainerDied","Data":"e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d"} Dec 03 08:29:52 crc kubenswrapper[4947]: I1203 08:29:52.972106 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzcpb" 
event={"ID":"928cd755-5c43-4aa3-ae83-af16b45abeaf","Type":"ContainerStarted","Data":"7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150"} Dec 03 08:29:53 crc kubenswrapper[4947]: I1203 08:29:53.001576 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hzcpb" podStartSLOduration=2.399140735 podStartE2EDuration="6.001551523s" podCreationTimestamp="2025-12-03 08:29:47 +0000 UTC" firstStartedPulling="2025-12-03 08:29:48.928480763 +0000 UTC m=+6050.189435189" lastFinishedPulling="2025-12-03 08:29:52.530891511 +0000 UTC m=+6053.791845977" observedRunningTime="2025-12-03 08:29:52.998650605 +0000 UTC m=+6054.259605041" watchObservedRunningTime="2025-12-03 08:29:53.001551523 +0000 UTC m=+6054.262505989" Dec 03 08:29:58 crc kubenswrapper[4947]: I1203 08:29:58.068602 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:58 crc kubenswrapper[4947]: I1203 08:29:58.069438 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:58 crc kubenswrapper[4947]: I1203 08:29:58.150885 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:59 crc kubenswrapper[4947]: I1203 08:29:59.116186 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:29:59 crc kubenswrapper[4947]: I1203 08:29:59.498468 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzcpb"] Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.164840 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6"] Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.165836 4947 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.167565 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.167565 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.172682 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-config-volume\") pod \"collect-profiles-29412510-xdtx6\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.172781 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-secret-volume\") pod \"collect-profiles-29412510-xdtx6\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.173468 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw4c6\" (UniqueName: \"kubernetes.io/projected/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-kube-api-access-xw4c6\") pod \"collect-profiles-29412510-xdtx6\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.176870 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6"] Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.274176 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw4c6\" (UniqueName: \"kubernetes.io/projected/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-kube-api-access-xw4c6\") pod \"collect-profiles-29412510-xdtx6\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.274245 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-config-volume\") pod \"collect-profiles-29412510-xdtx6\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.274293 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-secret-volume\") pod \"collect-profiles-29412510-xdtx6\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.276813 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-config-volume\") pod \"collect-profiles-29412510-xdtx6\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.280428 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-secret-volume\") pod \"collect-profiles-29412510-xdtx6\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.294565 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw4c6\" (UniqueName: \"kubernetes.io/projected/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-kube-api-access-xw4c6\") pod \"collect-profiles-29412510-xdtx6\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.496711 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:00 crc kubenswrapper[4947]: I1203 08:30:00.931714 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6"] Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.044805 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" event={"ID":"1069a3b8-049a-4f2a-b7cf-1940a8129e4d","Type":"ContainerStarted","Data":"38ffec3e564f9362cf6c3eb921255e6500a92d4d50b34cb767664ea916480511"} Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.045269 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hzcpb" podUID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerName="registry-server" containerID="cri-o://7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150" gracePeriod=2 Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.546869 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.702856 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-catalog-content\") pod \"928cd755-5c43-4aa3-ae83-af16b45abeaf\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.703127 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-utilities\") pod \"928cd755-5c43-4aa3-ae83-af16b45abeaf\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.703187 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9nk9\" (UniqueName: \"kubernetes.io/projected/928cd755-5c43-4aa3-ae83-af16b45abeaf-kube-api-access-v9nk9\") pod \"928cd755-5c43-4aa3-ae83-af16b45abeaf\" (UID: \"928cd755-5c43-4aa3-ae83-af16b45abeaf\") " Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.703808 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-utilities" (OuterVolumeSpecName: "utilities") pod "928cd755-5c43-4aa3-ae83-af16b45abeaf" (UID: "928cd755-5c43-4aa3-ae83-af16b45abeaf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.707430 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928cd755-5c43-4aa3-ae83-af16b45abeaf-kube-api-access-v9nk9" (OuterVolumeSpecName: "kube-api-access-v9nk9") pod "928cd755-5c43-4aa3-ae83-af16b45abeaf" (UID: "928cd755-5c43-4aa3-ae83-af16b45abeaf"). InnerVolumeSpecName "kube-api-access-v9nk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.722964 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "928cd755-5c43-4aa3-ae83-af16b45abeaf" (UID: "928cd755-5c43-4aa3-ae83-af16b45abeaf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.804728 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.804775 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/928cd755-5c43-4aa3-ae83-af16b45abeaf-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:30:01 crc kubenswrapper[4947]: I1203 08:30:01.804793 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9nk9\" (UniqueName: \"kubernetes.io/projected/928cd755-5c43-4aa3-ae83-af16b45abeaf-kube-api-access-v9nk9\") on node \"crc\" DevicePath \"\"" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.054883 4947 generic.go:334] "Generic (PLEG): container finished" podID="1069a3b8-049a-4f2a-b7cf-1940a8129e4d" containerID="8db58c493c8602607814eb9d1db832633a067c83e677e770dda23253fa0771aa" exitCode=0 Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.054982 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" event={"ID":"1069a3b8-049a-4f2a-b7cf-1940a8129e4d","Type":"ContainerDied","Data":"8db58c493c8602607814eb9d1db832633a067c83e677e770dda23253fa0771aa"} Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.059656 4947 generic.go:334] 
"Generic (PLEG): container finished" podID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerID="7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150" exitCode=0 Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.059798 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hzcpb" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.059837 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzcpb" event={"ID":"928cd755-5c43-4aa3-ae83-af16b45abeaf","Type":"ContainerDied","Data":"7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150"} Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.059904 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hzcpb" event={"ID":"928cd755-5c43-4aa3-ae83-af16b45abeaf","Type":"ContainerDied","Data":"8823c14180c0089f40c19039b3f310ef095a9410fef908b97f6128a4c792830c"} Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.059940 4947 scope.go:117] "RemoveContainer" containerID="7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.087373 4947 scope.go:117] "RemoveContainer" containerID="e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.109550 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzcpb"] Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.116713 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hzcpb"] Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.136113 4947 scope.go:117] "RemoveContainer" containerID="05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.152127 4947 scope.go:117] "RemoveContainer" 
containerID="7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150" Dec 03 08:30:02 crc kubenswrapper[4947]: E1203 08:30:02.152539 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150\": container with ID starting with 7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150 not found: ID does not exist" containerID="7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.152576 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150"} err="failed to get container status \"7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150\": rpc error: code = NotFound desc = could not find container \"7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150\": container with ID starting with 7b20d62cefef1d1cb14d9332dc47845599fb0e59d4ea375c0845b68846f85150 not found: ID does not exist" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.152597 4947 scope.go:117] "RemoveContainer" containerID="e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d" Dec 03 08:30:02 crc kubenswrapper[4947]: E1203 08:30:02.152877 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d\": container with ID starting with e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d not found: ID does not exist" containerID="e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.152919 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d"} err="failed to get container status \"e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d\": rpc error: code = NotFound desc = could not find container \"e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d\": container with ID starting with e7ad2b0dbf560fa3c55b6558e560d5c412207334f70ac68858b05de12f0c6c3d not found: ID does not exist" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.152946 4947 scope.go:117] "RemoveContainer" containerID="05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42" Dec 03 08:30:02 crc kubenswrapper[4947]: E1203 08:30:02.153232 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42\": container with ID starting with 05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42 not found: ID does not exist" containerID="05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42" Dec 03 08:30:02 crc kubenswrapper[4947]: I1203 08:30:02.153416 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42"} err="failed to get container status \"05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42\": rpc error: code = NotFound desc = could not find container \"05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42\": container with ID starting with 05ed99a260962acba11100b54afbb8ae3dabe8a34d0996cc053158d39c867b42 not found: ID does not exist" Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.106910 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928cd755-5c43-4aa3-ae83-af16b45abeaf" path="/var/lib/kubelet/pods/928cd755-5c43-4aa3-ae83-af16b45abeaf/volumes" Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 
08:30:03.332613 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.426677 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw4c6\" (UniqueName: \"kubernetes.io/projected/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-kube-api-access-xw4c6\") pod \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.426731 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-secret-volume\") pod \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.426785 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-config-volume\") pod \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\" (UID: \"1069a3b8-049a-4f2a-b7cf-1940a8129e4d\") " Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.427483 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-config-volume" (OuterVolumeSpecName: "config-volume") pod "1069a3b8-049a-4f2a-b7cf-1940a8129e4d" (UID: "1069a3b8-049a-4f2a-b7cf-1940a8129e4d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.433580 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1069a3b8-049a-4f2a-b7cf-1940a8129e4d" (UID: "1069a3b8-049a-4f2a-b7cf-1940a8129e4d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.433717 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-kube-api-access-xw4c6" (OuterVolumeSpecName: "kube-api-access-xw4c6") pod "1069a3b8-049a-4f2a-b7cf-1940a8129e4d" (UID: "1069a3b8-049a-4f2a-b7cf-1940a8129e4d"). InnerVolumeSpecName "kube-api-access-xw4c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.528219 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.528293 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw4c6\" (UniqueName: \"kubernetes.io/projected/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-kube-api-access-xw4c6\") on node \"crc\" DevicePath \"\"" Dec 03 08:30:03 crc kubenswrapper[4947]: I1203 08:30:03.528318 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1069a3b8-049a-4f2a-b7cf-1940a8129e4d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:30:04 crc kubenswrapper[4947]: I1203 08:30:04.079085 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" 
event={"ID":"1069a3b8-049a-4f2a-b7cf-1940a8129e4d","Type":"ContainerDied","Data":"38ffec3e564f9362cf6c3eb921255e6500a92d4d50b34cb767664ea916480511"} Dec 03 08:30:04 crc kubenswrapper[4947]: I1203 08:30:04.079132 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6" Dec 03 08:30:04 crc kubenswrapper[4947]: I1203 08:30:04.079150 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38ffec3e564f9362cf6c3eb921255e6500a92d4d50b34cb767664ea916480511" Dec 03 08:30:04 crc kubenswrapper[4947]: I1203 08:30:04.413296 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq"] Dec 03 08:30:04 crc kubenswrapper[4947]: I1203 08:30:04.420449 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-g6hxq"] Dec 03 08:30:05 crc kubenswrapper[4947]: I1203 08:30:05.091812 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc67b8b8-984d-46d9-8e27-ea4a53b96228" path="/var/lib/kubelet/pods/fc67b8b8-984d-46d9-8e27-ea4a53b96228/volumes" Dec 03 08:30:06 crc kubenswrapper[4947]: I1203 08:30:06.310830 4947 scope.go:117] "RemoveContainer" containerID="9e1737c5ee372ae8c38b34e925326531e5a717360215ba424baf77ae2a889a72" Dec 03 08:30:30 crc kubenswrapper[4947]: I1203 08:30:30.086444 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:30:30 crc kubenswrapper[4947]: I1203 08:30:30.087020 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:31:00 crc kubenswrapper[4947]: I1203 08:31:00.086262 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:31:00 crc kubenswrapper[4947]: I1203 08:31:00.087059 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:31:30 crc kubenswrapper[4947]: I1203 08:31:30.087044 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:31:30 crc kubenswrapper[4947]: I1203 08:31:30.087897 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:31:30 crc kubenswrapper[4947]: I1203 08:31:30.087964 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:31:30 crc kubenswrapper[4947]: I1203 08:31:30.088737 4947 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc9eb1ca517896c39d83e96abd15de9c48114b61718564d2d1f7fbf10fce2730"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:31:30 crc kubenswrapper[4947]: I1203 08:31:30.088829 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://fc9eb1ca517896c39d83e96abd15de9c48114b61718564d2d1f7fbf10fce2730" gracePeriod=600 Dec 03 08:31:30 crc kubenswrapper[4947]: I1203 08:31:30.840946 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="fc9eb1ca517896c39d83e96abd15de9c48114b61718564d2d1f7fbf10fce2730" exitCode=0 Dec 03 08:31:30 crc kubenswrapper[4947]: I1203 08:31:30.841016 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"fc9eb1ca517896c39d83e96abd15de9c48114b61718564d2d1f7fbf10fce2730"} Dec 03 08:31:30 crc kubenswrapper[4947]: I1203 08:31:30.841564 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68"} Dec 03 08:31:30 crc kubenswrapper[4947]: I1203 08:31:30.841588 4947 scope.go:117] "RemoveContainer" containerID="6b78fec405fa66fd269e7cd422cee6d0ad96617d5f528fe88b4cc7771ca8f025" Dec 03 08:33:30 crc kubenswrapper[4947]: I1203 08:33:30.086525 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:33:30 crc kubenswrapper[4947]: I1203 08:33:30.087241 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:34:00 crc kubenswrapper[4947]: I1203 08:34:00.086140 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:34:00 crc kubenswrapper[4947]: I1203 08:34:00.086721 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:34:30 crc kubenswrapper[4947]: I1203 08:34:30.086475 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:34:30 crc kubenswrapper[4947]: I1203 08:34:30.087072 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:34:30 crc kubenswrapper[4947]: I1203 08:34:30.087116 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:34:30 crc kubenswrapper[4947]: I1203 08:34:30.087823 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:34:30 crc kubenswrapper[4947]: I1203 08:34:30.087876 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" gracePeriod=600 Dec 03 08:34:30 crc kubenswrapper[4947]: E1203 08:34:30.208104 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:34:30 crc kubenswrapper[4947]: I1203 08:34:30.351754 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" exitCode=0 Dec 03 08:34:30 crc kubenswrapper[4947]: I1203 08:34:30.351837 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68"} Dec 03 08:34:30 crc kubenswrapper[4947]: I1203 08:34:30.351917 4947 scope.go:117] "RemoveContainer" containerID="fc9eb1ca517896c39d83e96abd15de9c48114b61718564d2d1f7fbf10fce2730" Dec 03 08:34:30 crc kubenswrapper[4947]: I1203 08:34:30.352795 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:34:30 crc kubenswrapper[4947]: E1203 08:34:30.353366 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.506742 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mm285"] Dec 03 08:34:38 crc kubenswrapper[4947]: E1203 08:34:38.507782 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerName="extract-content" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.507804 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerName="extract-content" Dec 03 08:34:38 crc kubenswrapper[4947]: E1203 08:34:38.507837 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1069a3b8-049a-4f2a-b7cf-1940a8129e4d" containerName="collect-profiles" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.507849 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1069a3b8-049a-4f2a-b7cf-1940a8129e4d" containerName="collect-profiles" Dec 03 08:34:38 crc kubenswrapper[4947]: E1203 08:34:38.507870 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerName="extract-utilities" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.507880 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerName="extract-utilities" Dec 03 08:34:38 crc kubenswrapper[4947]: E1203 08:34:38.507911 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerName="registry-server" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.507921 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerName="registry-server" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.508142 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="928cd755-5c43-4aa3-ae83-af16b45abeaf" containerName="registry-server" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.508169 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1069a3b8-049a-4f2a-b7cf-1940a8129e4d" containerName="collect-profiles" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.509860 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.528555 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mm285"] Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.616238 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-catalog-content\") pod \"certified-operators-mm285\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.616312 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-utilities\") pod \"certified-operators-mm285\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.616350 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9b2s\" (UniqueName: \"kubernetes.io/projected/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-kube-api-access-s9b2s\") pod \"certified-operators-mm285\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.717808 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-catalog-content\") pod \"certified-operators-mm285\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.717885 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-utilities\") pod \"certified-operators-mm285\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.717926 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9b2s\" (UniqueName: \"kubernetes.io/projected/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-kube-api-access-s9b2s\") pod \"certified-operators-mm285\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.718336 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-catalog-content\") pod \"certified-operators-mm285\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.718352 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-utilities\") pod \"certified-operators-mm285\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.736688 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9b2s\" (UniqueName: \"kubernetes.io/projected/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-kube-api-access-s9b2s\") pod \"certified-operators-mm285\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:38 crc kubenswrapper[4947]: I1203 08:34:38.838059 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:39 crc kubenswrapper[4947]: I1203 08:34:39.119245 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mm285"] Dec 03 08:34:39 crc kubenswrapper[4947]: I1203 08:34:39.430895 4947 generic.go:334] "Generic (PLEG): container finished" podID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerID="7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b" exitCode=0 Dec 03 08:34:39 crc kubenswrapper[4947]: I1203 08:34:39.430940 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm285" event={"ID":"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413","Type":"ContainerDied","Data":"7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b"} Dec 03 08:34:39 crc kubenswrapper[4947]: I1203 08:34:39.431197 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm285" event={"ID":"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413","Type":"ContainerStarted","Data":"4cd7edd5aa8ae9d7fa037c25903438890eec8f89d4301c2a911edbb3f30db51c"} Dec 03 08:34:41 crc kubenswrapper[4947]: I1203 08:34:41.449592 4947 generic.go:334] "Generic (PLEG): container finished" podID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerID="37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f" exitCode=0 Dec 03 08:34:41 crc kubenswrapper[4947]: I1203 08:34:41.449905 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm285" event={"ID":"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413","Type":"ContainerDied","Data":"37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f"} Dec 03 08:34:42 crc kubenswrapper[4947]: I1203 08:34:42.458525 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm285" 
event={"ID":"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413","Type":"ContainerStarted","Data":"7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809"} Dec 03 08:34:42 crc kubenswrapper[4947]: I1203 08:34:42.484022 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mm285" podStartSLOduration=1.9991810330000002 podStartE2EDuration="4.484001085s" podCreationTimestamp="2025-12-03 08:34:38 +0000 UTC" firstStartedPulling="2025-12-03 08:34:39.432302042 +0000 UTC m=+6340.693256468" lastFinishedPulling="2025-12-03 08:34:41.917122074 +0000 UTC m=+6343.178076520" observedRunningTime="2025-12-03 08:34:42.479668048 +0000 UTC m=+6343.740622484" watchObservedRunningTime="2025-12-03 08:34:42.484001085 +0000 UTC m=+6343.744955511" Dec 03 08:34:44 crc kubenswrapper[4947]: I1203 08:34:44.083154 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:34:44 crc kubenswrapper[4947]: E1203 08:34:44.083748 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:34:48 crc kubenswrapper[4947]: I1203 08:34:48.838425 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:48 crc kubenswrapper[4947]: I1203 08:34:48.838729 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:48 crc kubenswrapper[4947]: I1203 08:34:48.882335 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:49 crc kubenswrapper[4947]: I1203 08:34:49.577983 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:49 crc kubenswrapper[4947]: I1203 08:34:49.623484 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mm285"] Dec 03 08:34:51 crc kubenswrapper[4947]: I1203 08:34:51.523288 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mm285" podUID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerName="registry-server" containerID="cri-o://7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809" gracePeriod=2 Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.447951 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.534195 4947 generic.go:334] "Generic (PLEG): container finished" podID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerID="7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809" exitCode=0 Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.534254 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm285" event={"ID":"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413","Type":"ContainerDied","Data":"7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809"} Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.534266 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mm285" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.534284 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mm285" event={"ID":"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413","Type":"ContainerDied","Data":"4cd7edd5aa8ae9d7fa037c25903438890eec8f89d4301c2a911edbb3f30db51c"} Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.534304 4947 scope.go:117] "RemoveContainer" containerID="7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.552237 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9b2s\" (UniqueName: \"kubernetes.io/projected/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-kube-api-access-s9b2s\") pod \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.552294 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-catalog-content\") pod \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.552364 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-utilities\") pod \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\" (UID: \"28f3f1e7-c8d2-4e83-b8c9-a35ab645e413\") " Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.552975 4947 scope.go:117] "RemoveContainer" containerID="37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.553307 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-utilities" (OuterVolumeSpecName: "utilities") pod "28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" (UID: "28f3f1e7-c8d2-4e83-b8c9-a35ab645e413"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.561725 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-kube-api-access-s9b2s" (OuterVolumeSpecName: "kube-api-access-s9b2s") pod "28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" (UID: "28f3f1e7-c8d2-4e83-b8c9-a35ab645e413"). InnerVolumeSpecName "kube-api-access-s9b2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.568714 4947 scope.go:117] "RemoveContainer" containerID="7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.606986 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" (UID: "28f3f1e7-c8d2-4e83-b8c9-a35ab645e413"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.616138 4947 scope.go:117] "RemoveContainer" containerID="7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809" Dec 03 08:34:52 crc kubenswrapper[4947]: E1203 08:34:52.616468 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809\": container with ID starting with 7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809 not found: ID does not exist" containerID="7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.616512 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809"} err="failed to get container status \"7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809\": rpc error: code = NotFound desc = could not find container \"7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809\": container with ID starting with 7ae7d86e56034ed95ab7126d40e95e5c0cc59506dff83a5429153c96e20e0809 not found: ID does not exist" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.616532 4947 scope.go:117] "RemoveContainer" containerID="37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f" Dec 03 08:34:52 crc kubenswrapper[4947]: E1203 08:34:52.616887 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f\": container with ID starting with 37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f not found: ID does not exist" containerID="37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.616913 
4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f"} err="failed to get container status \"37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f\": rpc error: code = NotFound desc = could not find container \"37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f\": container with ID starting with 37e9a300b2ed2a0b113bf1bf1f11d22f3b57d84dc74c6b291274368c2aa2238f not found: ID does not exist" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.616929 4947 scope.go:117] "RemoveContainer" containerID="7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b" Dec 03 08:34:52 crc kubenswrapper[4947]: E1203 08:34:52.617388 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b\": container with ID starting with 7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b not found: ID does not exist" containerID="7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.617413 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b"} err="failed to get container status \"7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b\": rpc error: code = NotFound desc = could not find container \"7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b\": container with ID starting with 7b23207e8c77d5ea0981bbf9e40077d7f259f2abfc512a24896b5f2a1b05a60b not found: ID does not exist" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.653884 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-utilities\") on node 
\"crc\" DevicePath \"\"" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.653925 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9b2s\" (UniqueName: \"kubernetes.io/projected/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-kube-api-access-s9b2s\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.653940 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.866623 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mm285"] Dec 03 08:34:52 crc kubenswrapper[4947]: I1203 08:34:52.874161 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mm285"] Dec 03 08:34:53 crc kubenswrapper[4947]: I1203 08:34:53.102231 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" path="/var/lib/kubelet/pods/28f3f1e7-c8d2-4e83-b8c9-a35ab645e413/volumes" Dec 03 08:34:57 crc kubenswrapper[4947]: I1203 08:34:57.087070 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:34:57 crc kubenswrapper[4947]: E1203 08:34:57.087718 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:35:10 crc kubenswrapper[4947]: I1203 08:35:10.083063 4947 scope.go:117] "RemoveContainer" 
containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:35:10 crc kubenswrapper[4947]: E1203 08:35:10.083847 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:35:24 crc kubenswrapper[4947]: I1203 08:35:24.083957 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:35:24 crc kubenswrapper[4947]: E1203 08:35:24.084913 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.093218 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dkcn2"] Dec 03 08:35:27 crc kubenswrapper[4947]: E1203 08:35:27.095310 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerName="registry-server" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.095328 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerName="registry-server" Dec 03 08:35:27 crc kubenswrapper[4947]: E1203 08:35:27.095348 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" 
containerName="extract-content" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.095358 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerName="extract-content" Dec 03 08:35:27 crc kubenswrapper[4947]: E1203 08:35:27.095376 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerName="extract-utilities" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.095384 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerName="extract-utilities" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.095578 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f3f1e7-c8d2-4e83-b8c9-a35ab645e413" containerName="registry-server" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.096939 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.109743 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dkcn2"] Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.170215 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-utilities\") pod \"redhat-operators-dkcn2\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.170646 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkgcw\" (UniqueName: \"kubernetes.io/projected/f7660796-b2cd-4c3d-b671-4244d976f30f-kube-api-access-rkgcw\") pod \"redhat-operators-dkcn2\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " 
pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.170704 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-catalog-content\") pod \"redhat-operators-dkcn2\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.272518 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkgcw\" (UniqueName: \"kubernetes.io/projected/f7660796-b2cd-4c3d-b671-4244d976f30f-kube-api-access-rkgcw\") pod \"redhat-operators-dkcn2\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.272574 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-catalog-content\") pod \"redhat-operators-dkcn2\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.272634 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-utilities\") pod \"redhat-operators-dkcn2\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.273148 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-catalog-content\") pod \"redhat-operators-dkcn2\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " 
pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.273236 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-utilities\") pod \"redhat-operators-dkcn2\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.295881 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkgcw\" (UniqueName: \"kubernetes.io/projected/f7660796-b2cd-4c3d-b671-4244d976f30f-kube-api-access-rkgcw\") pod \"redhat-operators-dkcn2\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.421789 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:27 crc kubenswrapper[4947]: I1203 08:35:27.849984 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dkcn2"] Dec 03 08:35:28 crc kubenswrapper[4947]: I1203 08:35:28.862901 4947 generic.go:334] "Generic (PLEG): container finished" podID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerID="7ec04ec7a50f67086b3afd99070c7d7db31f190b89ea475d9f5e1d551fddc79a" exitCode=0 Dec 03 08:35:28 crc kubenswrapper[4947]: I1203 08:35:28.863025 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkcn2" event={"ID":"f7660796-b2cd-4c3d-b671-4244d976f30f","Type":"ContainerDied","Data":"7ec04ec7a50f67086b3afd99070c7d7db31f190b89ea475d9f5e1d551fddc79a"} Dec 03 08:35:28 crc kubenswrapper[4947]: I1203 08:35:28.863558 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkcn2" 
event={"ID":"f7660796-b2cd-4c3d-b671-4244d976f30f","Type":"ContainerStarted","Data":"c8e96dc9e1dc9897e69a42e25208f10866472632e7bfb9f0f18b49331ac85e3d"} Dec 03 08:35:28 crc kubenswrapper[4947]: I1203 08:35:28.865101 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:35:29 crc kubenswrapper[4947]: I1203 08:35:29.872122 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkcn2" event={"ID":"f7660796-b2cd-4c3d-b671-4244d976f30f","Type":"ContainerStarted","Data":"b8d6493fd39781a29949fc0aa1c639041d0a8e8f6d10a8c07575f31761c0efd0"} Dec 03 08:35:30 crc kubenswrapper[4947]: I1203 08:35:30.881024 4947 generic.go:334] "Generic (PLEG): container finished" podID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerID="b8d6493fd39781a29949fc0aa1c639041d0a8e8f6d10a8c07575f31761c0efd0" exitCode=0 Dec 03 08:35:30 crc kubenswrapper[4947]: I1203 08:35:30.881076 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkcn2" event={"ID":"f7660796-b2cd-4c3d-b671-4244d976f30f","Type":"ContainerDied","Data":"b8d6493fd39781a29949fc0aa1c639041d0a8e8f6d10a8c07575f31761c0efd0"} Dec 03 08:35:31 crc kubenswrapper[4947]: I1203 08:35:31.896867 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkcn2" event={"ID":"f7660796-b2cd-4c3d-b671-4244d976f30f","Type":"ContainerStarted","Data":"5b22510c91d5d061eedf8effce30b97980f48dc4f1322a545615fb9779ff0a56"} Dec 03 08:35:31 crc kubenswrapper[4947]: I1203 08:35:31.914205 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dkcn2" podStartSLOduration=2.46624163 podStartE2EDuration="4.914182386s" podCreationTimestamp="2025-12-03 08:35:27 +0000 UTC" firstStartedPulling="2025-12-03 08:35:28.864815625 +0000 UTC m=+6390.125770061" lastFinishedPulling="2025-12-03 08:35:31.312756391 +0000 UTC m=+6392.573710817" 
observedRunningTime="2025-12-03 08:35:31.914086203 +0000 UTC m=+6393.175040649" watchObservedRunningTime="2025-12-03 08:35:31.914182386 +0000 UTC m=+6393.175136832" Dec 03 08:35:36 crc kubenswrapper[4947]: I1203 08:35:36.083380 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:35:36 crc kubenswrapper[4947]: E1203 08:35:36.084228 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:35:37 crc kubenswrapper[4947]: I1203 08:35:37.422941 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:37 crc kubenswrapper[4947]: I1203 08:35:37.423643 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:37 crc kubenswrapper[4947]: I1203 08:35:37.476183 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:37 crc kubenswrapper[4947]: I1203 08:35:37.980382 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:38 crc kubenswrapper[4947]: I1203 08:35:38.025174 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dkcn2"] Dec 03 08:35:39 crc kubenswrapper[4947]: I1203 08:35:39.962216 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dkcn2" podUID="f7660796-b2cd-4c3d-b671-4244d976f30f" 
containerName="registry-server" containerID="cri-o://5b22510c91d5d061eedf8effce30b97980f48dc4f1322a545615fb9779ff0a56" gracePeriod=2 Dec 03 08:35:41 crc kubenswrapper[4947]: I1203 08:35:41.981112 4947 generic.go:334] "Generic (PLEG): container finished" podID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerID="5b22510c91d5d061eedf8effce30b97980f48dc4f1322a545615fb9779ff0a56" exitCode=0 Dec 03 08:35:41 crc kubenswrapper[4947]: I1203 08:35:41.981308 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkcn2" event={"ID":"f7660796-b2cd-4c3d-b671-4244d976f30f","Type":"ContainerDied","Data":"5b22510c91d5d061eedf8effce30b97980f48dc4f1322a545615fb9779ff0a56"} Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.201880 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.293023 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-catalog-content\") pod \"f7660796-b2cd-4c3d-b671-4244d976f30f\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.293198 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkgcw\" (UniqueName: \"kubernetes.io/projected/f7660796-b2cd-4c3d-b671-4244d976f30f-kube-api-access-rkgcw\") pod \"f7660796-b2cd-4c3d-b671-4244d976f30f\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.293243 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-utilities\") pod \"f7660796-b2cd-4c3d-b671-4244d976f30f\" (UID: \"f7660796-b2cd-4c3d-b671-4244d976f30f\") " Dec 03 08:35:42 crc 
kubenswrapper[4947]: I1203 08:35:42.295212 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-utilities" (OuterVolumeSpecName: "utilities") pod "f7660796-b2cd-4c3d-b671-4244d976f30f" (UID: "f7660796-b2cd-4c3d-b671-4244d976f30f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.300351 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7660796-b2cd-4c3d-b671-4244d976f30f-kube-api-access-rkgcw" (OuterVolumeSpecName: "kube-api-access-rkgcw") pod "f7660796-b2cd-4c3d-b671-4244d976f30f" (UID: "f7660796-b2cd-4c3d-b671-4244d976f30f"). InnerVolumeSpecName "kube-api-access-rkgcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.395196 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkgcw\" (UniqueName: \"kubernetes.io/projected/f7660796-b2cd-4c3d-b671-4244d976f30f-kube-api-access-rkgcw\") on node \"crc\" DevicePath \"\"" Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.395233 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.405590 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7660796-b2cd-4c3d-b671-4244d976f30f" (UID: "f7660796-b2cd-4c3d-b671-4244d976f30f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.496640 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7660796-b2cd-4c3d-b671-4244d976f30f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.990296 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkcn2" event={"ID":"f7660796-b2cd-4c3d-b671-4244d976f30f","Type":"ContainerDied","Data":"c8e96dc9e1dc9897e69a42e25208f10866472632e7bfb9f0f18b49331ac85e3d"} Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.990633 4947 scope.go:117] "RemoveContainer" containerID="5b22510c91d5d061eedf8effce30b97980f48dc4f1322a545615fb9779ff0a56" Dec 03 08:35:42 crc kubenswrapper[4947]: I1203 08:35:42.990356 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dkcn2" Dec 03 08:35:43 crc kubenswrapper[4947]: I1203 08:35:43.010409 4947 scope.go:117] "RemoveContainer" containerID="b8d6493fd39781a29949fc0aa1c639041d0a8e8f6d10a8c07575f31761c0efd0" Dec 03 08:35:43 crc kubenswrapper[4947]: I1203 08:35:43.025808 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dkcn2"] Dec 03 08:35:43 crc kubenswrapper[4947]: I1203 08:35:43.033729 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dkcn2"] Dec 03 08:35:43 crc kubenswrapper[4947]: I1203 08:35:43.044505 4947 scope.go:117] "RemoveContainer" containerID="7ec04ec7a50f67086b3afd99070c7d7db31f190b89ea475d9f5e1d551fddc79a" Dec 03 08:35:43 crc kubenswrapper[4947]: I1203 08:35:43.091980 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7660796-b2cd-4c3d-b671-4244d976f30f" path="/var/lib/kubelet/pods/f7660796-b2cd-4c3d-b671-4244d976f30f/volumes" Dec 03 08:35:51 crc 
kubenswrapper[4947]: I1203 08:35:51.083353 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:35:51 crc kubenswrapper[4947]: E1203 08:35:51.083925 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:36:05 crc kubenswrapper[4947]: I1203 08:36:05.083002 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:36:05 crc kubenswrapper[4947]: E1203 08:36:05.083847 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:36:16 crc kubenswrapper[4947]: I1203 08:36:16.085287 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:36:16 crc kubenswrapper[4947]: E1203 08:36:16.086727 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 
03 08:36:27 crc kubenswrapper[4947]: I1203 08:36:27.083876 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:36:27 crc kubenswrapper[4947]: E1203 08:36:27.085101 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:36:42 crc kubenswrapper[4947]: I1203 08:36:42.083630 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:36:42 crc kubenswrapper[4947]: E1203 08:36:42.084826 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:36:54 crc kubenswrapper[4947]: I1203 08:36:54.083047 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:36:54 crc kubenswrapper[4947]: E1203 08:36:54.083775 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:37:06 crc kubenswrapper[4947]: I1203 08:37:06.082523 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:37:06 crc kubenswrapper[4947]: E1203 08:37:06.083290 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:37:18 crc kubenswrapper[4947]: I1203 08:37:18.084144 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:37:18 crc kubenswrapper[4947]: E1203 08:37:18.084963 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:37:30 crc kubenswrapper[4947]: I1203 08:37:30.083017 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:37:30 crc kubenswrapper[4947]: E1203 08:37:30.083959 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:37:44 crc kubenswrapper[4947]: I1203 08:37:44.083184 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:37:44 crc kubenswrapper[4947]: E1203 08:37:44.083715 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:37:55 crc kubenswrapper[4947]: I1203 08:37:55.083949 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:37:55 crc kubenswrapper[4947]: E1203 08:37:55.084771 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:38:07 crc kubenswrapper[4947]: I1203 08:38:07.082586 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:38:07 crc kubenswrapper[4947]: E1203 08:38:07.083283 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:38:18 crc kubenswrapper[4947]: I1203 08:38:18.082545 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:38:18 crc kubenswrapper[4947]: E1203 08:38:18.083140 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:38:29 crc kubenswrapper[4947]: I1203 08:38:29.093887 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:38:29 crc kubenswrapper[4947]: E1203 08:38:29.095083 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:38:43 crc kubenswrapper[4947]: I1203 08:38:43.085418 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:38:43 crc kubenswrapper[4947]: E1203 08:38:43.086379 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.083283 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:38:58 crc kubenswrapper[4947]: E1203 08:38:58.084121 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.328108 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zlz87"] Dec 03 08:38:58 crc kubenswrapper[4947]: E1203 08:38:58.328696 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerName="extract-content" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.328741 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerName="extract-content" Dec 03 08:38:58 crc kubenswrapper[4947]: E1203 08:38:58.328801 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerName="registry-server" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.328819 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerName="registry-server" Dec 03 08:38:58 crc kubenswrapper[4947]: E1203 08:38:58.328852 4947 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerName="extract-utilities" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.328867 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerName="extract-utilities" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.329200 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7660796-b2cd-4c3d-b671-4244d976f30f" containerName="registry-server" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.331282 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.337827 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zlz87"] Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.362585 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-utilities\") pod \"community-operators-zlz87\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.362911 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-catalog-content\") pod \"community-operators-zlz87\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.363053 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8454z\" (UniqueName: \"kubernetes.io/projected/91795005-40e9-4d23-8964-be5f263ac04d-kube-api-access-8454z\") pod 
\"community-operators-zlz87\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.463606 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8454z\" (UniqueName: \"kubernetes.io/projected/91795005-40e9-4d23-8964-be5f263ac04d-kube-api-access-8454z\") pod \"community-operators-zlz87\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.463674 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-utilities\") pod \"community-operators-zlz87\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.463695 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-catalog-content\") pod \"community-operators-zlz87\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.464181 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-catalog-content\") pod \"community-operators-zlz87\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.464301 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-utilities\") pod \"community-operators-zlz87\" (UID: 
\"91795005-40e9-4d23-8964-be5f263ac04d\") " pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.494982 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8454z\" (UniqueName: \"kubernetes.io/projected/91795005-40e9-4d23-8964-be5f263ac04d-kube-api-access-8454z\") pod \"community-operators-zlz87\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:58 crc kubenswrapper[4947]: I1203 08:38:58.662636 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:38:59 crc kubenswrapper[4947]: I1203 08:38:59.162919 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zlz87"] Dec 03 08:38:59 crc kubenswrapper[4947]: I1203 08:38:59.660595 4947 generic.go:334] "Generic (PLEG): container finished" podID="91795005-40e9-4d23-8964-be5f263ac04d" containerID="2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec" exitCode=0 Dec 03 08:38:59 crc kubenswrapper[4947]: I1203 08:38:59.660643 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlz87" event={"ID":"91795005-40e9-4d23-8964-be5f263ac04d","Type":"ContainerDied","Data":"2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec"} Dec 03 08:38:59 crc kubenswrapper[4947]: I1203 08:38:59.660674 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlz87" event={"ID":"91795005-40e9-4d23-8964-be5f263ac04d","Type":"ContainerStarted","Data":"d89c68f0d71964c69925635eadc33e12b649d32df2274e1d6a8a0346e20bf6ef"} Dec 03 08:39:01 crc kubenswrapper[4947]: I1203 08:39:01.679162 4947 generic.go:334] "Generic (PLEG): container finished" podID="91795005-40e9-4d23-8964-be5f263ac04d" 
containerID="ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45" exitCode=0 Dec 03 08:39:01 crc kubenswrapper[4947]: I1203 08:39:01.679246 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlz87" event={"ID":"91795005-40e9-4d23-8964-be5f263ac04d","Type":"ContainerDied","Data":"ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45"} Dec 03 08:39:02 crc kubenswrapper[4947]: I1203 08:39:02.697895 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlz87" event={"ID":"91795005-40e9-4d23-8964-be5f263ac04d","Type":"ContainerStarted","Data":"6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47"} Dec 03 08:39:02 crc kubenswrapper[4947]: I1203 08:39:02.723402 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zlz87" podStartSLOduration=2.260178247 podStartE2EDuration="4.723380966s" podCreationTimestamp="2025-12-03 08:38:58 +0000 UTC" firstStartedPulling="2025-12-03 08:38:59.662784642 +0000 UTC m=+6600.923739068" lastFinishedPulling="2025-12-03 08:39:02.125987361 +0000 UTC m=+6603.386941787" observedRunningTime="2025-12-03 08:39:02.717913359 +0000 UTC m=+6603.978867835" watchObservedRunningTime="2025-12-03 08:39:02.723380966 +0000 UTC m=+6603.984335402" Dec 03 08:39:08 crc kubenswrapper[4947]: I1203 08:39:08.663164 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:39:08 crc kubenswrapper[4947]: I1203 08:39:08.663589 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:39:08 crc kubenswrapper[4947]: I1203 08:39:08.720180 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:39:08 crc kubenswrapper[4947]: I1203 
08:39:08.780858 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:39:08 crc kubenswrapper[4947]: I1203 08:39:08.953005 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zlz87"] Dec 03 08:39:09 crc kubenswrapper[4947]: I1203 08:39:09.088472 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:39:09 crc kubenswrapper[4947]: E1203 08:39:09.089033 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:39:10 crc kubenswrapper[4947]: I1203 08:39:10.757239 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zlz87" podUID="91795005-40e9-4d23-8964-be5f263ac04d" containerName="registry-server" containerID="cri-o://6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47" gracePeriod=2 Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.169638 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.350341 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-utilities\") pod \"91795005-40e9-4d23-8964-be5f263ac04d\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.350454 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8454z\" (UniqueName: \"kubernetes.io/projected/91795005-40e9-4d23-8964-be5f263ac04d-kube-api-access-8454z\") pod \"91795005-40e9-4d23-8964-be5f263ac04d\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.350681 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-catalog-content\") pod \"91795005-40e9-4d23-8964-be5f263ac04d\" (UID: \"91795005-40e9-4d23-8964-be5f263ac04d\") " Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.351706 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-utilities" (OuterVolumeSpecName: "utilities") pod "91795005-40e9-4d23-8964-be5f263ac04d" (UID: "91795005-40e9-4d23-8964-be5f263ac04d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.356781 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91795005-40e9-4d23-8964-be5f263ac04d-kube-api-access-8454z" (OuterVolumeSpecName: "kube-api-access-8454z") pod "91795005-40e9-4d23-8964-be5f263ac04d" (UID: "91795005-40e9-4d23-8964-be5f263ac04d"). InnerVolumeSpecName "kube-api-access-8454z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.422543 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91795005-40e9-4d23-8964-be5f263ac04d" (UID: "91795005-40e9-4d23-8964-be5f263ac04d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.452898 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.452934 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8454z\" (UniqueName: \"kubernetes.io/projected/91795005-40e9-4d23-8964-be5f263ac04d-kube-api-access-8454z\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.452945 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91795005-40e9-4d23-8964-be5f263ac04d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.765061 4947 generic.go:334] "Generic (PLEG): container finished" podID="91795005-40e9-4d23-8964-be5f263ac04d" containerID="6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47" exitCode=0 Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.765109 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zlz87" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.765108 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlz87" event={"ID":"91795005-40e9-4d23-8964-be5f263ac04d","Type":"ContainerDied","Data":"6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47"} Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.765262 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zlz87" event={"ID":"91795005-40e9-4d23-8964-be5f263ac04d","Type":"ContainerDied","Data":"d89c68f0d71964c69925635eadc33e12b649d32df2274e1d6a8a0346e20bf6ef"} Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.765293 4947 scope.go:117] "RemoveContainer" containerID="6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.790435 4947 scope.go:117] "RemoveContainer" containerID="ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.805790 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zlz87"] Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.810956 4947 scope.go:117] "RemoveContainer" containerID="2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.811920 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zlz87"] Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.840269 4947 scope.go:117] "RemoveContainer" containerID="6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47" Dec 03 08:39:11 crc kubenswrapper[4947]: E1203 08:39:11.840788 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47\": container with ID starting with 6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47 not found: ID does not exist" containerID="6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.840837 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47"} err="failed to get container status \"6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47\": rpc error: code = NotFound desc = could not find container \"6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47\": container with ID starting with 6dc47ced311a55646a4d52543af3354d03eafe8047d3a0b018fe17ced4089f47 not found: ID does not exist" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.840862 4947 scope.go:117] "RemoveContainer" containerID="ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45" Dec 03 08:39:11 crc kubenswrapper[4947]: E1203 08:39:11.841131 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45\": container with ID starting with ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45 not found: ID does not exist" containerID="ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.841162 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45"} err="failed to get container status \"ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45\": rpc error: code = NotFound desc = could not find container \"ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45\": container with ID 
starting with ec40067a99636abefd4c0568cc6c7893b019f4da284acbd131086c3fd8b9cd45 not found: ID does not exist" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.841181 4947 scope.go:117] "RemoveContainer" containerID="2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec" Dec 03 08:39:11 crc kubenswrapper[4947]: E1203 08:39:11.841525 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec\": container with ID starting with 2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec not found: ID does not exist" containerID="2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec" Dec 03 08:39:11 crc kubenswrapper[4947]: I1203 08:39:11.841558 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec"} err="failed to get container status \"2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec\": rpc error: code = NotFound desc = could not find container \"2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec\": container with ID starting with 2b1ce603f40ff17e4a57a539e9e47a181861aa29d9c7f798cb01933b23386aec not found: ID does not exist" Dec 03 08:39:13 crc kubenswrapper[4947]: I1203 08:39:13.092849 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91795005-40e9-4d23-8964-be5f263ac04d" path="/var/lib/kubelet/pods/91795005-40e9-4d23-8964-be5f263ac04d/volumes" Dec 03 08:39:22 crc kubenswrapper[4947]: I1203 08:39:22.082385 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:39:22 crc kubenswrapper[4947]: E1203 08:39:22.083086 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:39:36 crc kubenswrapper[4947]: I1203 08:39:36.084349 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:39:36 crc kubenswrapper[4947]: I1203 08:39:36.978555 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"f497ed6e66f07c3659d612e4d03fc5594262467811b840160071934dd1dd8bac"} Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.816318 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-44qkl"] Dec 03 08:39:57 crc kubenswrapper[4947]: E1203 08:39:57.817341 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91795005-40e9-4d23-8964-be5f263ac04d" containerName="registry-server" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.817361 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="91795005-40e9-4d23-8964-be5f263ac04d" containerName="registry-server" Dec 03 08:39:57 crc kubenswrapper[4947]: E1203 08:39:57.817399 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91795005-40e9-4d23-8964-be5f263ac04d" containerName="extract-content" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.817409 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="91795005-40e9-4d23-8964-be5f263ac04d" containerName="extract-content" Dec 03 08:39:57 crc kubenswrapper[4947]: E1203 08:39:57.817422 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91795005-40e9-4d23-8964-be5f263ac04d" containerName="extract-utilities" Dec 03 08:39:57 crc kubenswrapper[4947]: 
I1203 08:39:57.817432 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="91795005-40e9-4d23-8964-be5f263ac04d" containerName="extract-utilities" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.817660 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="91795005-40e9-4d23-8964-be5f263ac04d" containerName="registry-server" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.818893 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.834773 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44qkl"] Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.836359 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-utilities\") pod \"redhat-marketplace-44qkl\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.836424 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97ldt\" (UniqueName: \"kubernetes.io/projected/2c537da8-2bdd-488d-9edc-50c08b67ae9c-kube-api-access-97ldt\") pod \"redhat-marketplace-44qkl\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.836536 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-catalog-content\") pod \"redhat-marketplace-44qkl\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:57 crc 
kubenswrapper[4947]: I1203 08:39:57.937901 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-catalog-content\") pod \"redhat-marketplace-44qkl\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.937976 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-utilities\") pod \"redhat-marketplace-44qkl\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.938016 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97ldt\" (UniqueName: \"kubernetes.io/projected/2c537da8-2bdd-488d-9edc-50c08b67ae9c-kube-api-access-97ldt\") pod \"redhat-marketplace-44qkl\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.938674 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-utilities\") pod \"redhat-marketplace-44qkl\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.938680 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-catalog-content\") pod \"redhat-marketplace-44qkl\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:57 crc kubenswrapper[4947]: I1203 08:39:57.957744 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97ldt\" (UniqueName: \"kubernetes.io/projected/2c537da8-2bdd-488d-9edc-50c08b67ae9c-kube-api-access-97ldt\") pod \"redhat-marketplace-44qkl\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:58 crc kubenswrapper[4947]: I1203 08:39:58.185403 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:39:58 crc kubenswrapper[4947]: I1203 08:39:58.631136 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-44qkl"] Dec 03 08:39:58 crc kubenswrapper[4947]: W1203 08:39:58.634548 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c537da8_2bdd_488d_9edc_50c08b67ae9c.slice/crio-fef758c2372e2b644f694c3ca6ac8057b3c0910e51bc17445a0894804975fdc7 WatchSource:0}: Error finding container fef758c2372e2b644f694c3ca6ac8057b3c0910e51bc17445a0894804975fdc7: Status 404 returned error can't find the container with id fef758c2372e2b644f694c3ca6ac8057b3c0910e51bc17445a0894804975fdc7 Dec 03 08:39:59 crc kubenswrapper[4947]: I1203 08:39:59.194311 4947 generic.go:334] "Generic (PLEG): container finished" podID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerID="ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e" exitCode=0 Dec 03 08:39:59 crc kubenswrapper[4947]: I1203 08:39:59.194337 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44qkl" event={"ID":"2c537da8-2bdd-488d-9edc-50c08b67ae9c","Type":"ContainerDied","Data":"ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e"} Dec 03 08:39:59 crc kubenswrapper[4947]: I1203 08:39:59.194595 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44qkl" 
event={"ID":"2c537da8-2bdd-488d-9edc-50c08b67ae9c","Type":"ContainerStarted","Data":"fef758c2372e2b644f694c3ca6ac8057b3c0910e51bc17445a0894804975fdc7"} Dec 03 08:40:00 crc kubenswrapper[4947]: I1203 08:40:00.203424 4947 generic.go:334] "Generic (PLEG): container finished" podID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerID="f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74" exitCode=0 Dec 03 08:40:00 crc kubenswrapper[4947]: I1203 08:40:00.203469 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44qkl" event={"ID":"2c537da8-2bdd-488d-9edc-50c08b67ae9c","Type":"ContainerDied","Data":"f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74"} Dec 03 08:40:01 crc kubenswrapper[4947]: I1203 08:40:01.212462 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44qkl" event={"ID":"2c537da8-2bdd-488d-9edc-50c08b67ae9c","Type":"ContainerStarted","Data":"ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143"} Dec 03 08:40:01 crc kubenswrapper[4947]: I1203 08:40:01.236539 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-44qkl" podStartSLOduration=2.788142644 podStartE2EDuration="4.236519324s" podCreationTimestamp="2025-12-03 08:39:57 +0000 UTC" firstStartedPulling="2025-12-03 08:39:59.195681863 +0000 UTC m=+6660.456636289" lastFinishedPulling="2025-12-03 08:40:00.644058533 +0000 UTC m=+6661.905012969" observedRunningTime="2025-12-03 08:40:01.227961523 +0000 UTC m=+6662.488915969" watchObservedRunningTime="2025-12-03 08:40:01.236519324 +0000 UTC m=+6662.497473770" Dec 03 08:40:08 crc kubenswrapper[4947]: I1203 08:40:08.185693 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:40:08 crc kubenswrapper[4947]: I1203 08:40:08.186224 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:40:08 crc kubenswrapper[4947]: I1203 08:40:08.236321 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:40:08 crc kubenswrapper[4947]: I1203 08:40:08.304483 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:40:08 crc kubenswrapper[4947]: I1203 08:40:08.478097 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44qkl"] Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.296870 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-44qkl" podUID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerName="registry-server" containerID="cri-o://ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143" gracePeriod=2 Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.662570 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.837296 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-catalog-content\") pod \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.837372 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97ldt\" (UniqueName: \"kubernetes.io/projected/2c537da8-2bdd-488d-9edc-50c08b67ae9c-kube-api-access-97ldt\") pod \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.837520 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-utilities\") pod \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\" (UID: \"2c537da8-2bdd-488d-9edc-50c08b67ae9c\") " Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.839559 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-utilities" (OuterVolumeSpecName: "utilities") pod "2c537da8-2bdd-488d-9edc-50c08b67ae9c" (UID: "2c537da8-2bdd-488d-9edc-50c08b67ae9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.843503 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c537da8-2bdd-488d-9edc-50c08b67ae9c-kube-api-access-97ldt" (OuterVolumeSpecName: "kube-api-access-97ldt") pod "2c537da8-2bdd-488d-9edc-50c08b67ae9c" (UID: "2c537da8-2bdd-488d-9edc-50c08b67ae9c"). InnerVolumeSpecName "kube-api-access-97ldt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.863233 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c537da8-2bdd-488d-9edc-50c08b67ae9c" (UID: "2c537da8-2bdd-488d-9edc-50c08b67ae9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.939485 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.939543 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c537da8-2bdd-488d-9edc-50c08b67ae9c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:40:10 crc kubenswrapper[4947]: I1203 08:40:10.939556 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97ldt\" (UniqueName: \"kubernetes.io/projected/2c537da8-2bdd-488d-9edc-50c08b67ae9c-kube-api-access-97ldt\") on node \"crc\" DevicePath \"\"" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.306163 4947 generic.go:334] "Generic (PLEG): container finished" podID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerID="ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143" exitCode=0 Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.306248 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-44qkl" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.306257 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44qkl" event={"ID":"2c537da8-2bdd-488d-9edc-50c08b67ae9c","Type":"ContainerDied","Data":"ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143"} Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.306315 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-44qkl" event={"ID":"2c537da8-2bdd-488d-9edc-50c08b67ae9c","Type":"ContainerDied","Data":"fef758c2372e2b644f694c3ca6ac8057b3c0910e51bc17445a0894804975fdc7"} Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.306341 4947 scope.go:117] "RemoveContainer" containerID="ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.327446 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-44qkl"] Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.332366 4947 scope.go:117] "RemoveContainer" containerID="f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.333937 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-44qkl"] Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.347460 4947 scope.go:117] "RemoveContainer" containerID="ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.368019 4947 scope.go:117] "RemoveContainer" containerID="ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143" Dec 03 08:40:11 crc kubenswrapper[4947]: E1203 08:40:11.368458 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143\": container with ID starting with ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143 not found: ID does not exist" containerID="ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.368505 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143"} err="failed to get container status \"ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143\": rpc error: code = NotFound desc = could not find container \"ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143\": container with ID starting with ee609b330ec88d829cf4c6224c3ea599fdac0921b1a3f1e034bfc0cf96556143 not found: ID does not exist" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.368532 4947 scope.go:117] "RemoveContainer" containerID="f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74" Dec 03 08:40:11 crc kubenswrapper[4947]: E1203 08:40:11.368857 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74\": container with ID starting with f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74 not found: ID does not exist" containerID="f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.368908 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74"} err="failed to get container status \"f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74\": rpc error: code = NotFound desc = could not find container \"f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74\": container with ID 
starting with f2de3772cc8065960ffa888a4182aed7276e9c727597c5e14b49da390cb42d74 not found: ID does not exist" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.368958 4947 scope.go:117] "RemoveContainer" containerID="ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e" Dec 03 08:40:11 crc kubenswrapper[4947]: E1203 08:40:11.369406 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e\": container with ID starting with ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e not found: ID does not exist" containerID="ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e" Dec 03 08:40:11 crc kubenswrapper[4947]: I1203 08:40:11.369433 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e"} err="failed to get container status \"ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e\": rpc error: code = NotFound desc = could not find container \"ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e\": container with ID starting with ddb44826c8bba6cb2a825c031ffc5491a6978b0d07f613371599515867bb9b3e not found: ID does not exist" Dec 03 08:40:13 crc kubenswrapper[4947]: I1203 08:40:13.095340 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" path="/var/lib/kubelet/pods/2c537da8-2bdd-488d-9edc-50c08b67ae9c/volumes" Dec 03 08:42:00 crc kubenswrapper[4947]: I1203 08:42:00.086125 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:42:00 crc kubenswrapper[4947]: I1203 
08:42:00.086645 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:42:30 crc kubenswrapper[4947]: I1203 08:42:30.087020 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:42:30 crc kubenswrapper[4947]: I1203 08:42:30.087702 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:43:00 crc kubenswrapper[4947]: I1203 08:43:00.093148 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:43:00 crc kubenswrapper[4947]: I1203 08:43:00.093765 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:43:00 crc kubenswrapper[4947]: I1203 08:43:00.093830 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:43:00 crc kubenswrapper[4947]: I1203 08:43:00.094879 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f497ed6e66f07c3659d612e4d03fc5594262467811b840160071934dd1dd8bac"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:43:00 crc kubenswrapper[4947]: I1203 08:43:00.094966 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://f497ed6e66f07c3659d612e4d03fc5594262467811b840160071934dd1dd8bac" gracePeriod=600 Dec 03 08:43:00 crc kubenswrapper[4947]: I1203 08:43:00.625013 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="f497ed6e66f07c3659d612e4d03fc5594262467811b840160071934dd1dd8bac" exitCode=0 Dec 03 08:43:00 crc kubenswrapper[4947]: I1203 08:43:00.625095 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"f497ed6e66f07c3659d612e4d03fc5594262467811b840160071934dd1dd8bac"} Dec 03 08:43:00 crc kubenswrapper[4947]: I1203 08:43:00.625421 4947 scope.go:117] "RemoveContainer" containerID="f6e7f9061299a052f433679fddb3c081193b1810c9ca787959d11c044c580a68" Dec 03 08:43:01 crc kubenswrapper[4947]: I1203 08:43:01.633693 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38"} Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.145531 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb"] Dec 03 08:45:00 crc kubenswrapper[4947]: E1203 08:45:00.146396 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerName="extract-content" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.146414 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerName="extract-content" Dec 03 08:45:00 crc kubenswrapper[4947]: E1203 08:45:00.146444 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerName="registry-server" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.146452 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerName="registry-server" Dec 03 08:45:00 crc kubenswrapper[4947]: E1203 08:45:00.146477 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerName="extract-utilities" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.146486 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerName="extract-utilities" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.146666 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c537da8-2bdd-488d-9edc-50c08b67ae9c" containerName="registry-server" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.147259 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.149509 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.150573 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.163031 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb"] Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.304734 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6vp9\" (UniqueName: \"kubernetes.io/projected/603e2092-218a-4357-9eba-a012f333e76f-kube-api-access-g6vp9\") pod \"collect-profiles-29412525-lpltb\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.304911 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/603e2092-218a-4357-9eba-a012f333e76f-config-volume\") pod \"collect-profiles-29412525-lpltb\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.304952 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/603e2092-218a-4357-9eba-a012f333e76f-secret-volume\") pod \"collect-profiles-29412525-lpltb\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.406191 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/603e2092-218a-4357-9eba-a012f333e76f-secret-volume\") pod \"collect-profiles-29412525-lpltb\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.406370 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6vp9\" (UniqueName: \"kubernetes.io/projected/603e2092-218a-4357-9eba-a012f333e76f-kube-api-access-g6vp9\") pod \"collect-profiles-29412525-lpltb\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.406575 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/603e2092-218a-4357-9eba-a012f333e76f-config-volume\") pod \"collect-profiles-29412525-lpltb\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.407378 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/603e2092-218a-4357-9eba-a012f333e76f-config-volume\") pod \"collect-profiles-29412525-lpltb\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.423186 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/603e2092-218a-4357-9eba-a012f333e76f-secret-volume\") pod \"collect-profiles-29412525-lpltb\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.434364 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6vp9\" (UniqueName: \"kubernetes.io/projected/603e2092-218a-4357-9eba-a012f333e76f-kube-api-access-g6vp9\") pod \"collect-profiles-29412525-lpltb\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.468359 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:00 crc kubenswrapper[4947]: I1203 08:45:00.889054 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb"] Dec 03 08:45:01 crc kubenswrapper[4947]: I1203 08:45:01.608707 4947 generic.go:334] "Generic (PLEG): container finished" podID="603e2092-218a-4357-9eba-a012f333e76f" containerID="36a6286e97b68a60233985e746e6f8392c373cb68563221b51e6f7e4d79ab9bb" exitCode=0 Dec 03 08:45:01 crc kubenswrapper[4947]: I1203 08:45:01.608826 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" event={"ID":"603e2092-218a-4357-9eba-a012f333e76f","Type":"ContainerDied","Data":"36a6286e97b68a60233985e746e6f8392c373cb68563221b51e6f7e4d79ab9bb"} Dec 03 08:45:01 crc kubenswrapper[4947]: I1203 08:45:01.608979 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" 
event={"ID":"603e2092-218a-4357-9eba-a012f333e76f","Type":"ContainerStarted","Data":"67ad6ae09b92f603fccec2ff354b975ab756e70c8250a3a2ec52438c05ef042b"} Dec 03 08:45:02 crc kubenswrapper[4947]: I1203 08:45:02.948454 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.045533 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/603e2092-218a-4357-9eba-a012f333e76f-secret-volume\") pod \"603e2092-218a-4357-9eba-a012f333e76f\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.045612 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/603e2092-218a-4357-9eba-a012f333e76f-config-volume\") pod \"603e2092-218a-4357-9eba-a012f333e76f\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.045646 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6vp9\" (UniqueName: \"kubernetes.io/projected/603e2092-218a-4357-9eba-a012f333e76f-kube-api-access-g6vp9\") pod \"603e2092-218a-4357-9eba-a012f333e76f\" (UID: \"603e2092-218a-4357-9eba-a012f333e76f\") " Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.046562 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/603e2092-218a-4357-9eba-a012f333e76f-config-volume" (OuterVolumeSpecName: "config-volume") pod "603e2092-218a-4357-9eba-a012f333e76f" (UID: "603e2092-218a-4357-9eba-a012f333e76f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.050593 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/603e2092-218a-4357-9eba-a012f333e76f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "603e2092-218a-4357-9eba-a012f333e76f" (UID: "603e2092-218a-4357-9eba-a012f333e76f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.050598 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603e2092-218a-4357-9eba-a012f333e76f-kube-api-access-g6vp9" (OuterVolumeSpecName: "kube-api-access-g6vp9") pod "603e2092-218a-4357-9eba-a012f333e76f" (UID: "603e2092-218a-4357-9eba-a012f333e76f"). InnerVolumeSpecName "kube-api-access-g6vp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.146963 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/603e2092-218a-4357-9eba-a012f333e76f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.147001 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6vp9\" (UniqueName: \"kubernetes.io/projected/603e2092-218a-4357-9eba-a012f333e76f-kube-api-access-g6vp9\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.147016 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/603e2092-218a-4357-9eba-a012f333e76f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.629851 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" 
event={"ID":"603e2092-218a-4357-9eba-a012f333e76f","Type":"ContainerDied","Data":"67ad6ae09b92f603fccec2ff354b975ab756e70c8250a3a2ec52438c05ef042b"} Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.630182 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67ad6ae09b92f603fccec2ff354b975ab756e70c8250a3a2ec52438c05ef042b" Dec 03 08:45:03 crc kubenswrapper[4947]: I1203 08:45:03.629928 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb" Dec 03 08:45:04 crc kubenswrapper[4947]: I1203 08:45:04.011906 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg"] Dec 03 08:45:04 crc kubenswrapper[4947]: I1203 08:45:04.018518 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-clkgg"] Dec 03 08:45:05 crc kubenswrapper[4947]: I1203 08:45:05.093039 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be921a3d-da61-477b-a169-851809fdbad4" path="/var/lib/kubelet/pods/be921a3d-da61-477b-a169-851809fdbad4/volumes" Dec 03 08:45:06 crc kubenswrapper[4947]: I1203 08:45:06.626443 4947 scope.go:117] "RemoveContainer" containerID="0a0341d26c855aee8f795e55171f786ff29d990ba4f86028d6e4d327450248e7" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.255333 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-npzts"] Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.265629 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-npzts"] Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.388915 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ffgx5"] Dec 03 08:45:10 crc kubenswrapper[4947]: E1203 08:45:10.391466 4947 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="603e2092-218a-4357-9eba-a012f333e76f" containerName="collect-profiles" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.391515 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="603e2092-218a-4357-9eba-a012f333e76f" containerName="collect-profiles" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.391717 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="603e2092-218a-4357-9eba-a012f333e76f" containerName="collect-profiles" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.392345 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.395863 4947 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-26pm6" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.395885 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.395909 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.396241 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.399834 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ffgx5"] Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.455990 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx76l\" (UniqueName: \"kubernetes.io/projected/6252c949-99d1-4dec-9d30-86eacd0ba4a1-kube-api-access-wx76l\") pod \"crc-storage-crc-ffgx5\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.456152 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6252c949-99d1-4dec-9d30-86eacd0ba4a1-node-mnt\") pod \"crc-storage-crc-ffgx5\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.456327 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6252c949-99d1-4dec-9d30-86eacd0ba4a1-crc-storage\") pod \"crc-storage-crc-ffgx5\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.557680 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6252c949-99d1-4dec-9d30-86eacd0ba4a1-node-mnt\") pod \"crc-storage-crc-ffgx5\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.557750 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6252c949-99d1-4dec-9d30-86eacd0ba4a1-crc-storage\") pod \"crc-storage-crc-ffgx5\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.557835 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx76l\" (UniqueName: \"kubernetes.io/projected/6252c949-99d1-4dec-9d30-86eacd0ba4a1-kube-api-access-wx76l\") pod \"crc-storage-crc-ffgx5\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.558057 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/6252c949-99d1-4dec-9d30-86eacd0ba4a1-node-mnt\") pod \"crc-storage-crc-ffgx5\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.558734 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6252c949-99d1-4dec-9d30-86eacd0ba4a1-crc-storage\") pod \"crc-storage-crc-ffgx5\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.577685 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx76l\" (UniqueName: \"kubernetes.io/projected/6252c949-99d1-4dec-9d30-86eacd0ba4a1-kube-api-access-wx76l\") pod \"crc-storage-crc-ffgx5\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:10 crc kubenswrapper[4947]: I1203 08:45:10.719549 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:11 crc kubenswrapper[4947]: I1203 08:45:11.095596 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f803f9ab-0e25-403d-9a18-07a7ed80f160" path="/var/lib/kubelet/pods/f803f9ab-0e25-403d-9a18-07a7ed80f160/volumes" Dec 03 08:45:11 crc kubenswrapper[4947]: I1203 08:45:11.249215 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ffgx5"] Dec 03 08:45:11 crc kubenswrapper[4947]: I1203 08:45:11.252600 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:45:11 crc kubenswrapper[4947]: I1203 08:45:11.689143 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ffgx5" event={"ID":"6252c949-99d1-4dec-9d30-86eacd0ba4a1","Type":"ContainerStarted","Data":"1cf11ce60420abad636a8b4ff184afb21df7df5bc356e812a442b2e6a7e7a895"} Dec 03 08:45:12 crc kubenswrapper[4947]: I1203 08:45:12.698237 4947 generic.go:334] "Generic (PLEG): container finished" podID="6252c949-99d1-4dec-9d30-86eacd0ba4a1" containerID="6661ac4931f278258e2e252037c046f7b142547ed725d9d5274ffe73fb15e5f0" exitCode=0 Dec 03 08:45:12 crc kubenswrapper[4947]: I1203 08:45:12.698292 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ffgx5" event={"ID":"6252c949-99d1-4dec-9d30-86eacd0ba4a1","Type":"ContainerDied","Data":"6661ac4931f278258e2e252037c046f7b142547ed725d9d5274ffe73fb15e5f0"} Dec 03 08:45:13 crc kubenswrapper[4947]: I1203 08:45:13.956216 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.017304 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6252c949-99d1-4dec-9d30-86eacd0ba4a1-node-mnt\") pod \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.017389 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx76l\" (UniqueName: \"kubernetes.io/projected/6252c949-99d1-4dec-9d30-86eacd0ba4a1-kube-api-access-wx76l\") pod \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.017425 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6252c949-99d1-4dec-9d30-86eacd0ba4a1-crc-storage\") pod \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\" (UID: \"6252c949-99d1-4dec-9d30-86eacd0ba4a1\") " Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.017418 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6252c949-99d1-4dec-9d30-86eacd0ba4a1-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "6252c949-99d1-4dec-9d30-86eacd0ba4a1" (UID: "6252c949-99d1-4dec-9d30-86eacd0ba4a1"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.017763 4947 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/6252c949-99d1-4dec-9d30-86eacd0ba4a1-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.024766 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6252c949-99d1-4dec-9d30-86eacd0ba4a1-kube-api-access-wx76l" (OuterVolumeSpecName: "kube-api-access-wx76l") pod "6252c949-99d1-4dec-9d30-86eacd0ba4a1" (UID: "6252c949-99d1-4dec-9d30-86eacd0ba4a1"). InnerVolumeSpecName "kube-api-access-wx76l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.036548 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6252c949-99d1-4dec-9d30-86eacd0ba4a1-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "6252c949-99d1-4dec-9d30-86eacd0ba4a1" (UID: "6252c949-99d1-4dec-9d30-86eacd0ba4a1"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.119680 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx76l\" (UniqueName: \"kubernetes.io/projected/6252c949-99d1-4dec-9d30-86eacd0ba4a1-kube-api-access-wx76l\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.119723 4947 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/6252c949-99d1-4dec-9d30-86eacd0ba4a1-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.710824 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ffgx5" event={"ID":"6252c949-99d1-4dec-9d30-86eacd0ba4a1","Type":"ContainerDied","Data":"1cf11ce60420abad636a8b4ff184afb21df7df5bc356e812a442b2e6a7e7a895"} Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.711085 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf11ce60420abad636a8b4ff184afb21df7df5bc356e812a442b2e6a7e7a895" Dec 03 08:45:14 crc kubenswrapper[4947]: I1203 08:45:14.710877 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ffgx5" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.232424 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ffgx5"] Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.238145 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ffgx5"] Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.382144 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9jllg"] Dec 03 08:45:16 crc kubenswrapper[4947]: E1203 08:45:16.382575 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6252c949-99d1-4dec-9d30-86eacd0ba4a1" containerName="storage" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.382599 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6252c949-99d1-4dec-9d30-86eacd0ba4a1" containerName="storage" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.382860 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6252c949-99d1-4dec-9d30-86eacd0ba4a1" containerName="storage" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.383462 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.385650 4947 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-26pm6" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.385807 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.385880 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.391584 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.409303 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9jllg"] Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.454916 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-node-mnt\") pod \"crc-storage-crc-9jllg\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.454980 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-crc-storage\") pod \"crc-storage-crc-9jllg\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.455015 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnddp\" (UniqueName: \"kubernetes.io/projected/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-kube-api-access-wnddp\") pod \"crc-storage-crc-9jllg\" (UID: 
\"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.556837 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-node-mnt\") pod \"crc-storage-crc-9jllg\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.556891 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-crc-storage\") pod \"crc-storage-crc-9jllg\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.556922 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnddp\" (UniqueName: \"kubernetes.io/projected/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-kube-api-access-wnddp\") pod \"crc-storage-crc-9jllg\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.557233 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-node-mnt\") pod \"crc-storage-crc-9jllg\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.557792 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-crc-storage\") pod \"crc-storage-crc-9jllg\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.578273 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnddp\" (UniqueName: \"kubernetes.io/projected/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-kube-api-access-wnddp\") pod \"crc-storage-crc-9jllg\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:16 crc kubenswrapper[4947]: I1203 08:45:16.705601 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.098894 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6252c949-99d1-4dec-9d30-86eacd0ba4a1" path="/var/lib/kubelet/pods/6252c949-99d1-4dec-9d30-86eacd0ba4a1/volumes" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.171975 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9jllg"] Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.241955 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gwx6s"] Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.244096 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.253959 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwx6s"] Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.368697 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-utilities\") pod \"certified-operators-gwx6s\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.368790 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7dn9\" (UniqueName: \"kubernetes.io/projected/55906082-135b-4b31-a279-014052b0c008-kube-api-access-m7dn9\") pod \"certified-operators-gwx6s\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.368828 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-catalog-content\") pod \"certified-operators-gwx6s\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.470607 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-utilities\") pod \"certified-operators-gwx6s\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.470688 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m7dn9\" (UniqueName: \"kubernetes.io/projected/55906082-135b-4b31-a279-014052b0c008-kube-api-access-m7dn9\") pod \"certified-operators-gwx6s\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.470724 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-catalog-content\") pod \"certified-operators-gwx6s\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.471339 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-catalog-content\") pod \"certified-operators-gwx6s\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.471561 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-utilities\") pod \"certified-operators-gwx6s\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.493614 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7dn9\" (UniqueName: \"kubernetes.io/projected/55906082-135b-4b31-a279-014052b0c008-kube-api-access-m7dn9\") pod \"certified-operators-gwx6s\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.575718 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:17 crc kubenswrapper[4947]: I1203 08:45:17.741148 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9jllg" event={"ID":"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6","Type":"ContainerStarted","Data":"f390185bb1e9d7822ba5bfce0e73a4ae640385bf37fa343e01083bfe40bccca9"} Dec 03 08:45:18 crc kubenswrapper[4947]: I1203 08:45:18.042067 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gwx6s"] Dec 03 08:45:18 crc kubenswrapper[4947]: W1203 08:45:18.046644 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55906082_135b_4b31_a279_014052b0c008.slice/crio-d296502da3d67f7c02422511103346cd013ca1e02f3d4d2eabb8be1a167d9d6b WatchSource:0}: Error finding container d296502da3d67f7c02422511103346cd013ca1e02f3d4d2eabb8be1a167d9d6b: Status 404 returned error can't find the container with id d296502da3d67f7c02422511103346cd013ca1e02f3d4d2eabb8be1a167d9d6b Dec 03 08:45:18 crc kubenswrapper[4947]: I1203 08:45:18.755038 4947 generic.go:334] "Generic (PLEG): container finished" podID="55906082-135b-4b31-a279-014052b0c008" containerID="dca525b2c212ebfc282e0f6a465e7275d7f278f683e47406ebbb76f4e82f7567" exitCode=0 Dec 03 08:45:18 crc kubenswrapper[4947]: I1203 08:45:18.755100 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwx6s" event={"ID":"55906082-135b-4b31-a279-014052b0c008","Type":"ContainerDied","Data":"dca525b2c212ebfc282e0f6a465e7275d7f278f683e47406ebbb76f4e82f7567"} Dec 03 08:45:18 crc kubenswrapper[4947]: I1203 08:45:18.756533 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwx6s" 
event={"ID":"55906082-135b-4b31-a279-014052b0c008","Type":"ContainerStarted","Data":"d296502da3d67f7c02422511103346cd013ca1e02f3d4d2eabb8be1a167d9d6b"} Dec 03 08:45:18 crc kubenswrapper[4947]: I1203 08:45:18.761904 4947 generic.go:334] "Generic (PLEG): container finished" podID="adb116dd-88d8-44b7-9cf0-4bfc5862e8d6" containerID="d6fe653cde81d3172b31ea0160f95f97a1c9e52b5fada134066ab0e81f1a51eb" exitCode=0 Dec 03 08:45:18 crc kubenswrapper[4947]: I1203 08:45:18.761938 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9jllg" event={"ID":"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6","Type":"ContainerDied","Data":"d6fe653cde81d3172b31ea0160f95f97a1c9e52b5fada134066ab0e81f1a51eb"} Dec 03 08:45:19 crc kubenswrapper[4947]: I1203 08:45:19.785708 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwx6s" event={"ID":"55906082-135b-4b31-a279-014052b0c008","Type":"ContainerStarted","Data":"b395a856fb94963313325d5d7e7f35dd11f4476b990978a97ebec76aadb6efee"} Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.149704 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.214565 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-crc-storage\") pod \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.215068 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-node-mnt\") pod \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.215097 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnddp\" (UniqueName: \"kubernetes.io/projected/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-kube-api-access-wnddp\") pod \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\" (UID: \"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6\") " Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.215259 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "adb116dd-88d8-44b7-9cf0-4bfc5862e8d6" (UID: "adb116dd-88d8-44b7-9cf0-4bfc5862e8d6"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.215434 4947 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.220784 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-kube-api-access-wnddp" (OuterVolumeSpecName: "kube-api-access-wnddp") pod "adb116dd-88d8-44b7-9cf0-4bfc5862e8d6" (UID: "adb116dd-88d8-44b7-9cf0-4bfc5862e8d6"). InnerVolumeSpecName "kube-api-access-wnddp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.237752 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "adb116dd-88d8-44b7-9cf0-4bfc5862e8d6" (UID: "adb116dd-88d8-44b7-9cf0-4bfc5862e8d6"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.317196 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnddp\" (UniqueName: \"kubernetes.io/projected/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-kube-api-access-wnddp\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.317254 4947 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/adb116dd-88d8-44b7-9cf0-4bfc5862e8d6-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.790705 4947 generic.go:334] "Generic (PLEG): container finished" podID="55906082-135b-4b31-a279-014052b0c008" containerID="b395a856fb94963313325d5d7e7f35dd11f4476b990978a97ebec76aadb6efee" exitCode=0 Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.790824 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwx6s" event={"ID":"55906082-135b-4b31-a279-014052b0c008","Type":"ContainerDied","Data":"b395a856fb94963313325d5d7e7f35dd11f4476b990978a97ebec76aadb6efee"} Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.792356 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9jllg" event={"ID":"adb116dd-88d8-44b7-9cf0-4bfc5862e8d6","Type":"ContainerDied","Data":"f390185bb1e9d7822ba5bfce0e73a4ae640385bf37fa343e01083bfe40bccca9"} Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.792401 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f390185bb1e9d7822ba5bfce0e73a4ae640385bf37fa343e01083bfe40bccca9" Dec 03 08:45:20 crc kubenswrapper[4947]: I1203 08:45:20.792465 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9jllg" Dec 03 08:45:21 crc kubenswrapper[4947]: I1203 08:45:21.802039 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwx6s" event={"ID":"55906082-135b-4b31-a279-014052b0c008","Type":"ContainerStarted","Data":"3cbb461e7385f0fe37534a7611b50795363f370c984b8146eb85fdf624ba8345"} Dec 03 08:45:21 crc kubenswrapper[4947]: I1203 08:45:21.826786 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gwx6s" podStartSLOduration=2.205138476 podStartE2EDuration="4.826757451s" podCreationTimestamp="2025-12-03 08:45:17 +0000 UTC" firstStartedPulling="2025-12-03 08:45:18.757439711 +0000 UTC m=+6980.018394147" lastFinishedPulling="2025-12-03 08:45:21.379058686 +0000 UTC m=+6982.640013122" observedRunningTime="2025-12-03 08:45:21.820969965 +0000 UTC m=+6983.081924391" watchObservedRunningTime="2025-12-03 08:45:21.826757451 +0000 UTC m=+6983.087711887" Dec 03 08:45:27 crc kubenswrapper[4947]: I1203 08:45:27.577000 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:27 crc kubenswrapper[4947]: I1203 08:45:27.578034 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:27 crc kubenswrapper[4947]: I1203 08:45:27.643247 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:27 crc kubenswrapper[4947]: I1203 08:45:27.913071 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:27 crc kubenswrapper[4947]: I1203 08:45:27.968190 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwx6s"] Dec 03 08:45:29 crc kubenswrapper[4947]: 
I1203 08:45:29.866956 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gwx6s" podUID="55906082-135b-4b31-a279-014052b0c008" containerName="registry-server" containerID="cri-o://3cbb461e7385f0fe37534a7611b50795363f370c984b8146eb85fdf624ba8345" gracePeriod=2 Dec 03 08:45:30 crc kubenswrapper[4947]: I1203 08:45:30.086652 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:45:30 crc kubenswrapper[4947]: I1203 08:45:30.086773 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:45:30 crc kubenswrapper[4947]: I1203 08:45:30.878952 4947 generic.go:334] "Generic (PLEG): container finished" podID="55906082-135b-4b31-a279-014052b0c008" containerID="3cbb461e7385f0fe37534a7611b50795363f370c984b8146eb85fdf624ba8345" exitCode=0 Dec 03 08:45:30 crc kubenswrapper[4947]: I1203 08:45:30.879022 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwx6s" event={"ID":"55906082-135b-4b31-a279-014052b0c008","Type":"ContainerDied","Data":"3cbb461e7385f0fe37534a7611b50795363f370c984b8146eb85fdf624ba8345"} Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.442642 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.595922 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-catalog-content\") pod \"55906082-135b-4b31-a279-014052b0c008\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.596009 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7dn9\" (UniqueName: \"kubernetes.io/projected/55906082-135b-4b31-a279-014052b0c008-kube-api-access-m7dn9\") pod \"55906082-135b-4b31-a279-014052b0c008\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.596051 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-utilities\") pod \"55906082-135b-4b31-a279-014052b0c008\" (UID: \"55906082-135b-4b31-a279-014052b0c008\") " Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.599085 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-utilities" (OuterVolumeSpecName: "utilities") pod "55906082-135b-4b31-a279-014052b0c008" (UID: "55906082-135b-4b31-a279-014052b0c008"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.602904 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55906082-135b-4b31-a279-014052b0c008-kube-api-access-m7dn9" (OuterVolumeSpecName: "kube-api-access-m7dn9") pod "55906082-135b-4b31-a279-014052b0c008" (UID: "55906082-135b-4b31-a279-014052b0c008"). InnerVolumeSpecName "kube-api-access-m7dn9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.671293 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55906082-135b-4b31-a279-014052b0c008" (UID: "55906082-135b-4b31-a279-014052b0c008"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.697935 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.697970 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7dn9\" (UniqueName: \"kubernetes.io/projected/55906082-135b-4b31-a279-014052b0c008-kube-api-access-m7dn9\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.697993 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55906082-135b-4b31-a279-014052b0c008-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.888761 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gwx6s" event={"ID":"55906082-135b-4b31-a279-014052b0c008","Type":"ContainerDied","Data":"d296502da3d67f7c02422511103346cd013ca1e02f3d4d2eabb8be1a167d9d6b"} Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.888816 4947 scope.go:117] "RemoveContainer" containerID="3cbb461e7385f0fe37534a7611b50795363f370c984b8146eb85fdf624ba8345" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.888824 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gwx6s" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.932057 4947 scope.go:117] "RemoveContainer" containerID="b395a856fb94963313325d5d7e7f35dd11f4476b990978a97ebec76aadb6efee" Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.935822 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gwx6s"] Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.947620 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gwx6s"] Dec 03 08:45:31 crc kubenswrapper[4947]: I1203 08:45:31.958798 4947 scope.go:117] "RemoveContainer" containerID="dca525b2c212ebfc282e0f6a465e7275d7f278f683e47406ebbb76f4e82f7567" Dec 03 08:45:33 crc kubenswrapper[4947]: I1203 08:45:33.099213 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55906082-135b-4b31-a279-014052b0c008" path="/var/lib/kubelet/pods/55906082-135b-4b31-a279-014052b0c008/volumes" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.638255 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kgzxm"] Dec 03 08:45:54 crc kubenswrapper[4947]: E1203 08:45:54.639134 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55906082-135b-4b31-a279-014052b0c008" containerName="extract-content" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.639148 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="55906082-135b-4b31-a279-014052b0c008" containerName="extract-content" Dec 03 08:45:54 crc kubenswrapper[4947]: E1203 08:45:54.639174 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55906082-135b-4b31-a279-014052b0c008" containerName="extract-utilities" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.639181 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="55906082-135b-4b31-a279-014052b0c008" containerName="extract-utilities" 
Dec 03 08:45:54 crc kubenswrapper[4947]: E1203 08:45:54.639201 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55906082-135b-4b31-a279-014052b0c008" containerName="registry-server" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.639208 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="55906082-135b-4b31-a279-014052b0c008" containerName="registry-server" Dec 03 08:45:54 crc kubenswrapper[4947]: E1203 08:45:54.639220 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adb116dd-88d8-44b7-9cf0-4bfc5862e8d6" containerName="storage" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.639227 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="adb116dd-88d8-44b7-9cf0-4bfc5862e8d6" containerName="storage" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.639403 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="adb116dd-88d8-44b7-9cf0-4bfc5862e8d6" containerName="storage" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.639425 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="55906082-135b-4b31-a279-014052b0c008" containerName="registry-server" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.640611 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.658525 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kgzxm"] Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.746919 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwscm\" (UniqueName: \"kubernetes.io/projected/84bc54d3-114e-4634-b330-c033e3079247-kube-api-access-vwscm\") pod \"redhat-operators-kgzxm\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.746997 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-utilities\") pod \"redhat-operators-kgzxm\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.747065 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-catalog-content\") pod \"redhat-operators-kgzxm\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.848206 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwscm\" (UniqueName: \"kubernetes.io/projected/84bc54d3-114e-4634-b330-c033e3079247-kube-api-access-vwscm\") pod \"redhat-operators-kgzxm\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.848329 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-utilities\") pod \"redhat-operators-kgzxm\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.848441 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-catalog-content\") pod \"redhat-operators-kgzxm\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.849157 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-utilities\") pod \"redhat-operators-kgzxm\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.849157 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-catalog-content\") pod \"redhat-operators-kgzxm\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.874469 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwscm\" (UniqueName: \"kubernetes.io/projected/84bc54d3-114e-4634-b330-c033e3079247-kube-api-access-vwscm\") pod \"redhat-operators-kgzxm\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:54 crc kubenswrapper[4947]: I1203 08:45:54.961734 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:45:55 crc kubenswrapper[4947]: I1203 08:45:55.403505 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kgzxm"] Dec 03 08:45:56 crc kubenswrapper[4947]: I1203 08:45:56.107838 4947 generic.go:334] "Generic (PLEG): container finished" podID="84bc54d3-114e-4634-b330-c033e3079247" containerID="0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6" exitCode=0 Dec 03 08:45:56 crc kubenswrapper[4947]: I1203 08:45:56.107876 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgzxm" event={"ID":"84bc54d3-114e-4634-b330-c033e3079247","Type":"ContainerDied","Data":"0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6"} Dec 03 08:45:56 crc kubenswrapper[4947]: I1203 08:45:56.108152 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgzxm" event={"ID":"84bc54d3-114e-4634-b330-c033e3079247","Type":"ContainerStarted","Data":"570c808626e2f5a9badae207b48fd9b2f87daf9a36a1daa7fad7c4d392473e7b"} Dec 03 08:45:57 crc kubenswrapper[4947]: I1203 08:45:57.114810 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgzxm" event={"ID":"84bc54d3-114e-4634-b330-c033e3079247","Type":"ContainerStarted","Data":"c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505"} Dec 03 08:45:58 crc kubenswrapper[4947]: I1203 08:45:58.125883 4947 generic.go:334] "Generic (PLEG): container finished" podID="84bc54d3-114e-4634-b330-c033e3079247" containerID="c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505" exitCode=0 Dec 03 08:45:58 crc kubenswrapper[4947]: I1203 08:45:58.125944 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgzxm" 
event={"ID":"84bc54d3-114e-4634-b330-c033e3079247","Type":"ContainerDied","Data":"c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505"} Dec 03 08:45:59 crc kubenswrapper[4947]: I1203 08:45:59.135639 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgzxm" event={"ID":"84bc54d3-114e-4634-b330-c033e3079247","Type":"ContainerStarted","Data":"bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2"} Dec 03 08:45:59 crc kubenswrapper[4947]: I1203 08:45:59.161109 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kgzxm" podStartSLOduration=2.419339134 podStartE2EDuration="5.161086474s" podCreationTimestamp="2025-12-03 08:45:54 +0000 UTC" firstStartedPulling="2025-12-03 08:45:56.109372739 +0000 UTC m=+7017.370327165" lastFinishedPulling="2025-12-03 08:45:58.851120079 +0000 UTC m=+7020.112074505" observedRunningTime="2025-12-03 08:45:59.153224831 +0000 UTC m=+7020.414179277" watchObservedRunningTime="2025-12-03 08:45:59.161086474 +0000 UTC m=+7020.422040920" Dec 03 08:46:00 crc kubenswrapper[4947]: I1203 08:46:00.086717 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:46:00 crc kubenswrapper[4947]: I1203 08:46:00.086789 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:46:04 crc kubenswrapper[4947]: I1203 08:46:04.962869 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:46:04 crc kubenswrapper[4947]: I1203 08:46:04.963570 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:46:05 crc kubenswrapper[4947]: I1203 08:46:05.013570 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:46:05 crc kubenswrapper[4947]: I1203 08:46:05.265840 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:46:05 crc kubenswrapper[4947]: I1203 08:46:05.324631 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kgzxm"] Dec 03 08:46:06 crc kubenswrapper[4947]: I1203 08:46:06.673076 4947 scope.go:117] "RemoveContainer" containerID="412c634436939ef4a8a4636d15751704c5c6f71a3607e5980bb469d235937256" Dec 03 08:46:07 crc kubenswrapper[4947]: I1203 08:46:07.211557 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kgzxm" podUID="84bc54d3-114e-4634-b330-c033e3079247" containerName="registry-server" containerID="cri-o://bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2" gracePeriod=2 Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.142862 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.220425 4947 generic.go:334] "Generic (PLEG): container finished" podID="84bc54d3-114e-4634-b330-c033e3079247" containerID="bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2" exitCode=0 Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.220471 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgzxm" event={"ID":"84bc54d3-114e-4634-b330-c033e3079247","Type":"ContainerDied","Data":"bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2"} Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.220525 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kgzxm" event={"ID":"84bc54d3-114e-4634-b330-c033e3079247","Type":"ContainerDied","Data":"570c808626e2f5a9badae207b48fd9b2f87daf9a36a1daa7fad7c4d392473e7b"} Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.220547 4947 scope.go:117] "RemoveContainer" containerID="bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.220556 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kgzxm" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.237410 4947 scope.go:117] "RemoveContainer" containerID="c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.268473 4947 scope.go:117] "RemoveContainer" containerID="0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.275592 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-catalog-content\") pod \"84bc54d3-114e-4634-b330-c033e3079247\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.275656 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-utilities\") pod \"84bc54d3-114e-4634-b330-c033e3079247\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.275793 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwscm\" (UniqueName: \"kubernetes.io/projected/84bc54d3-114e-4634-b330-c033e3079247-kube-api-access-vwscm\") pod \"84bc54d3-114e-4634-b330-c033e3079247\" (UID: \"84bc54d3-114e-4634-b330-c033e3079247\") " Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.277126 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-utilities" (OuterVolumeSpecName: "utilities") pod "84bc54d3-114e-4634-b330-c033e3079247" (UID: "84bc54d3-114e-4634-b330-c033e3079247"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.280701 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84bc54d3-114e-4634-b330-c033e3079247-kube-api-access-vwscm" (OuterVolumeSpecName: "kube-api-access-vwscm") pod "84bc54d3-114e-4634-b330-c033e3079247" (UID: "84bc54d3-114e-4634-b330-c033e3079247"). InnerVolumeSpecName "kube-api-access-vwscm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.321631 4947 scope.go:117] "RemoveContainer" containerID="bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2" Dec 03 08:46:09 crc kubenswrapper[4947]: E1203 08:46:08.322111 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2\": container with ID starting with bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2 not found: ID does not exist" containerID="bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.322165 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2"} err="failed to get container status \"bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2\": rpc error: code = NotFound desc = could not find container \"bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2\": container with ID starting with bd069602086407325d35f7adf16805f6df94cba6f5f606a48a58f0369e613ed2 not found: ID does not exist" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.322198 4947 scope.go:117] "RemoveContainer" containerID="c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505" Dec 03 08:46:09 crc kubenswrapper[4947]: E1203 08:46:08.322760 
4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505\": container with ID starting with c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505 not found: ID does not exist" containerID="c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.322787 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505"} err="failed to get container status \"c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505\": rpc error: code = NotFound desc = could not find container \"c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505\": container with ID starting with c11378080e3111b0b44525aebf3caba42882e69770671ddb5876d4a416fef505 not found: ID does not exist" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.322804 4947 scope.go:117] "RemoveContainer" containerID="0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6" Dec 03 08:46:09 crc kubenswrapper[4947]: E1203 08:46:08.323172 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6\": container with ID starting with 0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6 not found: ID does not exist" containerID="0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.323203 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6"} err="failed to get container status \"0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6\": rpc error: code = 
NotFound desc = could not find container \"0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6\": container with ID starting with 0a36f5a946f7fa4adededfa63c4a25d30fb54f791efdac5cc9dfd87f0dc58dc6 not found: ID does not exist" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.377593 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:08.377631 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwscm\" (UniqueName: \"kubernetes.io/projected/84bc54d3-114e-4634-b330-c033e3079247-kube-api-access-vwscm\") on node \"crc\" DevicePath \"\"" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:09.463616 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84bc54d3-114e-4634-b330-c033e3079247" (UID: "84bc54d3-114e-4634-b330-c033e3079247"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:09.492920 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84bc54d3-114e-4634-b330-c033e3079247-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:09.753627 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kgzxm"] Dec 03 08:46:09 crc kubenswrapper[4947]: I1203 08:46:09.759070 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kgzxm"] Dec 03 08:46:11 crc kubenswrapper[4947]: I1203 08:46:11.093400 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84bc54d3-114e-4634-b330-c033e3079247" path="/var/lib/kubelet/pods/84bc54d3-114e-4634-b330-c033e3079247/volumes" Dec 03 08:46:30 crc kubenswrapper[4947]: I1203 08:46:30.086784 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:46:30 crc kubenswrapper[4947]: I1203 08:46:30.087361 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:46:30 crc kubenswrapper[4947]: I1203 08:46:30.087410 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:46:30 crc kubenswrapper[4947]: I1203 08:46:30.088055 4947 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:46:30 crc kubenswrapper[4947]: I1203 08:46:30.088127 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" gracePeriod=600 Dec 03 08:46:30 crc kubenswrapper[4947]: E1203 08:46:30.217906 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:46:30 crc kubenswrapper[4947]: I1203 08:46:30.413230 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" exitCode=0 Dec 03 08:46:30 crc kubenswrapper[4947]: I1203 08:46:30.413285 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38"} Dec 03 08:46:30 crc kubenswrapper[4947]: I1203 08:46:30.413324 4947 scope.go:117] "RemoveContainer" containerID="f497ed6e66f07c3659d612e4d03fc5594262467811b840160071934dd1dd8bac" Dec 03 08:46:30 crc 
kubenswrapper[4947]: I1203 08:46:30.413890 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:46:30 crc kubenswrapper[4947]: E1203 08:46:30.414132 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:46:42 crc kubenswrapper[4947]: I1203 08:46:42.082656 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:46:42 crc kubenswrapper[4947]: E1203 08:46:42.083478 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:46:55 crc kubenswrapper[4947]: I1203 08:46:55.083179 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:46:55 crc kubenswrapper[4947]: E1203 08:46:55.083685 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 
03 08:47:10 crc kubenswrapper[4947]: I1203 08:47:10.082814 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:47:10 crc kubenswrapper[4947]: E1203 08:47:10.083814 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.783760 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7688886755-b6lpz"] Dec 03 08:47:15 crc kubenswrapper[4947]: E1203 08:47:15.784431 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bc54d3-114e-4634-b330-c033e3079247" containerName="extract-content" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.784443 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bc54d3-114e-4634-b330-c033e3079247" containerName="extract-content" Dec 03 08:47:15 crc kubenswrapper[4947]: E1203 08:47:15.784459 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bc54d3-114e-4634-b330-c033e3079247" containerName="registry-server" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.784465 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bc54d3-114e-4634-b330-c033e3079247" containerName="registry-server" Dec 03 08:47:15 crc kubenswrapper[4947]: E1203 08:47:15.784479 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84bc54d3-114e-4634-b330-c033e3079247" containerName="extract-utilities" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.784488 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="84bc54d3-114e-4634-b330-c033e3079247" 
containerName="extract-utilities" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.784677 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="84bc54d3-114e-4634-b330-c033e3079247" containerName="registry-server" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.785457 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.787605 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.787631 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.787642 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.787811 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-849sv" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.788883 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.809897 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7688886755-b6lpz"] Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.985115 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-dns-svc\") pod \"dnsmasq-dns-7688886755-b6lpz\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.985198 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf96z\" (UniqueName: 
\"kubernetes.io/projected/f5c598e8-31fd-43d6-8469-752f4607ca80-kube-api-access-vf96z\") pod \"dnsmasq-dns-7688886755-b6lpz\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:15 crc kubenswrapper[4947]: I1203 08:47:15.985244 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-config\") pod \"dnsmasq-dns-7688886755-b6lpz\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.047327 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d49975d65-j2vxv"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.048762 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.064215 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d49975d65-j2vxv"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.086650 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-dns-svc\") pod \"dnsmasq-dns-7688886755-b6lpz\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.086759 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf96z\" (UniqueName: \"kubernetes.io/projected/f5c598e8-31fd-43d6-8469-752f4607ca80-kube-api-access-vf96z\") pod \"dnsmasq-dns-7688886755-b6lpz\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.086807 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-config\") pod \"dnsmasq-dns-7688886755-b6lpz\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.087886 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-config\") pod \"dnsmasq-dns-7688886755-b6lpz\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.087891 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-dns-svc\") pod \"dnsmasq-dns-7688886755-b6lpz\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.105387 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf96z\" (UniqueName: \"kubernetes.io/projected/f5c598e8-31fd-43d6-8469-752f4607ca80-kube-api-access-vf96z\") pod \"dnsmasq-dns-7688886755-b6lpz\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.187577 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtxrl\" (UniqueName: \"kubernetes.io/projected/4141f187-e61b-4477-8e44-0506fc8d62e1-kube-api-access-xtxrl\") pod \"dnsmasq-dns-7d49975d65-j2vxv\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.187854 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-config\") pod \"dnsmasq-dns-7d49975d65-j2vxv\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.187876 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-dns-svc\") pod \"dnsmasq-dns-7d49975d65-j2vxv\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.289074 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-config\") pod \"dnsmasq-dns-7d49975d65-j2vxv\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.289131 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-dns-svc\") pod \"dnsmasq-dns-7d49975d65-j2vxv\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.289195 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtxrl\" (UniqueName: \"kubernetes.io/projected/4141f187-e61b-4477-8e44-0506fc8d62e1-kube-api-access-xtxrl\") pod \"dnsmasq-dns-7d49975d65-j2vxv\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.289957 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-config\") pod \"dnsmasq-dns-7d49975d65-j2vxv\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.290048 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-dns-svc\") pod \"dnsmasq-dns-7d49975d65-j2vxv\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.301869 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7688886755-b6lpz"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.302411 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.321432 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtxrl\" (UniqueName: \"kubernetes.io/projected/4141f187-e61b-4477-8e44-0506fc8d62e1-kube-api-access-xtxrl\") pod \"dnsmasq-dns-7d49975d65-j2vxv\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.339064 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccdb85c4c-27q42"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.340584 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.345392 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccdb85c4c-27q42"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.367921 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.496155 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-dns-svc\") pod \"dnsmasq-dns-ccdb85c4c-27q42\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.496480 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdfds\" (UniqueName: \"kubernetes.io/projected/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-kube-api-access-rdfds\") pod \"dnsmasq-dns-ccdb85c4c-27q42\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.496623 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-config\") pod \"dnsmasq-dns-ccdb85c4c-27q42\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.602248 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdfds\" (UniqueName: \"kubernetes.io/projected/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-kube-api-access-rdfds\") pod \"dnsmasq-dns-ccdb85c4c-27q42\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.602315 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-config\") pod \"dnsmasq-dns-ccdb85c4c-27q42\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " 
pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.602371 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-dns-svc\") pod \"dnsmasq-dns-ccdb85c4c-27q42\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.603375 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-dns-svc\") pod \"dnsmasq-dns-ccdb85c4c-27q42\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.604340 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-config\") pod \"dnsmasq-dns-ccdb85c4c-27q42\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.630545 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdfds\" (UniqueName: \"kubernetes.io/projected/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-kube-api-access-rdfds\") pod \"dnsmasq-dns-ccdb85c4c-27q42\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.648807 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d49975d65-j2vxv"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.675155 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7588bd9997-rxwb7"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.686755 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7588bd9997-rxwb7"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.686938 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.704615 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-config\") pod \"dnsmasq-dns-7588bd9997-rxwb7\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.704668 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqtdx\" (UniqueName: \"kubernetes.io/projected/6633c67e-0968-48bc-864d-ccc1dad4e225-kube-api-access-lqtdx\") pod \"dnsmasq-dns-7588bd9997-rxwb7\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.704728 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-dns-svc\") pod \"dnsmasq-dns-7588bd9997-rxwb7\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.752270 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.764186 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7688886755-b6lpz"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.807100 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7688886755-b6lpz" event={"ID":"f5c598e8-31fd-43d6-8469-752f4607ca80","Type":"ContainerStarted","Data":"ad629be4cfdf320c89b8b3b83089b57f39c717cd0d5bbc61bb3e9661fe31d8c4"} Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.809816 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-config\") pod \"dnsmasq-dns-7588bd9997-rxwb7\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.809882 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqtdx\" (UniqueName: \"kubernetes.io/projected/6633c67e-0968-48bc-864d-ccc1dad4e225-kube-api-access-lqtdx\") pod \"dnsmasq-dns-7588bd9997-rxwb7\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.809933 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-dns-svc\") pod \"dnsmasq-dns-7588bd9997-rxwb7\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.815326 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-config\") pod \"dnsmasq-dns-7588bd9997-rxwb7\" (UID: 
\"6633c67e-0968-48bc-864d-ccc1dad4e225\") " pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.816027 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-dns-svc\") pod \"dnsmasq-dns-7588bd9997-rxwb7\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.838560 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqtdx\" (UniqueName: \"kubernetes.io/projected/6633c67e-0968-48bc-864d-ccc1dad4e225-kube-api-access-lqtdx\") pod \"dnsmasq-dns-7588bd9997-rxwb7\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.950172 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.958866 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.963804 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.963927 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.964141 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.964191 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nlkrw" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.964254 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 08:47:16 crc kubenswrapper[4947]: I1203 08:47:16.973584 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.015299 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.020217 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d49975d65-j2vxv"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.113639 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.113716 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.113744 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.113982 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.114179 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.114228 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.114284 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.114328 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.114405 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zqj\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-kube-api-access-q6zqj\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.186753 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 
08:47:17.189427 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.195639 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.195686 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.195738 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rczmm" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.195794 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.195843 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.211104 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.215530 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.215559 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.215587 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.215617 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.215657 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6zqj\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-kube-api-access-q6zqj\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.216255 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.216325 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.216348 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.216371 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.219773 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.224169 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.224478 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.225098 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") 
" pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.234598 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.245319 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zqj\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-kube-api-access-q6zqj\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.248121 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.248170 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d18504434ce27c90b8e5c624bb6c35f78406c4172834edaa296b0c651f18eb88/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.250726 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.252173 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.284387 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccdb85c4c-27q42"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.317340 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.317389 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.317415 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.317465 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.317494 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhqqn\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-kube-api-access-qhqqn\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.317534 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.317551 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.317596 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.317630 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.341274 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") pod \"rabbitmq-server-0\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.426236 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.426329 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.426356 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.426374 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.426398 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.426433 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhqqn\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-kube-api-access-qhqqn\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.426473 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.426515 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.426559 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" 
Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.429556 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.429872 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.429877 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.430581 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.432853 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.432873 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.432901 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/faa97777f376a835dba143cfa07e8f05ffcb84429b4e6bdd35b51866431b3476/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.432944 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.434786 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.449187 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhqqn\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-kube-api-access-qhqqn\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.469869 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell2-server-0"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 
08:47:17.475857 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.479020 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell2-server-dockercfg-727zm" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.479329 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell2-plugins-conf" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.480470 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell2-default-user" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.480596 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell2-erlang-cookie" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.480693 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell2-server-conf" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.481645 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell2-server-0"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.497128 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.515032 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.599408 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7588bd9997-rxwb7"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.604741 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.628656 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.628727 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-plugins-conf\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.628761 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-pod-info\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.628789 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25sz5\" (UniqueName: \"kubernetes.io/projected/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-kube-api-access-25sz5\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " 
pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.628806 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-server-conf\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.628852 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-87e09d38-efb7-4700-ab0d-e2620bbbc592\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-87e09d38-efb7-4700-ab0d-e2620bbbc592\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.628869 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-rabbitmq-confd\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.628887 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-erlang-cookie-secret\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.628903 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-rabbitmq-plugins\") pod \"rabbitmq-cell2-server-0\" (UID: 
\"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.730812 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-plugins-conf\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.731159 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-pod-info\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.731199 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25sz5\" (UniqueName: \"kubernetes.io/projected/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-kube-api-access-25sz5\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.731223 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-server-conf\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.731297 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-87e09d38-efb7-4700-ab0d-e2620bbbc592\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-87e09d38-efb7-4700-ab0d-e2620bbbc592\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " 
pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.731322 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-rabbitmq-confd\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.731871 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-plugins-conf\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.732729 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-erlang-cookie-secret\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.732758 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-server-conf\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.732779 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-rabbitmq-plugins\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.732821 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.734400 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.734820 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-rabbitmq-plugins\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.736420 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.736455 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-87e09d38-efb7-4700-ab0d-e2620bbbc592\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-87e09d38-efb7-4700-ab0d-e2620bbbc592\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b92aa335d2934a3ca903e16e64d0a4faeafd4dbcef2b4e1a8272dad2a4ef4f9b/globalmount\"" pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.737690 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-erlang-cookie-secret\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.738450 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-pod-info\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.741529 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-rabbitmq-confd\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.750178 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25sz5\" (UniqueName: \"kubernetes.io/projected/2d4140e8-1bc3-4ee0-9355-5135833ce0d8-kube-api-access-25sz5\") pod 
\"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.773671 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-87e09d38-efb7-4700-ab0d-e2620bbbc592\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-87e09d38-efb7-4700-ab0d-e2620bbbc592\") pod \"rabbitmq-cell2-server-0\" (UID: \"2d4140e8-1bc3-4ee0-9355-5135833ce0d8\") " pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.810712 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell3-server-0"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.813389 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.818956 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell3-default-user" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.819183 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell3-erlang-cookie" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.819323 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell3-plugins-conf" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.819451 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell3-server-dockercfg-btn7n" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.819655 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell3-server-conf" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.823294 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.826927 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell3-server-0"] Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.849876 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" event={"ID":"6633c67e-0968-48bc-864d-ccc1dad4e225","Type":"ContainerStarted","Data":"6a3a2d59000aaf9e3328246f94121dc003999580344ab708d6903163a09a692d"} Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.854129 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" event={"ID":"4141f187-e61b-4477-8e44-0506fc8d62e1","Type":"ContainerStarted","Data":"65cfe4f5de54e4ebd067a4d31ba55f7d18c7950d1ddcb6153ebf59a189701f59"} Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.856298 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" event={"ID":"5bf0659e-26a6-4fb6-8866-9d0a32c497b0","Type":"ContainerStarted","Data":"5c411bea8a5695c6c185feca73c341e53c239d3b1ff634c8cffc1548a1f07d0e"} Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.937344 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7402131c-647b-4e76-b6e2-1eebc53d4926\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7402131c-647b-4e76-b6e2-1eebc53d4926\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.937397 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/687851ea-a5db-409c-9562-726c0d59a375-server-conf\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" 
Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.937456 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/687851ea-a5db-409c-9562-726c0d59a375-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.937482 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/687851ea-a5db-409c-9562-726c0d59a375-plugins-conf\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.937721 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/687851ea-a5db-409c-9562-726c0d59a375-rabbitmq-confd\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.937743 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/687851ea-a5db-409c-9562-726c0d59a375-rabbitmq-plugins\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.937760 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/687851ea-a5db-409c-9562-726c0d59a375-pod-info\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" 
Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.937778 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/687851ea-a5db-409c-9562-726c0d59a375-erlang-cookie-secret\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:17 crc kubenswrapper[4947]: I1203 08:47:17.938222 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk87h\" (UniqueName: \"kubernetes.io/projected/687851ea-a5db-409c-9562-726c0d59a375-kube-api-access-fk87h\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.018521 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.043264 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/687851ea-a5db-409c-9562-726c0d59a375-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.043318 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/687851ea-a5db-409c-9562-726c0d59a375-plugins-conf\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.043336 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/687851ea-a5db-409c-9562-726c0d59a375-rabbitmq-confd\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.043357 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/687851ea-a5db-409c-9562-726c0d59a375-rabbitmq-plugins\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.043377 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/687851ea-a5db-409c-9562-726c0d59a375-pod-info\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.043393 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/687851ea-a5db-409c-9562-726c0d59a375-erlang-cookie-secret\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.043432 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk87h\" (UniqueName: \"kubernetes.io/projected/687851ea-a5db-409c-9562-726c0d59a375-kube-api-access-fk87h\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.043468 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7402131c-647b-4e76-b6e2-1eebc53d4926\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7402131c-647b-4e76-b6e2-1eebc53d4926\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.043490 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/687851ea-a5db-409c-9562-726c0d59a375-server-conf\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.044214 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/687851ea-a5db-409c-9562-726c0d59a375-plugins-conf\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.044388 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/687851ea-a5db-409c-9562-726c0d59a375-server-conf\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.044441 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/687851ea-a5db-409c-9562-726c0d59a375-rabbitmq-plugins\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.045259 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/687851ea-a5db-409c-9562-726c0d59a375-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell3-server-0\" 
(UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.047910 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/687851ea-a5db-409c-9562-726c0d59a375-pod-info\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.048147 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/687851ea-a5db-409c-9562-726c0d59a375-rabbitmq-confd\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.048440 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/687851ea-a5db-409c-9562-726c0d59a375-erlang-cookie-secret\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.051277 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.051302 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7402131c-647b-4e76-b6e2-1eebc53d4926\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7402131c-647b-4e76-b6e2-1eebc53d4926\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f6545819d4a6973d99428cbb478bd74f0e68ae505a8fac80b984319070e1c9f7/globalmount\"" pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.064544 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk87h\" (UniqueName: \"kubernetes.io/projected/687851ea-a5db-409c-9562-726c0d59a375-kube-api-access-fk87h\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.085669 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7402131c-647b-4e76-b6e2-1eebc53d4926\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7402131c-647b-4e76-b6e2-1eebc53d4926\") pod \"rabbitmq-cell3-server-0\" (UID: \"687851ea-a5db-409c-9562-726c0d59a375\") " pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.152389 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.205454 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 08:47:18 crc kubenswrapper[4947]: W1203 08:47:18.287743 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod901ce6e4_e34d_48c6_ba35_6c8a084cc6cc.slice/crio-4b2e0a88f19193a6fd180d75c501b48fc0f5e7546ca3c471e3664288ffb264e6 WatchSource:0}: Error finding container 4b2e0a88f19193a6fd180d75c501b48fc0f5e7546ca3c471e3664288ffb264e6: Status 404 returned error can't find the container with id 4b2e0a88f19193a6fd180d75c501b48fc0f5e7546ca3c471e3664288ffb264e6 Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.449359 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.471897 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.471984 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.481849 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-l4w4m" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.482036 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.487453 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.489075 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.491360 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.526482 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell2-server-0"] Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.562812 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.563013 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.563066 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.563117 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.563163 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-config-data-default\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.563193 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-40448b88-0b64-4ca1-b289-15d142dde7e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-40448b88-0b64-4ca1-b289-15d142dde7e5\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.563263 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-kolla-config\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.563294 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-46ds4\" (UniqueName: \"kubernetes.io/projected/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-kube-api-access-46ds4\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.664433 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-kolla-config\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.664499 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46ds4\" (UniqueName: \"kubernetes.io/projected/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-kube-api-access-46ds4\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.664556 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.664591 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.664632 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.664674 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.664714 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-config-data-default\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.664740 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-40448b88-0b64-4ca1-b289-15d142dde7e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-40448b88-0b64-4ca1-b289-15d142dde7e5\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.665311 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.665531 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.665887 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-config-data-default\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.666058 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.668451 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.668480 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-40448b88-0b64-4ca1-b289-15d142dde7e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-40448b88-0b64-4ca1-b289-15d142dde7e5\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/037ce30664fb59f90b6f4b3ea6a4eb43bde2181b3e2232b848f7450e8bb6eebc/globalmount\"" pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.674751 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.676818 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.679299 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46ds4\" (UniqueName: \"kubernetes.io/projected/7cf44fd8-460f-4a2e-8e17-4b34f5158c24-kube-api-access-46ds4\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.704827 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell3-server-0"] Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.711293 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-40448b88-0b64-4ca1-b289-15d142dde7e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-40448b88-0b64-4ca1-b289-15d142dde7e5\") pod \"openstack-galera-0\" (UID: \"7cf44fd8-460f-4a2e-8e17-4b34f5158c24\") " pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: W1203 08:47:18.723496 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687851ea_a5db_409c_9562_726c0d59a375.slice/crio-3b03fcceda9968fde68eff70628246097a312843a0ebbaf4c3b672c7183fec9c WatchSource:0}: Error finding container 3b03fcceda9968fde68eff70628246097a312843a0ebbaf4c3b672c7183fec9c: Status 404 returned error can't find the container with id 3b03fcceda9968fde68eff70628246097a312843a0ebbaf4c3b672c7183fec9c Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.834542 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.880893 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell2-server-0" event={"ID":"2d4140e8-1bc3-4ee0-9355-5135833ce0d8","Type":"ContainerStarted","Data":"c1f1f2f32fc8264107953e148300f1cb9c2a90cacbce63e01cbd8ea1cec43670"} Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.889134 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc","Type":"ContainerStarted","Data":"4b2e0a88f19193a6fd180d75c501b48fc0f5e7546ca3c471e3664288ffb264e6"} Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.894914 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c","Type":"ContainerStarted","Data":"78a3b90a6763c92e4006929e78b41a5bf55a52db2b1efe4e4bef8149a124648e"} Dec 03 08:47:18 crc kubenswrapper[4947]: I1203 08:47:18.900172 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-cell3-server-0" event={"ID":"687851ea-a5db-409c-9562-726c0d59a375","Type":"ContainerStarted","Data":"3b03fcceda9968fde68eff70628246097a312843a0ebbaf4c3b672c7183fec9c"} Dec 03 08:47:19 crc kubenswrapper[4947]: I1203 08:47:19.586473 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 08:47:19 crc kubenswrapper[4947]: W1203 08:47:19.598720 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cf44fd8_460f_4a2e_8e17_4b34f5158c24.slice/crio-02056a84fa51bdfe635a209d15d16f7872aff1f9614d41acb52a4ba0427533e7 WatchSource:0}: Error finding container 02056a84fa51bdfe635a209d15d16f7872aff1f9614d41acb52a4ba0427533e7: Status 404 returned error can't find the container with id 02056a84fa51bdfe635a209d15d16f7872aff1f9614d41acb52a4ba0427533e7 Dec 03 08:47:19 crc kubenswrapper[4947]: I1203 08:47:19.912379 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7cf44fd8-460f-4a2e-8e17-4b34f5158c24","Type":"ContainerStarted","Data":"02056a84fa51bdfe635a209d15d16f7872aff1f9614d41acb52a4ba0427533e7"} Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.094451 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.095666 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.101278 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.101698 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-slrp9" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.101998 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.102054 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.108682 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.205538 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec58c8e4-6d4f-4c79-a474-98ca677bc508-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.205593 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e87bb8c9-4f87-46d0-a6d6-5fc2fef99896\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e87bb8c9-4f87-46d0-a6d6-5fc2fef99896\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.205922 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/ec58c8e4-6d4f-4c79-a474-98ca677bc508-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.205973 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec58c8e4-6d4f-4c79-a474-98ca677bc508-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.206030 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25csh\" (UniqueName: \"kubernetes.io/projected/ec58c8e4-6d4f-4c79-a474-98ca677bc508-kube-api-access-25csh\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.206052 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec58c8e4-6d4f-4c79-a474-98ca677bc508-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.206123 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec58c8e4-6d4f-4c79-a474-98ca677bc508-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.206153 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec58c8e4-6d4f-4c79-a474-98ca677bc508-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.309478 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ec58c8e4-6d4f-4c79-a474-98ca677bc508-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.309566 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec58c8e4-6d4f-4c79-a474-98ca677bc508-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.309593 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25csh\" (UniqueName: \"kubernetes.io/projected/ec58c8e4-6d4f-4c79-a474-98ca677bc508-kube-api-access-25csh\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.309611 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec58c8e4-6d4f-4c79-a474-98ca677bc508-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.309650 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/ec58c8e4-6d4f-4c79-a474-98ca677bc508-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.309667 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec58c8e4-6d4f-4c79-a474-98ca677bc508-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.309694 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec58c8e4-6d4f-4c79-a474-98ca677bc508-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.309713 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e87bb8c9-4f87-46d0-a6d6-5fc2fef99896\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e87bb8c9-4f87-46d0-a6d6-5fc2fef99896\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.310574 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ec58c8e4-6d4f-4c79-a474-98ca677bc508-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.310891 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/ec58c8e4-6d4f-4c79-a474-98ca677bc508-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.312215 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ec58c8e4-6d4f-4c79-a474-98ca677bc508-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.312958 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec58c8e4-6d4f-4c79-a474-98ca677bc508-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.317895 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.317947 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e87bb8c9-4f87-46d0-a6d6-5fc2fef99896\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e87bb8c9-4f87-46d0-a6d6-5fc2fef99896\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4adb62fa7c4fb036904594dee58858d43bd80a879bf34ce666b9750616d76ce2/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.319129 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec58c8e4-6d4f-4c79-a474-98ca677bc508-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.319791 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec58c8e4-6d4f-4c79-a474-98ca677bc508-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.332607 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25csh\" (UniqueName: \"kubernetes.io/projected/ec58c8e4-6d4f-4c79-a474-98ca677bc508-kube-api-access-25csh\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.366322 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e87bb8c9-4f87-46d0-a6d6-5fc2fef99896\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e87bb8c9-4f87-46d0-a6d6-5fc2fef99896\") pod \"openstack-cell1-galera-0\" (UID: \"ec58c8e4-6d4f-4c79-a474-98ca677bc508\") " pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:20 crc kubenswrapper[4947]: I1203 08:47:20.431633 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.067523 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.086585 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:47:21 crc kubenswrapper[4947]: E1203 08:47:21.086801 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.545442 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell2-galera-0"] Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.547054 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.551423 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell2-svc" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.551476 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell2-scripts" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.551630 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell2-dockercfg-jxqsh" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.552226 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell2-config-data" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.562613 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell2-galera-0"] Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.660443 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-operator-scripts\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.660481 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a1634b9c-55e8-47cf-8148-f3ab013d263d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1634b9c-55e8-47cf-8148-f3ab013d263d\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.660534 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-kolla-config\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.660559 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-galera-tls-certs\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.660595 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-config-data-generated\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.660626 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmffv\" (UniqueName: \"kubernetes.io/projected/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-kube-api-access-dmffv\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.660662 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-config-data-default\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.660734 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-combined-ca-bundle\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.763100 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmffv\" (UniqueName: \"kubernetes.io/projected/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-kube-api-access-dmffv\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.763226 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-config-data-default\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.763306 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-combined-ca-bundle\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.763382 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-operator-scripts\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.763413 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a1634b9c-55e8-47cf-8148-f3ab013d263d\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1634b9c-55e8-47cf-8148-f3ab013d263d\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.763457 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-kolla-config\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.763512 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-galera-tls-certs\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.763591 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-config-data-generated\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.764220 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-config-data-generated\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.766287 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-config-data-default\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.768858 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-kolla-config\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.770352 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.770405 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a1634b9c-55e8-47cf-8148-f3ab013d263d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1634b9c-55e8-47cf-8148-f3ab013d263d\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d96650bde049796f22ef83eded694cef956a83baeed21e0e3c0aece6123290ef/globalmount\"" pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.774718 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-galera-tls-certs\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.776622 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-operator-scripts\") pod 
\"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.787901 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmffv\" (UniqueName: \"kubernetes.io/projected/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-kube-api-access-dmffv\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.792144 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5-combined-ca-bundle\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0" Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.805192 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.806975 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.810252 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bp82v"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.810973 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.818264 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.865016 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab6dc8c6-2e15-4572-8793-c862fc651be5-kolla-config\") pod \"memcached-0\" (UID: \"ab6dc8c6-2e15-4572-8793-c862fc651be5\") " pod="openstack/memcached-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.865084 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhlz\" (UniqueName: \"kubernetes.io/projected/ab6dc8c6-2e15-4572-8793-c862fc651be5-kube-api-access-zjhlz\") pod \"memcached-0\" (UID: \"ab6dc8c6-2e15-4572-8793-c862fc651be5\") " pod="openstack/memcached-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.865132 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab6dc8c6-2e15-4572-8793-c862fc651be5-config-data\") pod \"memcached-0\" (UID: \"ab6dc8c6-2e15-4572-8793-c862fc651be5\") " pod="openstack/memcached-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.896285 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a1634b9c-55e8-47cf-8148-f3ab013d263d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1634b9c-55e8-47cf-8148-f3ab013d263d\") pod \"openstack-cell2-galera-0\" (UID: \"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5\") " pod="openstack/openstack-cell2-galera-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.927815 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ec58c8e4-6d4f-4c79-a474-98ca677bc508","Type":"ContainerStarted","Data":"53a845f6912891769626e9e9fc57258fe3b2169e5a71d444b2d1b15b72ac52ef"}
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.967004 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab6dc8c6-2e15-4572-8793-c862fc651be5-kolla-config\") pod \"memcached-0\" (UID: \"ab6dc8c6-2e15-4572-8793-c862fc651be5\") " pod="openstack/memcached-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.967068 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhlz\" (UniqueName: \"kubernetes.io/projected/ab6dc8c6-2e15-4572-8793-c862fc651be5-kube-api-access-zjhlz\") pod \"memcached-0\" (UID: \"ab6dc8c6-2e15-4572-8793-c862fc651be5\") " pod="openstack/memcached-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.967118 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab6dc8c6-2e15-4572-8793-c862fc651be5-config-data\") pod \"memcached-0\" (UID: \"ab6dc8c6-2e15-4572-8793-c862fc651be5\") " pod="openstack/memcached-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.967841 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ab6dc8c6-2e15-4572-8793-c862fc651be5-kolla-config\") pod \"memcached-0\" (UID: \"ab6dc8c6-2e15-4572-8793-c862fc651be5\") " pod="openstack/memcached-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.968123 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ab6dc8c6-2e15-4572-8793-c862fc651be5-config-data\") pod \"memcached-0\" (UID: \"ab6dc8c6-2e15-4572-8793-c862fc651be5\") " pod="openstack/memcached-0"
Dec 03 08:47:21 crc kubenswrapper[4947]: I1203 08:47:21.988738 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhlz\" (UniqueName: \"kubernetes.io/projected/ab6dc8c6-2e15-4572-8793-c862fc651be5-kube-api-access-zjhlz\") pod \"memcached-0\" (UID: \"ab6dc8c6-2e15-4572-8793-c862fc651be5\") " pod="openstack/memcached-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.186906 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell2-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.209566 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.718538 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.726051 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell2-galera-0"]
Dec 03 08:47:22 crc kubenswrapper[4947]: W1203 08:47:22.731225 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01d9f67e_d65a_4f13_9bc6_3f7fcd68d8c5.slice/crio-13f0c1e3da743e100aeb12151d4cc242bfc9f5c014bf87a15f779532b87effa2 WatchSource:0}: Error finding container 13f0c1e3da743e100aeb12151d4cc242bfc9f5c014bf87a15f779532b87effa2: Status 404 returned error can't find the container with id 13f0c1e3da743e100aeb12151d4cc242bfc9f5c014bf87a15f779532b87effa2
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.857372 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell3-galera-0"]
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.858681 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.862115 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell3-config-data"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.862414 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell3-svc"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.863858 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell3-scripts"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.864302 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell3-dockercfg-cxzb9"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.873527 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell3-galera-0"]
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.885935 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-combined-ca-bundle\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.885989 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-config-data-default\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.886012 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f51b205d-6c1d-4007-8ce3-0c96f7926859\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f51b205d-6c1d-4007-8ce3-0c96f7926859\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.886183 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-config-data-generated\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.886233 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-galera-tls-certs\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.886268 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-kolla-config\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.886365 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scmww\" (UniqueName: \"kubernetes.io/projected/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-kube-api-access-scmww\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.886410 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-operator-scripts\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.936463 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ab6dc8c6-2e15-4572-8793-c862fc651be5","Type":"ContainerStarted","Data":"2d49dd839ef14e9e9a30c2dd7c40767ba3184b2e33f7056262e37197d5ba5eb5"}
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.938119 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell2-galera-0" event={"ID":"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5","Type":"ContainerStarted","Data":"13f0c1e3da743e100aeb12151d4cc242bfc9f5c014bf87a15f779532b87effa2"}
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.987652 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-combined-ca-bundle\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.987717 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-config-data-default\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.987742 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f51b205d-6c1d-4007-8ce3-0c96f7926859\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f51b205d-6c1d-4007-8ce3-0c96f7926859\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.987778 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-config-data-generated\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.987836 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-galera-tls-certs\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.987866 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-kolla-config\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.987994 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scmww\" (UniqueName: \"kubernetes.io/projected/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-kube-api-access-scmww\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.988057 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-operator-scripts\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.988729 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-kolla-config\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.988782 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-config-data-default\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.989393 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-config-data-generated\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.989790 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-operator-scripts\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.991132 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.991154 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f51b205d-6c1d-4007-8ce3-0c96f7926859\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f51b205d-6c1d-4007-8ce3-0c96f7926859\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9303c2464a72ab1354ddc0d12a8fdb59b5e51e643343d51598c91f38fae95955/globalmount\"" pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:22 crc kubenswrapper[4947]: I1203 08:47:22.995437 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-galera-tls-certs\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:23 crc kubenswrapper[4947]: I1203 08:47:23.002669 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-combined-ca-bundle\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:23 crc kubenswrapper[4947]: I1203 08:47:23.006103 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scmww\" (UniqueName: \"kubernetes.io/projected/e7f8614a-2b9a-43e5-ae0b-2ca7b2749819-kube-api-access-scmww\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:23 crc kubenswrapper[4947]: I1203 08:47:23.022705 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f51b205d-6c1d-4007-8ce3-0c96f7926859\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f51b205d-6c1d-4007-8ce3-0c96f7926859\") pod \"openstack-cell3-galera-0\" (UID: \"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819\") " pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:23 crc kubenswrapper[4947]: I1203 08:47:23.195979 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell3-galera-0"
Dec 03 08:47:23 crc kubenswrapper[4947]: I1203 08:47:23.622630 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell3-galera-0"]
Dec 03 08:47:23 crc kubenswrapper[4947]: I1203 08:47:23.948091 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell3-galera-0" event={"ID":"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819","Type":"ContainerStarted","Data":"b257b93030359923383618376803e0eadc3afcbb58cfc1c1abeeab30e105fcb1"}
Dec 03 08:47:35 crc kubenswrapper[4947]: I1203 08:47:35.083474 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38"
Dec 03 08:47:35 crc kubenswrapper[4947]: E1203 08:47:35.084331 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 08:47:49 crc kubenswrapper[4947]: I1203 08:47:49.088602 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38"
Dec 03 08:47:49 crc kubenswrapper[4947]: E1203 08:47:49.089567 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.067523 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.068075 4947 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.068229 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:65066e8ca260a75886ae57f157049605,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25csh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(ec58c8e4-6d4f-4c79-a474-98ca677bc508): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.069427 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="ec58c8e4-6d4f-4c79-a474-98ca677bc508"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.805281 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.805334 4947 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.805457 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cfh5d8h5bfh595h555hf9h57bh584h5fbh7ch674h579h6ch5c4h8dh545hb6h6hc6h7bh557hcdh544h54fh5ffh556h646h96h554h664h5d8h57dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lqtdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7588bd9997-rxwb7_openstack(6633c67e-0968-48bc-864d-ccc1dad4e225): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.806794 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" podUID="6633c67e-0968-48bc-864d-ccc1dad4e225"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.816678 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.816746 4947 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.816897 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68bhf6hd9h5h9fh6fh8bh574h5c4h64bh678h5d9h668h685h68h664h55fh67ch597hc5h5f7h5ch5h685h6bh67ch594h68fh7hd9h54ch57cq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xtxrl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7d49975d65-j2vxv_openstack(4141f187-e61b-4477-8e44-0506fc8d62e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.818177 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" podUID="4141f187-e61b-4477-8e44-0506fc8d62e1"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.831528 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.831576 4947 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.831673 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vf96z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7688886755-b6lpz_openstack(f5c598e8-31fd-43d6-8469-752f4607ca80): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 08:47:58 crc kubenswrapper[4947]: E1203 08:47:58.832946 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7688886755-b6lpz" podUID="f5c598e8-31fd-43d6-8469-752f4607ca80"
Dec 03 08:47:59 crc kubenswrapper[4947]: E1203 08:47:59.242892 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" podUID="6633c67e-0968-48bc-864d-ccc1dad4e225"
Dec 03 08:47:59 crc kubenswrapper[4947]: E1203 08:47:59.660343 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:59 crc kubenswrapper[4947]: E1203 08:47:59.660659 4947 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:59 crc kubenswrapper[4947]: E1203 08:47:59.660859 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:65066e8ca260a75886ae57f157049605,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n67dh6bh5d5h559hc8h644h5bdh5h5d5h5f4h586h649h567h577h58bh8fh547h56ch5dh5cdh58dh65fh5fbh79h665h675h5bhbbh697h68bhf5h664q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zjhlz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(ab6dc8c6-2e15-4572-8793-c862fc651be5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 08:47:59 crc kubenswrapper[4947]: E1203 08:47:59.662031 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="ab6dc8c6-2e15-4572-8793-c862fc651be5"
Dec 03 08:47:59 crc kubenswrapper[4947]: E1203 08:47:59.701077 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:59 crc kubenswrapper[4947]: E1203 08:47:59.701118 4947 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605"
Dec 03 08:47:59 crc kubenswrapper[4947]: E1203 08:47:59.701401 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="init container
&Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n59dh596h64h55bh578h8bh585h57dh564h84h5f5h596hchcfh5c6h8h66fh68bh84h7dh675h5f9h58chfchc5h677h557hbdh8fh645h77h56dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdfds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Volum
eDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-ccdb85c4c-27q42_openstack(5bf0659e-26a6-4fb6-8866-9d0a32c497b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 08:47:59 crc kubenswrapper[4947]: E1203 08:47:59.702727 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" podUID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" Dec 03 08:47:59 crc kubenswrapper[4947]: I1203 08:47:59.967609 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.084385 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:48:00 crc kubenswrapper[4947]: E1203 08:48:00.084670 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.128852 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-config\") pod \"4141f187-e61b-4477-8e44-0506fc8d62e1\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.128958 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xtxrl\" (UniqueName: \"kubernetes.io/projected/4141f187-e61b-4477-8e44-0506fc8d62e1-kube-api-access-xtxrl\") pod \"4141f187-e61b-4477-8e44-0506fc8d62e1\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.129015 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-dns-svc\") pod \"4141f187-e61b-4477-8e44-0506fc8d62e1\" (UID: \"4141f187-e61b-4477-8e44-0506fc8d62e1\") " Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.129413 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-config" (OuterVolumeSpecName: "config") pod "4141f187-e61b-4477-8e44-0506fc8d62e1" (UID: "4141f187-e61b-4477-8e44-0506fc8d62e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.129745 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.130730 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4141f187-e61b-4477-8e44-0506fc8d62e1" (UID: "4141f187-e61b-4477-8e44-0506fc8d62e1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.134018 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4141f187-e61b-4477-8e44-0506fc8d62e1-kube-api-access-xtxrl" (OuterVolumeSpecName: "kube-api-access-xtxrl") pod "4141f187-e61b-4477-8e44-0506fc8d62e1" (UID: "4141f187-e61b-4477-8e44-0506fc8d62e1"). InnerVolumeSpecName "kube-api-access-xtxrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.160858 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.232188 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtxrl\" (UniqueName: \"kubernetes.io/projected/4141f187-e61b-4477-8e44-0506fc8d62e1-kube-api-access-xtxrl\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.233093 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4141f187-e61b-4477-8e44-0506fc8d62e1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.250074 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7688886755-b6lpz" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.250089 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7688886755-b6lpz" event={"ID":"f5c598e8-31fd-43d6-8469-752f4607ca80","Type":"ContainerDied","Data":"ad629be4cfdf320c89b8b3b83089b57f39c717cd0d5bbc61bb3e9661fe31d8c4"} Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.252809 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.252813 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d49975d65-j2vxv" event={"ID":"4141f187-e61b-4477-8e44-0506fc8d62e1","Type":"ContainerDied","Data":"65cfe4f5de54e4ebd067a4d31ba55f7d18c7950d1ddcb6153ebf59a189701f59"} Dec 03 08:48:00 crc kubenswrapper[4947]: E1203 08:48:00.254280 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" podUID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" Dec 03 08:48:00 crc kubenswrapper[4947]: E1203 08:48:00.254571 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-memcached:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/memcached-0" podUID="ab6dc8c6-2e15-4572-8793-c862fc651be5" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.339625 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d49975d65-j2vxv"] Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.340135 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-dns-svc\") pod \"f5c598e8-31fd-43d6-8469-752f4607ca80\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.340372 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-config\") pod \"f5c598e8-31fd-43d6-8469-752f4607ca80\" 
(UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.340413 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf96z\" (UniqueName: \"kubernetes.io/projected/f5c598e8-31fd-43d6-8469-752f4607ca80-kube-api-access-vf96z\") pod \"f5c598e8-31fd-43d6-8469-752f4607ca80\" (UID: \"f5c598e8-31fd-43d6-8469-752f4607ca80\") " Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.342070 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-config" (OuterVolumeSpecName: "config") pod "f5c598e8-31fd-43d6-8469-752f4607ca80" (UID: "f5c598e8-31fd-43d6-8469-752f4607ca80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.343059 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f5c598e8-31fd-43d6-8469-752f4607ca80" (UID: "f5c598e8-31fd-43d6-8469-752f4607ca80"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.358254 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c598e8-31fd-43d6-8469-752f4607ca80-kube-api-access-vf96z" (OuterVolumeSpecName: "kube-api-access-vf96z") pod "f5c598e8-31fd-43d6-8469-752f4607ca80" (UID: "f5c598e8-31fd-43d6-8469-752f4607ca80"). InnerVolumeSpecName "kube-api-access-vf96z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.359453 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d49975d65-j2vxv"] Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.441589 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.441632 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c598e8-31fd-43d6-8469-752f4607ca80-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.441645 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf96z\" (UniqueName: \"kubernetes.io/projected/f5c598e8-31fd-43d6-8469-752f4607ca80-kube-api-access-vf96z\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.606345 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7688886755-b6lpz"] Dec 03 08:48:00 crc kubenswrapper[4947]: I1203 08:48:00.612018 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7688886755-b6lpz"] Dec 03 08:48:01 crc kubenswrapper[4947]: I1203 08:48:01.103427 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4141f187-e61b-4477-8e44-0506fc8d62e1" path="/var/lib/kubelet/pods/4141f187-e61b-4477-8e44-0506fc8d62e1/volumes" Dec 03 08:48:01 crc kubenswrapper[4947]: I1203 08:48:01.104202 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c598e8-31fd-43d6-8469-752f4607ca80" path="/var/lib/kubelet/pods/f5c598e8-31fd-43d6-8469-752f4607ca80/volumes" Dec 03 08:48:01 crc kubenswrapper[4947]: I1203 08:48:01.275107 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell3-galera-0" 
event={"ID":"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819","Type":"ContainerStarted","Data":"1513de820460f27959106a03869186e3e1b7ba3ff165f71598b0b93f6442515c"} Dec 03 08:48:02 crc kubenswrapper[4947]: I1203 08:48:02.283639 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c","Type":"ContainerStarted","Data":"65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f"} Dec 03 08:48:02 crc kubenswrapper[4947]: I1203 08:48:02.285454 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7cf44fd8-460f-4a2e-8e17-4b34f5158c24","Type":"ContainerStarted","Data":"6f3df1c68f54478cf8612aee2c88ecd18ec106bdc22a99657f88235ed00314aa"} Dec 03 08:48:02 crc kubenswrapper[4947]: I1203 08:48:02.286927 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell3-server-0" event={"ID":"687851ea-a5db-409c-9562-726c0d59a375","Type":"ContainerStarted","Data":"4f238d2a09aa16979ae97bbb3651a246990589a018e6c834e0165543a0e13189"} Dec 03 08:48:02 crc kubenswrapper[4947]: I1203 08:48:02.288746 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ec58c8e4-6d4f-4c79-a474-98ca677bc508","Type":"ContainerStarted","Data":"969b98d4a39436bc72f623aebadb3aeac130210386b5f1208ab9a3016ce976e4"} Dec 03 08:48:02 crc kubenswrapper[4947]: I1203 08:48:02.290444 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell2-server-0" event={"ID":"2d4140e8-1bc3-4ee0-9355-5135833ce0d8","Type":"ContainerStarted","Data":"44860c1636aafb24d276925624b1b412999f3f0da95dca599f0dc177e72bc596"} Dec 03 08:48:02 crc kubenswrapper[4947]: I1203 08:48:02.291991 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell2-galera-0" 
event={"ID":"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5","Type":"ContainerStarted","Data":"dfd7e3ddfc5ecae7990ed3eef9815cb1d5d7df10202de6de5a0c35d0271c605d"} Dec 03 08:48:02 crc kubenswrapper[4947]: I1203 08:48:02.293410 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc","Type":"ContainerStarted","Data":"0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab"} Dec 03 08:48:05 crc kubenswrapper[4947]: I1203 08:48:05.319461 4947 generic.go:334] "Generic (PLEG): container finished" podID="01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5" containerID="dfd7e3ddfc5ecae7990ed3eef9815cb1d5d7df10202de6de5a0c35d0271c605d" exitCode=0 Dec 03 08:48:05 crc kubenswrapper[4947]: I1203 08:48:05.320155 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell2-galera-0" event={"ID":"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5","Type":"ContainerDied","Data":"dfd7e3ddfc5ecae7990ed3eef9815cb1d5d7df10202de6de5a0c35d0271c605d"} Dec 03 08:48:05 crc kubenswrapper[4947]: I1203 08:48:05.323636 4947 generic.go:334] "Generic (PLEG): container finished" podID="ec58c8e4-6d4f-4c79-a474-98ca677bc508" containerID="969b98d4a39436bc72f623aebadb3aeac130210386b5f1208ab9a3016ce976e4" exitCode=0 Dec 03 08:48:05 crc kubenswrapper[4947]: I1203 08:48:05.323741 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ec58c8e4-6d4f-4c79-a474-98ca677bc508","Type":"ContainerDied","Data":"969b98d4a39436bc72f623aebadb3aeac130210386b5f1208ab9a3016ce976e4"} Dec 03 08:48:05 crc kubenswrapper[4947]: I1203 08:48:05.326019 4947 generic.go:334] "Generic (PLEG): container finished" podID="e7f8614a-2b9a-43e5-ae0b-2ca7b2749819" containerID="1513de820460f27959106a03869186e3e1b7ba3ff165f71598b0b93f6442515c" exitCode=0 Dec 03 08:48:05 crc kubenswrapper[4947]: I1203 08:48:05.326058 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell3-galera-0" 
event={"ID":"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819","Type":"ContainerDied","Data":"1513de820460f27959106a03869186e3e1b7ba3ff165f71598b0b93f6442515c"} Dec 03 08:48:06 crc kubenswrapper[4947]: I1203 08:48:06.337765 4947 generic.go:334] "Generic (PLEG): container finished" podID="7cf44fd8-460f-4a2e-8e17-4b34f5158c24" containerID="6f3df1c68f54478cf8612aee2c88ecd18ec106bdc22a99657f88235ed00314aa" exitCode=0 Dec 03 08:48:06 crc kubenswrapper[4947]: I1203 08:48:06.337845 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7cf44fd8-460f-4a2e-8e17-4b34f5158c24","Type":"ContainerDied","Data":"6f3df1c68f54478cf8612aee2c88ecd18ec106bdc22a99657f88235ed00314aa"} Dec 03 08:48:06 crc kubenswrapper[4947]: I1203 08:48:06.341925 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ec58c8e4-6d4f-4c79-a474-98ca677bc508","Type":"ContainerStarted","Data":"89c3929f4049028761b6a155681b39747e3633aff50a31fac7d2cf290b977018"} Dec 03 08:48:06 crc kubenswrapper[4947]: I1203 08:48:06.345875 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell3-galera-0" event={"ID":"e7f8614a-2b9a-43e5-ae0b-2ca7b2749819","Type":"ContainerStarted","Data":"e617652fa8b2a086f15725a387635b9d4c082e277cacfa7ac853b378a88f93db"} Dec 03 08:48:06 crc kubenswrapper[4947]: I1203 08:48:06.350004 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell2-galera-0" event={"ID":"01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5","Type":"ContainerStarted","Data":"b6fbd27bce8b77de2df866204f8762d305779c1da571ccb50105ec29ea075ad3"} Dec 03 08:48:06 crc kubenswrapper[4947]: I1203 08:48:06.404669 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell3-galera-0" podStartSLOduration=9.281059774 podStartE2EDuration="45.404642822s" podCreationTimestamp="2025-12-03 08:47:21 +0000 UTC" firstStartedPulling="2025-12-03 08:47:23.643743889 +0000 UTC 
m=+7104.904698315" lastFinishedPulling="2025-12-03 08:47:59.767326897 +0000 UTC m=+7141.028281363" observedRunningTime="2025-12-03 08:48:06.394628392 +0000 UTC m=+7147.655582838" watchObservedRunningTime="2025-12-03 08:48:06.404642822 +0000 UTC m=+7147.665597278" Dec 03 08:48:06 crc kubenswrapper[4947]: I1203 08:48:06.448048 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371989.406754 podStartE2EDuration="47.448022494s" podCreationTimestamp="2025-12-03 08:47:19 +0000 UTC" firstStartedPulling="2025-12-03 08:47:21.073076352 +0000 UTC m=+7102.334030778" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:48:06.446290348 +0000 UTC m=+7147.707244794" watchObservedRunningTime="2025-12-03 08:48:06.448022494 +0000 UTC m=+7147.708976930" Dec 03 08:48:06 crc kubenswrapper[4947]: I1203 08:48:06.457188 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell2-galera-0" podStartSLOduration=9.393305278 podStartE2EDuration="46.457163291s" podCreationTimestamp="2025-12-03 08:47:20 +0000 UTC" firstStartedPulling="2025-12-03 08:47:22.732745945 +0000 UTC m=+7103.993700371" lastFinishedPulling="2025-12-03 08:47:59.796603958 +0000 UTC m=+7141.057558384" observedRunningTime="2025-12-03 08:48:06.425076514 +0000 UTC m=+7147.686030960" watchObservedRunningTime="2025-12-03 08:48:06.457163291 +0000 UTC m=+7147.718117737" Dec 03 08:48:07 crc kubenswrapper[4947]: I1203 08:48:07.358753 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7cf44fd8-460f-4a2e-8e17-4b34f5158c24","Type":"ContainerStarted","Data":"bcee632512f91c5d2b22692290be52be9939bab0f4dc5b5bf52fe5ed9149e94e"} Dec 03 08:48:07 crc kubenswrapper[4947]: I1203 08:48:07.388947 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=10.346913866 
podStartE2EDuration="50.388930746s" podCreationTimestamp="2025-12-03 08:47:17 +0000 UTC" firstStartedPulling="2025-12-03 08:47:19.601945754 +0000 UTC m=+7100.862900180" lastFinishedPulling="2025-12-03 08:47:59.643962634 +0000 UTC m=+7140.904917060" observedRunningTime="2025-12-03 08:48:07.384989949 +0000 UTC m=+7148.645944405" watchObservedRunningTime="2025-12-03 08:48:07.388930746 +0000 UTC m=+7148.649885182" Dec 03 08:48:08 crc kubenswrapper[4947]: I1203 08:48:08.834981 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 08:48:08 crc kubenswrapper[4947]: I1203 08:48:08.836178 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 08:48:10 crc kubenswrapper[4947]: I1203 08:48:10.432218 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 08:48:10 crc kubenswrapper[4947]: I1203 08:48:10.432269 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 08:48:10 crc kubenswrapper[4947]: I1203 08:48:10.513198 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 08:48:11 crc kubenswrapper[4947]: I1203 08:48:11.508661 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 08:48:12 crc kubenswrapper[4947]: I1203 08:48:12.187450 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell2-galera-0" Dec 03 08:48:12 crc kubenswrapper[4947]: I1203 08:48:12.188855 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell2-galera-0" Dec 03 08:48:13 crc kubenswrapper[4947]: I1203 08:48:13.083743 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 
03 08:48:13 crc kubenswrapper[4947]: E1203 08:48:13.083926 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:48:13 crc kubenswrapper[4947]: I1203 08:48:13.197135 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell3-galera-0" Dec 03 08:48:13 crc kubenswrapper[4947]: I1203 08:48:13.197176 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell3-galera-0" Dec 03 08:48:13 crc kubenswrapper[4947]: I1203 08:48:13.275447 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell3-galera-0" Dec 03 08:48:13 crc kubenswrapper[4947]: I1203 08:48:13.483182 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell3-galera-0" Dec 03 08:48:14 crc kubenswrapper[4947]: I1203 08:48:14.415746 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ab6dc8c6-2e15-4572-8793-c862fc651be5","Type":"ContainerStarted","Data":"628bb251ea6f2b41d58eedf6595f91e6b18f5c427935cb84b55ee69c6e44c0dc"} Dec 03 08:48:14 crc kubenswrapper[4947]: I1203 08:48:14.416318 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 08:48:14 crc kubenswrapper[4947]: I1203 08:48:14.441834 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.880633239 podStartE2EDuration="53.441806846s" podCreationTimestamp="2025-12-03 08:47:21 +0000 UTC" firstStartedPulling="2025-12-03 08:47:22.730811312 +0000 
UTC m=+7103.991765748" lastFinishedPulling="2025-12-03 08:48:13.291984929 +0000 UTC m=+7154.552939355" observedRunningTime="2025-12-03 08:48:14.432024881 +0000 UTC m=+7155.692979307" watchObservedRunningTime="2025-12-03 08:48:14.441806846 +0000 UTC m=+7155.702761302" Dec 03 08:48:14 crc kubenswrapper[4947]: I1203 08:48:14.924966 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 08:48:15 crc kubenswrapper[4947]: I1203 08:48:15.009729 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 08:48:15 crc kubenswrapper[4947]: I1203 08:48:15.280534 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell2-galera-0" Dec 03 08:48:15 crc kubenswrapper[4947]: I1203 08:48:15.341681 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell2-galera-0" Dec 03 08:48:15 crc kubenswrapper[4947]: I1203 08:48:15.424405 4947 generic.go:334] "Generic (PLEG): container finished" podID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" containerID="d003752c1b26d50458cd8e76b05d9a9a68d378bfe163c97555380debbf157da1" exitCode=0 Dec 03 08:48:15 crc kubenswrapper[4947]: I1203 08:48:15.424511 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" event={"ID":"5bf0659e-26a6-4fb6-8866-9d0a32c497b0","Type":"ContainerDied","Data":"d003752c1b26d50458cd8e76b05d9a9a68d378bfe163c97555380debbf157da1"} Dec 03 08:48:15 crc kubenswrapper[4947]: I1203 08:48:15.426976 4947 generic.go:334] "Generic (PLEG): container finished" podID="6633c67e-0968-48bc-864d-ccc1dad4e225" containerID="863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d" exitCode=0 Dec 03 08:48:15 crc kubenswrapper[4947]: I1203 08:48:15.427812 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" 
event={"ID":"6633c67e-0968-48bc-864d-ccc1dad4e225","Type":"ContainerDied","Data":"863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d"} Dec 03 08:48:16 crc kubenswrapper[4947]: I1203 08:48:16.437649 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" event={"ID":"6633c67e-0968-48bc-864d-ccc1dad4e225","Type":"ContainerStarted","Data":"b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476"} Dec 03 08:48:16 crc kubenswrapper[4947]: I1203 08:48:16.437875 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:48:16 crc kubenswrapper[4947]: I1203 08:48:16.439800 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" event={"ID":"5bf0659e-26a6-4fb6-8866-9d0a32c497b0","Type":"ContainerStarted","Data":"d14b71ba65e842de6b1b25b3a4376b1930b58e8a3c82fb760a0525a11a70ec34"} Dec 03 08:48:16 crc kubenswrapper[4947]: I1203 08:48:16.440018 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:48:16 crc kubenswrapper[4947]: I1203 08:48:16.456098 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" podStartSLOduration=3.414996322 podStartE2EDuration="1m0.45607408s" podCreationTimestamp="2025-12-03 08:47:16 +0000 UTC" firstStartedPulling="2025-12-03 08:47:17.639438118 +0000 UTC m=+7098.900392544" lastFinishedPulling="2025-12-03 08:48:14.680515836 +0000 UTC m=+7155.941470302" observedRunningTime="2025-12-03 08:48:16.452594045 +0000 UTC m=+7157.713548471" watchObservedRunningTime="2025-12-03 08:48:16.45607408 +0000 UTC m=+7157.717028516" Dec 03 08:48:16 crc kubenswrapper[4947]: I1203 08:48:16.474441 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" podStartSLOduration=-9223371976.38035 
podStartE2EDuration="1m0.474425526s" podCreationTimestamp="2025-12-03 08:47:16 +0000 UTC" firstStartedPulling="2025-12-03 08:47:17.335593608 +0000 UTC m=+7098.596548034" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:48:16.469912784 +0000 UTC m=+7157.730867210" watchObservedRunningTime="2025-12-03 08:48:16.474425526 +0000 UTC m=+7157.735379952" Dec 03 08:48:21 crc kubenswrapper[4947]: I1203 08:48:21.754553 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:48:22 crc kubenswrapper[4947]: I1203 08:48:22.016714 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:48:22 crc kubenswrapper[4947]: I1203 08:48:22.066101 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccdb85c4c-27q42"] Dec 03 08:48:22 crc kubenswrapper[4947]: I1203 08:48:22.211217 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 08:48:22 crc kubenswrapper[4947]: I1203 08:48:22.492800 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" podUID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" containerName="dnsmasq-dns" containerID="cri-o://d14b71ba65e842de6b1b25b3a4376b1930b58e8a3c82fb760a0525a11a70ec34" gracePeriod=10 Dec 03 08:48:23 crc kubenswrapper[4947]: I1203 08:48:23.508692 4947 generic.go:334] "Generic (PLEG): container finished" podID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" containerID="d14b71ba65e842de6b1b25b3a4376b1930b58e8a3c82fb760a0525a11a70ec34" exitCode=0 Dec 03 08:48:23 crc kubenswrapper[4947]: I1203 08:48:23.508767 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" 
event={"ID":"5bf0659e-26a6-4fb6-8866-9d0a32c497b0","Type":"ContainerDied","Data":"d14b71ba65e842de6b1b25b3a4376b1930b58e8a3c82fb760a0525a11a70ec34"} Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.035470 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.133592 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-config\") pod \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.133776 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdfds\" (UniqueName: \"kubernetes.io/projected/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-kube-api-access-rdfds\") pod \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.133862 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-dns-svc\") pod \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\" (UID: \"5bf0659e-26a6-4fb6-8866-9d0a32c497b0\") " Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.144878 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-kube-api-access-rdfds" (OuterVolumeSpecName: "kube-api-access-rdfds") pod "5bf0659e-26a6-4fb6-8866-9d0a32c497b0" (UID: "5bf0659e-26a6-4fb6-8866-9d0a32c497b0"). InnerVolumeSpecName "kube-api-access-rdfds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.172445 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bf0659e-26a6-4fb6-8866-9d0a32c497b0" (UID: "5bf0659e-26a6-4fb6-8866-9d0a32c497b0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.173185 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-config" (OuterVolumeSpecName: "config") pod "5bf0659e-26a6-4fb6-8866-9d0a32c497b0" (UID: "5bf0659e-26a6-4fb6-8866-9d0a32c497b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.235018 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdfds\" (UniqueName: \"kubernetes.io/projected/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-kube-api-access-rdfds\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.235292 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.235301 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bf0659e-26a6-4fb6-8866-9d0a32c497b0-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.516754 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" event={"ID":"5bf0659e-26a6-4fb6-8866-9d0a32c497b0","Type":"ContainerDied","Data":"5c411bea8a5695c6c185feca73c341e53c239d3b1ff634c8cffc1548a1f07d0e"} Dec 
03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.516788 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccdb85c4c-27q42" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.516817 4947 scope.go:117] "RemoveContainer" containerID="d14b71ba65e842de6b1b25b3a4376b1930b58e8a3c82fb760a0525a11a70ec34" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.548874 4947 scope.go:117] "RemoveContainer" containerID="d003752c1b26d50458cd8e76b05d9a9a68d378bfe163c97555380debbf157da1" Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.548974 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccdb85c4c-27q42"] Dec 03 08:48:24 crc kubenswrapper[4947]: I1203 08:48:24.556782 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccdb85c4c-27q42"] Dec 03 08:48:25 crc kubenswrapper[4947]: I1203 08:48:25.091613 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" path="/var/lib/kubelet/pods/5bf0659e-26a6-4fb6-8866-9d0a32c497b0/volumes" Dec 03 08:48:26 crc kubenswrapper[4947]: I1203 08:48:26.083569 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:48:26 crc kubenswrapper[4947]: E1203 08:48:26.083896 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:48:33 crc kubenswrapper[4947]: I1203 08:48:33.763053 4947 generic.go:334] "Generic (PLEG): container finished" podID="901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" 
containerID="0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab" exitCode=0 Dec 03 08:48:33 crc kubenswrapper[4947]: I1203 08:48:33.763797 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc","Type":"ContainerDied","Data":"0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab"} Dec 03 08:48:34 crc kubenswrapper[4947]: I1203 08:48:34.776069 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc","Type":"ContainerStarted","Data":"1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d"} Dec 03 08:48:34 crc kubenswrapper[4947]: I1203 08:48:34.776818 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 08:48:34 crc kubenswrapper[4947]: I1203 08:48:34.778686 4947 generic.go:334] "Generic (PLEG): container finished" podID="5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" containerID="65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f" exitCode=0 Dec 03 08:48:34 crc kubenswrapper[4947]: I1203 08:48:34.778744 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c","Type":"ContainerDied","Data":"65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f"} Dec 03 08:48:34 crc kubenswrapper[4947]: I1203 08:48:34.782347 4947 generic.go:334] "Generic (PLEG): container finished" podID="687851ea-a5db-409c-9562-726c0d59a375" containerID="4f238d2a09aa16979ae97bbb3651a246990589a018e6c834e0165543a0e13189" exitCode=0 Dec 03 08:48:34 crc kubenswrapper[4947]: I1203 08:48:34.782423 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell3-server-0" event={"ID":"687851ea-a5db-409c-9562-726c0d59a375","Type":"ContainerDied","Data":"4f238d2a09aa16979ae97bbb3651a246990589a018e6c834e0165543a0e13189"} Dec 03 
08:48:34 crc kubenswrapper[4947]: I1203 08:48:34.784318 4947 generic.go:334] "Generic (PLEG): container finished" podID="2d4140e8-1bc3-4ee0-9355-5135833ce0d8" containerID="44860c1636aafb24d276925624b1b412999f3f0da95dca599f0dc177e72bc596" exitCode=0 Dec 03 08:48:34 crc kubenswrapper[4947]: I1203 08:48:34.784361 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell2-server-0" event={"ID":"2d4140e8-1bc3-4ee0-9355-5135833ce0d8","Type":"ContainerDied","Data":"44860c1636aafb24d276925624b1b412999f3f0da95dca599f0dc177e72bc596"} Dec 03 08:48:34 crc kubenswrapper[4947]: I1203 08:48:34.809997 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.349864307 podStartE2EDuration="1m19.809974495s" podCreationTimestamp="2025-12-03 08:47:15 +0000 UTC" firstStartedPulling="2025-12-03 08:47:18.290614792 +0000 UTC m=+7099.551569218" lastFinishedPulling="2025-12-03 08:47:59.75072497 +0000 UTC m=+7141.011679406" observedRunningTime="2025-12-03 08:48:34.799444361 +0000 UTC m=+7176.060398797" watchObservedRunningTime="2025-12-03 08:48:34.809974495 +0000 UTC m=+7176.070928931" Dec 03 08:48:35 crc kubenswrapper[4947]: I1203 08:48:35.793286 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell3-server-0" event={"ID":"687851ea-a5db-409c-9562-726c0d59a375","Type":"ContainerStarted","Data":"a4ad8a0c26e2b06cf40a26d9f1c8d90d59c94b704e86db0b05742cdd2651d2a9"} Dec 03 08:48:35 crc kubenswrapper[4947]: I1203 08:48:35.794254 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:48:35 crc kubenswrapper[4947]: I1203 08:48:35.796407 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell2-server-0" event={"ID":"2d4140e8-1bc3-4ee0-9355-5135833ce0d8","Type":"ContainerStarted","Data":"8b11adfc0a4e696206a531f17f967c08e0cd5f5762b448e826f449d4432ec01e"} Dec 03 08:48:35 crc 
kubenswrapper[4947]: I1203 08:48:35.796877 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:48:35 crc kubenswrapper[4947]: I1203 08:48:35.799477 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c","Type":"ContainerStarted","Data":"1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa"} Dec 03 08:48:35 crc kubenswrapper[4947]: I1203 08:48:35.799871 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:48:35 crc kubenswrapper[4947]: I1203 08:48:35.821329 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell3-server-0" podStartSLOduration=38.856404244 podStartE2EDuration="1m19.821291981s" podCreationTimestamp="2025-12-03 08:47:16 +0000 UTC" firstStartedPulling="2025-12-03 08:47:18.726458318 +0000 UTC m=+7099.987412744" lastFinishedPulling="2025-12-03 08:47:59.691346055 +0000 UTC m=+7140.952300481" observedRunningTime="2025-12-03 08:48:35.81758805 +0000 UTC m=+7177.078542476" watchObservedRunningTime="2025-12-03 08:48:35.821291981 +0000 UTC m=+7177.082246407" Dec 03 08:48:35 crc kubenswrapper[4947]: I1203 08:48:35.881013 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.282750244 podStartE2EDuration="1m19.880991553s" podCreationTimestamp="2025-12-03 08:47:16 +0000 UTC" firstStartedPulling="2025-12-03 08:47:18.045852679 +0000 UTC m=+7099.306807105" lastFinishedPulling="2025-12-03 08:47:59.644093988 +0000 UTC m=+7140.905048414" observedRunningTime="2025-12-03 08:48:35.844520698 +0000 UTC m=+7177.105475124" watchObservedRunningTime="2025-12-03 08:48:35.880991553 +0000 UTC m=+7177.141945969" Dec 03 08:48:35 crc kubenswrapper[4947]: I1203 08:48:35.884230 4947 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/rabbitmq-cell2-server-0" podStartSLOduration=38.683468601 podStartE2EDuration="1m19.88420674s" podCreationTimestamp="2025-12-03 08:47:16 +0000 UTC" firstStartedPulling="2025-12-03 08:47:18.494304976 +0000 UTC m=+7099.755259402" lastFinishedPulling="2025-12-03 08:47:59.695043075 +0000 UTC m=+7140.955997541" observedRunningTime="2025-12-03 08:48:35.879119083 +0000 UTC m=+7177.140073509" watchObservedRunningTime="2025-12-03 08:48:35.88420674 +0000 UTC m=+7177.145161166" Dec 03 08:48:39 crc kubenswrapper[4947]: I1203 08:48:39.089944 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:48:39 crc kubenswrapper[4947]: E1203 08:48:39.090591 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:48:47 crc kubenswrapper[4947]: I1203 08:48:47.518407 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:48:47 crc kubenswrapper[4947]: I1203 08:48:47.608708 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 08:48:47 crc kubenswrapper[4947]: I1203 08:48:47.826830 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell2-server-0" Dec 03 08:48:48 crc kubenswrapper[4947]: I1203 08:48:48.154770 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell3-server-0" Dec 03 08:48:52 crc kubenswrapper[4947]: I1203 08:48:52.082860 4947 scope.go:117] "RemoveContainer" 
containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:48:52 crc kubenswrapper[4947]: E1203 08:48:52.083345 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:48:55 crc kubenswrapper[4947]: I1203 08:48:55.727685 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-544b7dbbc5-76cn4"] Dec 03 08:48:55 crc kubenswrapper[4947]: E1203 08:48:55.729686 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" containerName="init" Dec 03 08:48:55 crc kubenswrapper[4947]: I1203 08:48:55.729790 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" containerName="init" Dec 03 08:48:55 crc kubenswrapper[4947]: E1203 08:48:55.729892 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" containerName="dnsmasq-dns" Dec 03 08:48:55 crc kubenswrapper[4947]: I1203 08:48:55.729968 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" containerName="dnsmasq-dns" Dec 03 08:48:55 crc kubenswrapper[4947]: I1203 08:48:55.730211 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf0659e-26a6-4fb6-8866-9d0a32c497b0" containerName="dnsmasq-dns" Dec 03 08:48:55 crc kubenswrapper[4947]: I1203 08:48:55.731303 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:55 crc kubenswrapper[4947]: I1203 08:48:55.751883 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-544b7dbbc5-76cn4"] Dec 03 08:48:55 crc kubenswrapper[4947]: I1203 08:48:55.920227 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-dns-svc\") pod \"dnsmasq-dns-544b7dbbc5-76cn4\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:55 crc kubenswrapper[4947]: I1203 08:48:55.920312 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kx7q\" (UniqueName: \"kubernetes.io/projected/34af4296-adaf-4d08-a259-833f7696e897-kube-api-access-4kx7q\") pod \"dnsmasq-dns-544b7dbbc5-76cn4\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:55 crc kubenswrapper[4947]: I1203 08:48:55.920350 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-config\") pod \"dnsmasq-dns-544b7dbbc5-76cn4\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.021560 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kx7q\" (UniqueName: \"kubernetes.io/projected/34af4296-adaf-4d08-a259-833f7696e897-kube-api-access-4kx7q\") pod \"dnsmasq-dns-544b7dbbc5-76cn4\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.021631 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-config\") pod \"dnsmasq-dns-544b7dbbc5-76cn4\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.021763 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-dns-svc\") pod \"dnsmasq-dns-544b7dbbc5-76cn4\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.022600 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-config\") pod \"dnsmasq-dns-544b7dbbc5-76cn4\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.022842 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-dns-svc\") pod \"dnsmasq-dns-544b7dbbc5-76cn4\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.046771 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kx7q\" (UniqueName: \"kubernetes.io/projected/34af4296-adaf-4d08-a259-833f7696e897-kube-api-access-4kx7q\") pod \"dnsmasq-dns-544b7dbbc5-76cn4\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.052407 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.352761 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.523362 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-544b7dbbc5-76cn4"] Dec 03 08:48:56 crc kubenswrapper[4947]: W1203 08:48:56.537867 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34af4296_adaf_4d08_a259_833f7696e897.slice/crio-20dcd4efbf67553ede0dd903138734befb965a5613797122283e1a3191caa805 WatchSource:0}: Error finding container 20dcd4efbf67553ede0dd903138734befb965a5613797122283e1a3191caa805: Status 404 returned error can't find the container with id 20dcd4efbf67553ede0dd903138734befb965a5613797122283e1a3191caa805 Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.982839 4947 generic.go:334] "Generic (PLEG): container finished" podID="34af4296-adaf-4d08-a259-833f7696e897" containerID="3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950" exitCode=0 Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.983099 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" event={"ID":"34af4296-adaf-4d08-a259-833f7696e897","Type":"ContainerDied","Data":"3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950"} Dec 03 08:48:56 crc kubenswrapper[4947]: I1203 08:48:56.983128 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" event={"ID":"34af4296-adaf-4d08-a259-833f7696e897","Type":"ContainerStarted","Data":"20dcd4efbf67553ede0dd903138734befb965a5613797122283e1a3191caa805"} Dec 03 08:48:57 crc kubenswrapper[4947]: I1203 08:48:57.115467 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 08:48:57 crc 
kubenswrapper[4947]: I1203 08:48:57.995252 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" event={"ID":"34af4296-adaf-4d08-a259-833f7696e897","Type":"ContainerStarted","Data":"cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04"} Dec 03 08:48:57 crc kubenswrapper[4947]: I1203 08:48:57.996570 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:48:58 crc kubenswrapper[4947]: I1203 08:48:58.033547 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" podStartSLOduration=3.033480304 podStartE2EDuration="3.033480304s" podCreationTimestamp="2025-12-03 08:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:48:58.03298371 +0000 UTC m=+7199.293938136" watchObservedRunningTime="2025-12-03 08:48:58.033480304 +0000 UTC m=+7199.294434730" Dec 03 08:48:58 crc kubenswrapper[4947]: I1203 08:48:58.065963 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" containerName="rabbitmq" containerID="cri-o://1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d" gracePeriod=604799 Dec 03 08:48:58 crc kubenswrapper[4947]: I1203 08:48:58.849969 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" containerName="rabbitmq" containerID="cri-o://1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa" gracePeriod=604799 Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.082847 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:49:04 crc kubenswrapper[4947]: E1203 08:49:04.083814 4947 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.713397 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.865299 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-confd\") pod \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.865391 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-erlang-cookie-secret\") pod \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.865464 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-erlang-cookie\") pod \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.865525 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6zqj\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-kube-api-access-q6zqj\") pod 
\"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.865739 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") pod \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.865785 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-server-conf\") pod \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.865952 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-pod-info\") pod \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.866161 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" (UID: "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.866688 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-plugins\") pod \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.866771 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-plugins-conf\") pod \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\" (UID: \"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc\") " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.867296 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" (UID: "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.867804 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" (UID: "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.868574 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.868778 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.868958 4947 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.871485 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" (UID: "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.871637 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-pod-info" (OuterVolumeSpecName: "pod-info") pod "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" (UID: "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.873560 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-kube-api-access-q6zqj" (OuterVolumeSpecName: "kube-api-access-q6zqj") pod "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" (UID: "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc"). InnerVolumeSpecName "kube-api-access-q6zqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.883262 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa" (OuterVolumeSpecName: "persistence") pod "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" (UID: "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc"). InnerVolumeSpecName "pvc-500efb45-ae29-4e4b-aa32-34788ab98caa". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.890579 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-server-conf" (OuterVolumeSpecName: "server-conf") pod "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" (UID: "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.948302 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" (UID: "901ce6e4-e34d-48c6-ba35-6c8a084cc6cc"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.970425 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.970462 4947 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.970473 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6zqj\" (UniqueName: \"kubernetes.io/projected/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-kube-api-access-q6zqj\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.970521 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") on node \"crc\" " Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.970532 4947 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.970541 4947 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.988111 4947 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 08:49:04 crc kubenswrapper[4947]: I1203 08:49:04.988554 4947 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-500efb45-ae29-4e4b-aa32-34788ab98caa" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa") on node "crc" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.058166 4947 generic.go:334] "Generic (PLEG): container finished" podID="901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" containerID="1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d" exitCode=0 Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.058213 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc","Type":"ContainerDied","Data":"1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d"} Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.058239 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.058266 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"901ce6e4-e34d-48c6-ba35-6c8a084cc6cc","Type":"ContainerDied","Data":"4b2e0a88f19193a6fd180d75c501b48fc0f5e7546ca3c471e3664288ffb264e6"} Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.058286 4947 scope.go:117] "RemoveContainer" containerID="1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.071955 4947 reconciler_common.go:293] "Volume detached for volume \"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.104185 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 08:49:05 crc 
kubenswrapper[4947]: I1203 08:49:05.118621 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.125692 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 08:49:05 crc kubenswrapper[4947]: E1203 08:49:05.126070 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" containerName="rabbitmq" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.126085 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" containerName="rabbitmq" Dec 03 08:49:05 crc kubenswrapper[4947]: E1203 08:49:05.126132 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" containerName="setup-container" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.126141 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" containerName="setup-container" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.126362 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" containerName="rabbitmq" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.127658 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.133041 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.133306 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.135464 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nlkrw" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.137676 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.138483 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.138946 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.142417 4947 scope.go:117] "RemoveContainer" containerID="0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.169022 4947 scope.go:117] "RemoveContainer" containerID="1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d" Dec 03 08:49:05 crc kubenswrapper[4947]: E1203 08:49:05.169775 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d\": container with ID starting with 1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d not found: ID does not exist" containerID="1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.169815 4947 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d"} err="failed to get container status \"1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d\": rpc error: code = NotFound desc = could not find container \"1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d\": container with ID starting with 1fe04bdf7e9c802536bbcaed2b96583a1c6ebfb1d47bb5c029b4e7afd64a007d not found: ID does not exist" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.169840 4947 scope.go:117] "RemoveContainer" containerID="0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab" Dec 03 08:49:05 crc kubenswrapper[4947]: E1203 08:49:05.170313 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab\": container with ID starting with 0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab not found: ID does not exist" containerID="0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.170519 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab"} err="failed to get container status \"0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab\": rpc error: code = NotFound desc = could not find container \"0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab\": container with ID starting with 0cbd86af121767565c5c5b894bba168f55c06a214ac75ea206bf37491d4c0cab not found: ID does not exist" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.287679 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a895404f-5bdf-4f7a-999f-8312a567c1d5-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.287758 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a895404f-5bdf-4f7a-999f-8312a567c1d5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.287774 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a895404f-5bdf-4f7a-999f-8312a567c1d5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.287827 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a895404f-5bdf-4f7a-999f-8312a567c1d5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.287864 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.287889 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb2cl\" (UniqueName: \"kubernetes.io/projected/a895404f-5bdf-4f7a-999f-8312a567c1d5-kube-api-access-qb2cl\") pod \"rabbitmq-server-0\" 
(UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.287920 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a895404f-5bdf-4f7a-999f-8312a567c1d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.287955 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a895404f-5bdf-4f7a-999f-8312a567c1d5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.287969 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a895404f-5bdf-4f7a-999f-8312a567c1d5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.389483 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a895404f-5bdf-4f7a-999f-8312a567c1d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.389589 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a895404f-5bdf-4f7a-999f-8312a567c1d5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " 
pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.389638 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a895404f-5bdf-4f7a-999f-8312a567c1d5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.389659 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a895404f-5bdf-4f7a-999f-8312a567c1d5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.389704 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a895404f-5bdf-4f7a-999f-8312a567c1d5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.389720 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a895404f-5bdf-4f7a-999f-8312a567c1d5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.389757 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a895404f-5bdf-4f7a-999f-8312a567c1d5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.389797 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.389817 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb2cl\" (UniqueName: \"kubernetes.io/projected/a895404f-5bdf-4f7a-999f-8312a567c1d5-kube-api-access-qb2cl\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.390804 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a895404f-5bdf-4f7a-999f-8312a567c1d5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.391202 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a895404f-5bdf-4f7a-999f-8312a567c1d5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.391529 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a895404f-5bdf-4f7a-999f-8312a567c1d5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.393612 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.393645 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d18504434ce27c90b8e5c624bb6c35f78406c4172834edaa296b0c651f18eb88/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.394146 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a895404f-5bdf-4f7a-999f-8312a567c1d5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.397566 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a895404f-5bdf-4f7a-999f-8312a567c1d5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.397576 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a895404f-5bdf-4f7a-999f-8312a567c1d5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.398678 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a895404f-5bdf-4f7a-999f-8312a567c1d5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " 
pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.414071 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb2cl\" (UniqueName: \"kubernetes.io/projected/a895404f-5bdf-4f7a-999f-8312a567c1d5-kube-api-access-qb2cl\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.444400 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-500efb45-ae29-4e4b-aa32-34788ab98caa\") pod \"rabbitmq-server-0\" (UID: \"a895404f-5bdf-4f7a-999f-8312a567c1d5\") " pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.458338 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.557332 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.605628 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-server-conf\") pod \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.606230 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-erlang-cookie-secret\") pod \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.606277 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-confd\") pod \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.606901 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-pod-info\") pod \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.606947 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-erlang-cookie\") pod \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.607023 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-qhqqn\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-kube-api-access-qhqqn\") pod \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.607115 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") pod \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.607175 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-plugins-conf\") pod \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.607199 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-plugins\") pod \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\" (UID: \"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c\") " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.614558 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" (UID: "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.615509 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" (UID: "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.615802 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" (UID: "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.617207 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-kube-api-access-qhqqn" (OuterVolumeSpecName: "kube-api-access-qhqqn") pod "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" (UID: "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c"). InnerVolumeSpecName "kube-api-access-qhqqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.619683 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" (UID: "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.625457 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-pod-info" (OuterVolumeSpecName: "pod-info") pod "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" (UID: "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.629523 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed" (OuterVolumeSpecName: "persistence") pod "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" (UID: "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c"). InnerVolumeSpecName "pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.629954 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-server-conf" (OuterVolumeSpecName: "server-conf") pod "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" (UID: "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.707922 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" (UID: "5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.708796 4947 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.708827 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.708835 4947 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.708843 4947 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.708853 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.708860 4947 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.708870 4947 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 
08:49:05.708881 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhqqn\" (UniqueName: \"kubernetes.io/projected/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c-kube-api-access-qhqqn\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.708907 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") on node \"crc\" " Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.727205 4947 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.727383 4947 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed") on node "crc" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.810331 4947 reconciler_common.go:293] "Volume detached for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:05 crc kubenswrapper[4947]: I1203 08:49:05.906003 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.053826 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.076869 4947 generic.go:334] "Generic (PLEG): container finished" podID="5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" containerID="1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa" exitCode=0 Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 
08:49:06.077031 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c","Type":"ContainerDied","Data":"1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa"} Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.077087 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c","Type":"ContainerDied","Data":"78a3b90a6763c92e4006929e78b41a5bf55a52db2b1efe4e4bef8149a124648e"} Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.077126 4947 scope.go:117] "RemoveContainer" containerID="1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.080324 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.086253 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a895404f-5bdf-4f7a-999f-8312a567c1d5","Type":"ContainerStarted","Data":"7741b3ecba775489216e1ed5c4dddbe59146d5314b45d4ab917ef300de1d1dc3"} Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.118684 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7588bd9997-rxwb7"] Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.119000 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" podUID="6633c67e-0968-48bc-864d-ccc1dad4e225" containerName="dnsmasq-dns" containerID="cri-o://b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476" gracePeriod=10 Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.124593 4947 scope.go:117] "RemoveContainer" containerID="65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.161828 
4947 scope.go:117] "RemoveContainer" containerID="1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa" Dec 03 08:49:06 crc kubenswrapper[4947]: E1203 08:49:06.165286 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa\": container with ID starting with 1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa not found: ID does not exist" containerID="1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.165332 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa"} err="failed to get container status \"1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa\": rpc error: code = NotFound desc = could not find container \"1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa\": container with ID starting with 1c00c7aa69cbe111e60585c1d2851bddc1b91d3727f95173e8f4ce45b4f536fa not found: ID does not exist" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.165358 4947 scope.go:117] "RemoveContainer" containerID="65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f" Dec 03 08:49:06 crc kubenswrapper[4947]: E1203 08:49:06.166219 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f\": container with ID starting with 65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f not found: ID does not exist" containerID="65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.166267 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f"} err="failed to get container status \"65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f\": rpc error: code = NotFound desc = could not find container \"65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f\": container with ID starting with 65a452d71377b704b21263ed92f89f4e78ea2cdb7b55dba54951c161cef2045f not found: ID does not exist" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.170699 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.178548 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.200473 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 08:49:06 crc kubenswrapper[4947]: E1203 08:49:06.203642 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" containerName="setup-container" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.203670 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" containerName="setup-container" Dec 03 08:49:06 crc kubenswrapper[4947]: E1203 08:49:06.203718 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" containerName="rabbitmq" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.203728 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" containerName="rabbitmq" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.203940 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" containerName="rabbitmq" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.205037 4947 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.211101 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.211161 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.211294 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.211563 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rczmm" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.211748 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.224074 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.325262 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0056672-0468-47c1-a7d6-5c5479eef74e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.325311 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc 
kubenswrapper[4947]: I1203 08:49:06.325346 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0056672-0468-47c1-a7d6-5c5479eef74e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.325652 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0056672-0468-47c1-a7d6-5c5479eef74e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.325697 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0056672-0468-47c1-a7d6-5c5479eef74e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.325755 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx9gc\" (UniqueName: \"kubernetes.io/projected/a0056672-0468-47c1-a7d6-5c5479eef74e-kube-api-access-nx9gc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.325800 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0056672-0468-47c1-a7d6-5c5479eef74e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc 
kubenswrapper[4947]: I1203 08:49:06.325844 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0056672-0468-47c1-a7d6-5c5479eef74e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.332242 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0056672-0468-47c1-a7d6-5c5479eef74e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.434090 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx9gc\" (UniqueName: \"kubernetes.io/projected/a0056672-0468-47c1-a7d6-5c5479eef74e-kube-api-access-nx9gc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.434164 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0056672-0468-47c1-a7d6-5c5479eef74e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.434205 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0056672-0468-47c1-a7d6-5c5479eef74e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.434268 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0056672-0468-47c1-a7d6-5c5479eef74e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.434303 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0056672-0468-47c1-a7d6-5c5479eef74e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.434332 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.434389 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0056672-0468-47c1-a7d6-5c5479eef74e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.434434 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0056672-0468-47c1-a7d6-5c5479eef74e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.434464 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0056672-0468-47c1-a7d6-5c5479eef74e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.435001 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0056672-0468-47c1-a7d6-5c5479eef74e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.435045 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0056672-0468-47c1-a7d6-5c5479eef74e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.435541 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0056672-0468-47c1-a7d6-5c5479eef74e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.435607 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0056672-0468-47c1-a7d6-5c5479eef74e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.439052 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a0056672-0468-47c1-a7d6-5c5479eef74e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.439145 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0056672-0468-47c1-a7d6-5c5479eef74e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.439259 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0056672-0468-47c1-a7d6-5c5479eef74e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.439385 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.439410 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/faa97777f376a835dba143cfa07e8f05ffcb84429b4e6bdd35b51866431b3476/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.458273 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx9gc\" (UniqueName: \"kubernetes.io/projected/a0056672-0468-47c1-a7d6-5c5479eef74e-kube-api-access-nx9gc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.490823 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0793eb4e-c63e-412d-b29a-1812ecb3c7ed\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0056672-0468-47c1-a7d6-5c5479eef74e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.524667 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.799348 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.841616 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-dns-svc\") pod \"6633c67e-0968-48bc-864d-ccc1dad4e225\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.841679 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-config\") pod \"6633c67e-0968-48bc-864d-ccc1dad4e225\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.841733 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqtdx\" (UniqueName: \"kubernetes.io/projected/6633c67e-0968-48bc-864d-ccc1dad4e225-kube-api-access-lqtdx\") pod \"6633c67e-0968-48bc-864d-ccc1dad4e225\" (UID: \"6633c67e-0968-48bc-864d-ccc1dad4e225\") " Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.847759 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6633c67e-0968-48bc-864d-ccc1dad4e225-kube-api-access-lqtdx" (OuterVolumeSpecName: "kube-api-access-lqtdx") pod "6633c67e-0968-48bc-864d-ccc1dad4e225" (UID: "6633c67e-0968-48bc-864d-ccc1dad4e225"). InnerVolumeSpecName "kube-api-access-lqtdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.880713 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6633c67e-0968-48bc-864d-ccc1dad4e225" (UID: "6633c67e-0968-48bc-864d-ccc1dad4e225"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.882544 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-config" (OuterVolumeSpecName: "config") pod "6633c67e-0968-48bc-864d-ccc1dad4e225" (UID: "6633c67e-0968-48bc-864d-ccc1dad4e225"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.942721 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.944216 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.944245 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6633c67e-0968-48bc-864d-ccc1dad4e225-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:06 crc kubenswrapper[4947]: I1203 08:49:06.944260 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqtdx\" (UniqueName: \"kubernetes.io/projected/6633c67e-0968-48bc-864d-ccc1dad4e225-kube-api-access-lqtdx\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:07 crc kubenswrapper[4947]: W1203 08:49:07.041584 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0056672_0468_47c1_a7d6_5c5479eef74e.slice/crio-3c483f0ca728a89f114143f1dd0694ea9b9a86a41204f1f864427e4bed96d74a WatchSource:0}: Error finding container 3c483f0ca728a89f114143f1dd0694ea9b9a86a41204f1f864427e4bed96d74a: Status 404 returned error can't find the container with id 3c483f0ca728a89f114143f1dd0694ea9b9a86a41204f1f864427e4bed96d74a Dec 03 08:49:07 crc 
kubenswrapper[4947]: I1203 08:49:07.103937 4947 generic.go:334] "Generic (PLEG): container finished" podID="6633c67e-0968-48bc-864d-ccc1dad4e225" containerID="b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476" exitCode=0 Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.104041 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.110321 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c" path="/var/lib/kubelet/pods/5632b5c4-c05b-4739-8bc6-7ef4ab0d4a4c/volumes" Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.111217 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="901ce6e4-e34d-48c6-ba35-6c8a084cc6cc" path="/var/lib/kubelet/pods/901ce6e4-e34d-48c6-ba35-6c8a084cc6cc/volumes" Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.112006 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0056672-0468-47c1-a7d6-5c5479eef74e","Type":"ContainerStarted","Data":"3c483f0ca728a89f114143f1dd0694ea9b9a86a41204f1f864427e4bed96d74a"} Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.112043 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" event={"ID":"6633c67e-0968-48bc-864d-ccc1dad4e225","Type":"ContainerDied","Data":"b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476"} Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.112062 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7588bd9997-rxwb7" event={"ID":"6633c67e-0968-48bc-864d-ccc1dad4e225","Type":"ContainerDied","Data":"6a3a2d59000aaf9e3328246f94121dc003999580344ab708d6903163a09a692d"} Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.112086 4947 scope.go:117] "RemoveContainer" 
containerID="b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476" Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.150219 4947 scope.go:117] "RemoveContainer" containerID="863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d" Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.153306 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7588bd9997-rxwb7"] Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.160823 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7588bd9997-rxwb7"] Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.182902 4947 scope.go:117] "RemoveContainer" containerID="b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476" Dec 03 08:49:07 crc kubenswrapper[4947]: E1203 08:49:07.183413 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476\": container with ID starting with b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476 not found: ID does not exist" containerID="b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476" Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.183476 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476"} err="failed to get container status \"b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476\": rpc error: code = NotFound desc = could not find container \"b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476\": container with ID starting with b37a47ffa455398cb198300db47f462997a02116feae5133082c328bb95da476 not found: ID does not exist" Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.183529 4947 scope.go:117] "RemoveContainer" 
containerID="863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d" Dec 03 08:49:07 crc kubenswrapper[4947]: E1203 08:49:07.183908 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d\": container with ID starting with 863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d not found: ID does not exist" containerID="863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d" Dec 03 08:49:07 crc kubenswrapper[4947]: I1203 08:49:07.183950 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d"} err="failed to get container status \"863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d\": rpc error: code = NotFound desc = could not find container \"863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d\": container with ID starting with 863020e731feb640eb9137cd5837aca54dfb7231a5f52162f5a544e502e58a9d not found: ID does not exist" Dec 03 08:49:08 crc kubenswrapper[4947]: I1203 08:49:08.112535 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a895404f-5bdf-4f7a-999f-8312a567c1d5","Type":"ContainerStarted","Data":"c80dc07040102e28e0d329781e407ec9e8523a51875d2f0bf17a3aad7c60dd52"} Dec 03 08:49:09 crc kubenswrapper[4947]: I1203 08:49:09.094290 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6633c67e-0968-48bc-864d-ccc1dad4e225" path="/var/lib/kubelet/pods/6633c67e-0968-48bc-864d-ccc1dad4e225/volumes" Dec 03 08:49:09 crc kubenswrapper[4947]: I1203 08:49:09.123432 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a0056672-0468-47c1-a7d6-5c5479eef74e","Type":"ContainerStarted","Data":"df8e6e9d9863b862dae682446da5cb975528f4ba728c5a7a09f101edf828e008"} Dec 03 08:49:17 crc kubenswrapper[4947]: I1203 08:49:17.082742 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:49:17 crc kubenswrapper[4947]: E1203 08:49:17.083401 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:49:28 crc kubenswrapper[4947]: I1203 08:49:28.083136 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:49:28 crc kubenswrapper[4947]: E1203 08:49:28.083920 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.042168 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-296jr"] Dec 03 08:49:35 crc kubenswrapper[4947]: E1203 08:49:35.043588 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6633c67e-0968-48bc-864d-ccc1dad4e225" containerName="init" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.043619 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6633c67e-0968-48bc-864d-ccc1dad4e225" containerName="init" Dec 03 08:49:35 crc kubenswrapper[4947]: E1203 08:49:35.043701 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6633c67e-0968-48bc-864d-ccc1dad4e225" containerName="dnsmasq-dns" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.043722 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6633c67e-0968-48bc-864d-ccc1dad4e225" containerName="dnsmasq-dns" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.044121 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6633c67e-0968-48bc-864d-ccc1dad4e225" containerName="dnsmasq-dns" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.046841 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.060896 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-296jr"] Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.142969 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cdkf\" (UniqueName: \"kubernetes.io/projected/4d0ec2dd-c887-4271-8a25-9935963cfcfa-kube-api-access-2cdkf\") pod \"community-operators-296jr\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.143035 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-utilities\") pod \"community-operators-296jr\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.143102 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-catalog-content\") pod \"community-operators-296jr\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.244091 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-catalog-content\") pod \"community-operators-296jr\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.244269 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cdkf\" (UniqueName: \"kubernetes.io/projected/4d0ec2dd-c887-4271-8a25-9935963cfcfa-kube-api-access-2cdkf\") pod \"community-operators-296jr\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.244301 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-utilities\") pod \"community-operators-296jr\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.245820 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-catalog-content\") pod \"community-operators-296jr\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.246759 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-utilities\") pod \"community-operators-296jr\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.268729 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cdkf\" (UniqueName: \"kubernetes.io/projected/4d0ec2dd-c887-4271-8a25-9935963cfcfa-kube-api-access-2cdkf\") pod \"community-operators-296jr\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.390216 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:35 crc kubenswrapper[4947]: I1203 08:49:35.663205 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-296jr"] Dec 03 08:49:36 crc kubenswrapper[4947]: I1203 08:49:36.358582 4947 generic.go:334] "Generic (PLEG): container finished" podID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerID="b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30" exitCode=0 Dec 03 08:49:36 crc kubenswrapper[4947]: I1203 08:49:36.358903 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-296jr" event={"ID":"4d0ec2dd-c887-4271-8a25-9935963cfcfa","Type":"ContainerDied","Data":"b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30"} Dec 03 08:49:36 crc kubenswrapper[4947]: I1203 08:49:36.359364 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-296jr" event={"ID":"4d0ec2dd-c887-4271-8a25-9935963cfcfa","Type":"ContainerStarted","Data":"0bc185b569763dc9acecb53d7e151608eb1544186308db90db6cf7886dd902e1"} Dec 03 08:49:37 crc kubenswrapper[4947]: I1203 08:49:37.368751 4947 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-296jr" event={"ID":"4d0ec2dd-c887-4271-8a25-9935963cfcfa","Type":"ContainerStarted","Data":"9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f"} Dec 03 08:49:38 crc kubenswrapper[4947]: I1203 08:49:38.384616 4947 generic.go:334] "Generic (PLEG): container finished" podID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerID="9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f" exitCode=0 Dec 03 08:49:38 crc kubenswrapper[4947]: I1203 08:49:38.384733 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-296jr" event={"ID":"4d0ec2dd-c887-4271-8a25-9935963cfcfa","Type":"ContainerDied","Data":"9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f"} Dec 03 08:49:39 crc kubenswrapper[4947]: I1203 08:49:39.399989 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-296jr" event={"ID":"4d0ec2dd-c887-4271-8a25-9935963cfcfa","Type":"ContainerStarted","Data":"ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55"} Dec 03 08:49:39 crc kubenswrapper[4947]: I1203 08:49:39.432443 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-296jr" podStartSLOduration=1.995620713 podStartE2EDuration="4.432420823s" podCreationTimestamp="2025-12-03 08:49:35 +0000 UTC" firstStartedPulling="2025-12-03 08:49:36.361452968 +0000 UTC m=+7237.622407404" lastFinishedPulling="2025-12-03 08:49:38.798253048 +0000 UTC m=+7240.059207514" observedRunningTime="2025-12-03 08:49:39.420617914 +0000 UTC m=+7240.681572380" watchObservedRunningTime="2025-12-03 08:49:39.432420823 +0000 UTC m=+7240.693375269" Dec 03 08:49:40 crc kubenswrapper[4947]: I1203 08:49:40.083707 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:49:40 crc kubenswrapper[4947]: 
E1203 08:49:40.083985 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:49:40 crc kubenswrapper[4947]: I1203 08:49:40.409745 4947 generic.go:334] "Generic (PLEG): container finished" podID="a895404f-5bdf-4f7a-999f-8312a567c1d5" containerID="c80dc07040102e28e0d329781e407ec9e8523a51875d2f0bf17a3aad7c60dd52" exitCode=0 Dec 03 08:49:40 crc kubenswrapper[4947]: I1203 08:49:40.409840 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a895404f-5bdf-4f7a-999f-8312a567c1d5","Type":"ContainerDied","Data":"c80dc07040102e28e0d329781e407ec9e8523a51875d2f0bf17a3aad7c60dd52"} Dec 03 08:49:41 crc kubenswrapper[4947]: I1203 08:49:41.419658 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a895404f-5bdf-4f7a-999f-8312a567c1d5","Type":"ContainerStarted","Data":"09c8d9100c7c6cdc3152bce87b31da493f422985935010abdafa32bcef5b8494"} Dec 03 08:49:41 crc kubenswrapper[4947]: I1203 08:49:41.420372 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 08:49:41 crc kubenswrapper[4947]: I1203 08:49:41.421667 4947 generic.go:334] "Generic (PLEG): container finished" podID="a0056672-0468-47c1-a7d6-5c5479eef74e" containerID="df8e6e9d9863b862dae682446da5cb975528f4ba728c5a7a09f101edf828e008" exitCode=0 Dec 03 08:49:41 crc kubenswrapper[4947]: I1203 08:49:41.421696 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a0056672-0468-47c1-a7d6-5c5479eef74e","Type":"ContainerDied","Data":"df8e6e9d9863b862dae682446da5cb975528f4ba728c5a7a09f101edf828e008"} Dec 03 08:49:41 crc kubenswrapper[4947]: I1203 08:49:41.456399 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.456377459 podStartE2EDuration="36.456377459s" podCreationTimestamp="2025-12-03 08:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:49:41.452938595 +0000 UTC m=+7242.713893021" watchObservedRunningTime="2025-12-03 08:49:41.456377459 +0000 UTC m=+7242.717331895" Dec 03 08:49:42 crc kubenswrapper[4947]: I1203 08:49:42.431457 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0056672-0468-47c1-a7d6-5c5479eef74e","Type":"ContainerStarted","Data":"1915cd3e075ae156cdea266f0892f921daf2722b78b7fdcee8a864d75cb96084"} Dec 03 08:49:42 crc kubenswrapper[4947]: I1203 08:49:42.432555 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:42 crc kubenswrapper[4947]: I1203 08:49:42.454332 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.454305702 podStartE2EDuration="36.454305702s" podCreationTimestamp="2025-12-03 08:49:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:49:42.450822268 +0000 UTC m=+7243.711776694" watchObservedRunningTime="2025-12-03 08:49:42.454305702 +0000 UTC m=+7243.715260148" Dec 03 08:49:45 crc kubenswrapper[4947]: I1203 08:49:45.391273 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:45 crc kubenswrapper[4947]: I1203 
08:49:45.391772 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:45 crc kubenswrapper[4947]: I1203 08:49:45.453771 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:45 crc kubenswrapper[4947]: I1203 08:49:45.552535 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:45 crc kubenswrapper[4947]: I1203 08:49:45.701578 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-296jr"] Dec 03 08:49:47 crc kubenswrapper[4947]: I1203 08:49:47.478096 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-296jr" podUID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerName="registry-server" containerID="cri-o://ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55" gracePeriod=2 Dec 03 08:49:47 crc kubenswrapper[4947]: I1203 08:49:47.889067 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:47 crc kubenswrapper[4947]: I1203 08:49:47.957452 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-utilities\") pod \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " Dec 03 08:49:47 crc kubenswrapper[4947]: I1203 08:49:47.957511 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-catalog-content\") pod \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " Dec 03 08:49:47 crc kubenswrapper[4947]: I1203 08:49:47.957617 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cdkf\" (UniqueName: \"kubernetes.io/projected/4d0ec2dd-c887-4271-8a25-9935963cfcfa-kube-api-access-2cdkf\") pod \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\" (UID: \"4d0ec2dd-c887-4271-8a25-9935963cfcfa\") " Dec 03 08:49:47 crc kubenswrapper[4947]: I1203 08:49:47.958250 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-utilities" (OuterVolumeSpecName: "utilities") pod "4d0ec2dd-c887-4271-8a25-9935963cfcfa" (UID: "4d0ec2dd-c887-4271-8a25-9935963cfcfa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:49:47 crc kubenswrapper[4947]: I1203 08:49:47.963223 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d0ec2dd-c887-4271-8a25-9935963cfcfa-kube-api-access-2cdkf" (OuterVolumeSpecName: "kube-api-access-2cdkf") pod "4d0ec2dd-c887-4271-8a25-9935963cfcfa" (UID: "4d0ec2dd-c887-4271-8a25-9935963cfcfa"). InnerVolumeSpecName "kube-api-access-2cdkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.059355 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cdkf\" (UniqueName: \"kubernetes.io/projected/4d0ec2dd-c887-4271-8a25-9935963cfcfa-kube-api-access-2cdkf\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.059386 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.099958 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d0ec2dd-c887-4271-8a25-9935963cfcfa" (UID: "4d0ec2dd-c887-4271-8a25-9935963cfcfa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.161082 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d0ec2dd-c887-4271-8a25-9935963cfcfa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.486986 4947 generic.go:334] "Generic (PLEG): container finished" podID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerID="ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55" exitCode=0 Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.487028 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-296jr" event={"ID":"4d0ec2dd-c887-4271-8a25-9935963cfcfa","Type":"ContainerDied","Data":"ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55"} Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.487053 4947 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-296jr" event={"ID":"4d0ec2dd-c887-4271-8a25-9935963cfcfa","Type":"ContainerDied","Data":"0bc185b569763dc9acecb53d7e151608eb1544186308db90db6cf7886dd902e1"} Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.487069 4947 scope.go:117] "RemoveContainer" containerID="ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.487079 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-296jr" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.531199 4947 scope.go:117] "RemoveContainer" containerID="9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.531385 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-296jr"] Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.542873 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-296jr"] Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.562358 4947 scope.go:117] "RemoveContainer" containerID="b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.615135 4947 scope.go:117] "RemoveContainer" containerID="ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55" Dec 03 08:49:48 crc kubenswrapper[4947]: E1203 08:49:48.615990 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55\": container with ID starting with ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55 not found: ID does not exist" containerID="ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 
08:49:48.616046 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55"} err="failed to get container status \"ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55\": rpc error: code = NotFound desc = could not find container \"ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55\": container with ID starting with ce04f473b6fc583fcc437d78290a3a82c234a98f7f64b847efb0d8f04db49b55 not found: ID does not exist" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.616076 4947 scope.go:117] "RemoveContainer" containerID="9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f" Dec 03 08:49:48 crc kubenswrapper[4947]: E1203 08:49:48.616790 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f\": container with ID starting with 9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f not found: ID does not exist" containerID="9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.616864 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f"} err="failed to get container status \"9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f\": rpc error: code = NotFound desc = could not find container \"9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f\": container with ID starting with 9957caf0f1a6e6e760038726089f8662adb758b7a82f5592fb7e569b5ac3bc1f not found: ID does not exist" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.616906 4947 scope.go:117] "RemoveContainer" containerID="b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30" Dec 03 08:49:48 crc 
kubenswrapper[4947]: E1203 08:49:48.617302 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30\": container with ID starting with b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30 not found: ID does not exist" containerID="b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30" Dec 03 08:49:48 crc kubenswrapper[4947]: I1203 08:49:48.617328 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30"} err="failed to get container status \"b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30\": rpc error: code = NotFound desc = could not find container \"b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30\": container with ID starting with b2caa70b7a425ebfc6365173e5fe9effc9d89bf202cda6cf2fa333b94f50db30 not found: ID does not exist" Dec 03 08:49:49 crc kubenswrapper[4947]: I1203 08:49:49.097008 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" path="/var/lib/kubelet/pods/4d0ec2dd-c887-4271-8a25-9935963cfcfa/volumes" Dec 03 08:49:55 crc kubenswrapper[4947]: I1203 08:49:55.083665 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:49:55 crc kubenswrapper[4947]: E1203 08:49:55.084771 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:49:55 crc 
kubenswrapper[4947]: I1203 08:49:55.462422 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 08:49:56 crc kubenswrapper[4947]: I1203 08:49:56.528714 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.807807 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 08:49:57 crc kubenswrapper[4947]: E1203 08:49:57.808376 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerName="extract-content" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.808405 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerName="extract-content" Dec 03 08:49:57 crc kubenswrapper[4947]: E1203 08:49:57.808445 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerName="registry-server" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.808461 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerName="registry-server" Dec 03 08:49:57 crc kubenswrapper[4947]: E1203 08:49:57.808598 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerName="extract-utilities" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.808619 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerName="extract-utilities" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.809037 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0ec2dd-c887-4271-8a25-9935963cfcfa" containerName="registry-server" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.810183 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.814584 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kk56l" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.830058 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.932572 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6mkk\" (UniqueName: \"kubernetes.io/projected/5a046d05-a527-424c-a76e-34a6343cfd74-kube-api-access-x6mkk\") pod \"mariadb-client-1-default\" (UID: \"5a046d05-a527-424c-a76e-34a6343cfd74\") " pod="openstack/mariadb-client-1-default" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.951115 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z2sfr"] Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.952984 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:57 crc kubenswrapper[4947]: I1203 08:49:57.972031 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2sfr"] Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.035116 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6mkk\" (UniqueName: \"kubernetes.io/projected/5a046d05-a527-424c-a76e-34a6343cfd74-kube-api-access-x6mkk\") pod \"mariadb-client-1-default\" (UID: \"5a046d05-a527-424c-a76e-34a6343cfd74\") " pod="openstack/mariadb-client-1-default" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.056303 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6mkk\" (UniqueName: \"kubernetes.io/projected/5a046d05-a527-424c-a76e-34a6343cfd74-kube-api-access-x6mkk\") pod \"mariadb-client-1-default\" (UID: \"5a046d05-a527-424c-a76e-34a6343cfd74\") " pod="openstack/mariadb-client-1-default" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.136426 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-utilities\") pod \"redhat-marketplace-z2sfr\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.136655 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p4n9\" (UniqueName: \"kubernetes.io/projected/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-kube-api-access-6p4n9\") pod \"redhat-marketplace-z2sfr\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.136877 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-catalog-content\") pod \"redhat-marketplace-z2sfr\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.184118 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.238288 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-utilities\") pod \"redhat-marketplace-z2sfr\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.238380 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p4n9\" (UniqueName: \"kubernetes.io/projected/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-kube-api-access-6p4n9\") pod \"redhat-marketplace-z2sfr\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.238440 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-catalog-content\") pod \"redhat-marketplace-z2sfr\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.238822 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-catalog-content\") pod \"redhat-marketplace-z2sfr\" (UID: 
\"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.238895 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-utilities\") pod \"redhat-marketplace-z2sfr\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.255829 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p4n9\" (UniqueName: \"kubernetes.io/projected/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-kube-api-access-6p4n9\") pod \"redhat-marketplace-z2sfr\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.293339 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.717135 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 08:49:58 crc kubenswrapper[4947]: I1203 08:49:58.759098 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2sfr"] Dec 03 08:49:58 crc kubenswrapper[4947]: W1203 08:49:58.764901 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1db9b190_6dad_476a_9b3d_c6ba48c6bb3d.slice/crio-3f48755f9070072d1dafe760df168e2d4c17e601a639318d92efa26146ce8d26 WatchSource:0}: Error finding container 3f48755f9070072d1dafe760df168e2d4c17e601a639318d92efa26146ce8d26: Status 404 returned error can't find the container with id 3f48755f9070072d1dafe760df168e2d4c17e601a639318d92efa26146ce8d26 Dec 03 08:49:59 crc kubenswrapper[4947]: I1203 08:49:59.604362 
4947 generic.go:334] "Generic (PLEG): container finished" podID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerID="cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da" exitCode=0 Dec 03 08:49:59 crc kubenswrapper[4947]: I1203 08:49:59.604550 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2sfr" event={"ID":"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d","Type":"ContainerDied","Data":"cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da"} Dec 03 08:49:59 crc kubenswrapper[4947]: I1203 08:49:59.604961 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2sfr" event={"ID":"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d","Type":"ContainerStarted","Data":"3f48755f9070072d1dafe760df168e2d4c17e601a639318d92efa26146ce8d26"} Dec 03 08:49:59 crc kubenswrapper[4947]: I1203 08:49:59.607876 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"5a046d05-a527-424c-a76e-34a6343cfd74","Type":"ContainerStarted","Data":"003e33db8e277013d7f1526a394ae55d6cd94d1e72da3e242799d1feceb4989a"} Dec 03 08:50:00 crc kubenswrapper[4947]: I1203 08:50:00.616803 4947 generic.go:334] "Generic (PLEG): container finished" podID="5a046d05-a527-424c-a76e-34a6343cfd74" containerID="2ca2399af38763ea022d7d8ce682596d3074eb3a5c6f1ac77d87208699e60d19" exitCode=0 Dec 03 08:50:00 crc kubenswrapper[4947]: I1203 08:50:00.616848 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"5a046d05-a527-424c-a76e-34a6343cfd74","Type":"ContainerDied","Data":"2ca2399af38763ea022d7d8ce682596d3074eb3a5c6f1ac77d87208699e60d19"} Dec 03 08:50:01 crc kubenswrapper[4947]: I1203 08:50:01.629442 4947 generic.go:334] "Generic (PLEG): container finished" podID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerID="3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb" exitCode=0 Dec 03 08:50:01 crc 
kubenswrapper[4947]: I1203 08:50:01.629629 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2sfr" event={"ID":"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d","Type":"ContainerDied","Data":"3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb"} Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.160138 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.190991 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_5a046d05-a527-424c-a76e-34a6343cfd74/mariadb-client-1-default/0.log" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.219233 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.227090 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.313671 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6mkk\" (UniqueName: \"kubernetes.io/projected/5a046d05-a527-424c-a76e-34a6343cfd74-kube-api-access-x6mkk\") pod \"5a046d05-a527-424c-a76e-34a6343cfd74\" (UID: \"5a046d05-a527-424c-a76e-34a6343cfd74\") " Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.318838 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a046d05-a527-424c-a76e-34a6343cfd74-kube-api-access-x6mkk" (OuterVolumeSpecName: "kube-api-access-x6mkk") pod "5a046d05-a527-424c-a76e-34a6343cfd74" (UID: "5a046d05-a527-424c-a76e-34a6343cfd74"). InnerVolumeSpecName "kube-api-access-x6mkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.415650 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6mkk\" (UniqueName: \"kubernetes.io/projected/5a046d05-a527-424c-a76e-34a6343cfd74-kube-api-access-x6mkk\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.421281 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-cell1"] Dec 03 08:50:02 crc kubenswrapper[4947]: E1203 08:50:02.421589 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a046d05-a527-424c-a76e-34a6343cfd74" containerName="mariadb-client-1-default" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.421600 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a046d05-a527-424c-a76e-34a6343cfd74" containerName="mariadb-client-1-default" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.421815 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a046d05-a527-424c-a76e-34a6343cfd74" containerName="mariadb-client-1-default" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.422299 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-cell1" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.456574 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-cell1"] Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.618394 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlxl7\" (UniqueName: \"kubernetes.io/projected/6723e63f-aa53-4642-84a7-d398e14c0047-kube-api-access-qlxl7\") pod \"mariadb-client-1-cell1\" (UID: \"6723e63f-aa53-4642-84a7-d398e14c0047\") " pod="openstack/mariadb-client-1-cell1" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.645376 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003e33db8e277013d7f1526a394ae55d6cd94d1e72da3e242799d1feceb4989a" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.645403 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.648920 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2sfr" event={"ID":"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d","Type":"ContainerStarted","Data":"5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e"} Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.695388 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z2sfr" podStartSLOduration=3.275215749 podStartE2EDuration="5.695360229s" podCreationTimestamp="2025-12-03 08:49:57 +0000 UTC" firstStartedPulling="2025-12-03 08:49:59.608863215 +0000 UTC m=+7260.869817691" lastFinishedPulling="2025-12-03 08:50:02.029007745 +0000 UTC m=+7263.289962171" observedRunningTime="2025-12-03 08:50:02.683727775 +0000 UTC m=+7263.944682201" watchObservedRunningTime="2025-12-03 08:50:02.695360229 +0000 UTC m=+7263.956314695" Dec 03 
08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.719650 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlxl7\" (UniqueName: \"kubernetes.io/projected/6723e63f-aa53-4642-84a7-d398e14c0047-kube-api-access-qlxl7\") pod \"mariadb-client-1-cell1\" (UID: \"6723e63f-aa53-4642-84a7-d398e14c0047\") " pod="openstack/mariadb-client-1-cell1" Dec 03 08:50:02 crc kubenswrapper[4947]: I1203 08:50:02.753246 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlxl7\" (UniqueName: \"kubernetes.io/projected/6723e63f-aa53-4642-84a7-d398e14c0047-kube-api-access-qlxl7\") pod \"mariadb-client-1-cell1\" (UID: \"6723e63f-aa53-4642-84a7-d398e14c0047\") " pod="openstack/mariadb-client-1-cell1" Dec 03 08:50:03 crc kubenswrapper[4947]: I1203 08:50:03.042177 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-cell1" Dec 03 08:50:03 crc kubenswrapper[4947]: I1203 08:50:03.101770 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a046d05-a527-424c-a76e-34a6343cfd74" path="/var/lib/kubelet/pods/5a046d05-a527-424c-a76e-34a6343cfd74/volumes" Dec 03 08:50:03 crc kubenswrapper[4947]: I1203 08:50:03.612286 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-cell1"] Dec 03 08:50:03 crc kubenswrapper[4947]: W1203 08:50:03.618200 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6723e63f_aa53_4642_84a7_d398e14c0047.slice/crio-9b848ae4534d2a44aa2b7c12d42588c7de89c4bf9c1be56a6f91b93c9bc8a983 WatchSource:0}: Error finding container 9b848ae4534d2a44aa2b7c12d42588c7de89c4bf9c1be56a6f91b93c9bc8a983: Status 404 returned error can't find the container with id 9b848ae4534d2a44aa2b7c12d42588c7de89c4bf9c1be56a6f91b93c9bc8a983 Dec 03 08:50:03 crc kubenswrapper[4947]: I1203 08:50:03.659986 4947 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/mariadb-client-1-cell1" event={"ID":"6723e63f-aa53-4642-84a7-d398e14c0047","Type":"ContainerStarted","Data":"9b848ae4534d2a44aa2b7c12d42588c7de89c4bf9c1be56a6f91b93c9bc8a983"} Dec 03 08:50:04 crc kubenswrapper[4947]: I1203 08:50:04.674472 4947 generic.go:334] "Generic (PLEG): container finished" podID="6723e63f-aa53-4642-84a7-d398e14c0047" containerID="83f621afc8477264eaae8dd259684e83b30f848ae6a10eca3f64dda53974f8f2" exitCode=0 Dec 03 08:50:04 crc kubenswrapper[4947]: I1203 08:50:04.674539 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-cell1" event={"ID":"6723e63f-aa53-4642-84a7-d398e14c0047","Type":"ContainerDied","Data":"83f621afc8477264eaae8dd259684e83b30f848ae6a10eca3f64dda53974f8f2"} Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.060576 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-cell1" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.083353 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:50:06 crc kubenswrapper[4947]: E1203 08:50:06.083744 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.083914 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-cell1_6723e63f-aa53-4642-84a7-d398e14c0047/mariadb-client-1-cell1/0.log" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.112346 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/mariadb-client-1-cell1"] Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.118969 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-cell1"] Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.209159 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlxl7\" (UniqueName: \"kubernetes.io/projected/6723e63f-aa53-4642-84a7-d398e14c0047-kube-api-access-qlxl7\") pod \"6723e63f-aa53-4642-84a7-d398e14c0047\" (UID: \"6723e63f-aa53-4642-84a7-d398e14c0047\") " Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.215696 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6723e63f-aa53-4642-84a7-d398e14c0047-kube-api-access-qlxl7" (OuterVolumeSpecName: "kube-api-access-qlxl7") pod "6723e63f-aa53-4642-84a7-d398e14c0047" (UID: "6723e63f-aa53-4642-84a7-d398e14c0047"). InnerVolumeSpecName "kube-api-access-qlxl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.258925 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-cell2"] Dec 03 08:50:06 crc kubenswrapper[4947]: E1203 08:50:06.259246 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6723e63f-aa53-4642-84a7-d398e14c0047" containerName="mariadb-client-1-cell1" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.259264 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6723e63f-aa53-4642-84a7-d398e14c0047" containerName="mariadb-client-1-cell1" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.259463 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6723e63f-aa53-4642-84a7-d398e14c0047" containerName="mariadb-client-1-cell1" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.260092 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-cell2" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.265325 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-cell2"] Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.311265 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlxl7\" (UniqueName: \"kubernetes.io/projected/6723e63f-aa53-4642-84a7-d398e14c0047-kube-api-access-qlxl7\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.413196 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m59jf\" (UniqueName: \"kubernetes.io/projected/2c021ddc-a6f5-44c3-9d0d-6184a6f459b4-kube-api-access-m59jf\") pod \"mariadb-client-1-cell2\" (UID: \"2c021ddc-a6f5-44c3-9d0d-6184a6f459b4\") " pod="openstack/mariadb-client-1-cell2" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.514189 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m59jf\" (UniqueName: \"kubernetes.io/projected/2c021ddc-a6f5-44c3-9d0d-6184a6f459b4-kube-api-access-m59jf\") pod \"mariadb-client-1-cell2\" (UID: \"2c021ddc-a6f5-44c3-9d0d-6184a6f459b4\") " pod="openstack/mariadb-client-1-cell2" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.532264 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m59jf\" (UniqueName: \"kubernetes.io/projected/2c021ddc-a6f5-44c3-9d0d-6184a6f459b4-kube-api-access-m59jf\") pod \"mariadb-client-1-cell2\" (UID: \"2c021ddc-a6f5-44c3-9d0d-6184a6f459b4\") " pod="openstack/mariadb-client-1-cell2" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.583436 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-cell2" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.697791 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b848ae4534d2a44aa2b7c12d42588c7de89c4bf9c1be56a6f91b93c9bc8a983" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.697874 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-cell1" Dec 03 08:50:06 crc kubenswrapper[4947]: I1203 08:50:06.997018 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-cell2"] Dec 03 08:50:07 crc kubenswrapper[4947]: W1203 08:50:07.002408 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c021ddc_a6f5_44c3_9d0d_6184a6f459b4.slice/crio-8d65d52c8e17aeec51d57323e0a8d3bf1b9904ce3dc0e7aa98969cca260e147f WatchSource:0}: Error finding container 8d65d52c8e17aeec51d57323e0a8d3bf1b9904ce3dc0e7aa98969cca260e147f: Status 404 returned error can't find the container with id 8d65d52c8e17aeec51d57323e0a8d3bf1b9904ce3dc0e7aa98969cca260e147f Dec 03 08:50:07 crc kubenswrapper[4947]: I1203 08:50:07.093743 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6723e63f-aa53-4642-84a7-d398e14c0047" path="/var/lib/kubelet/pods/6723e63f-aa53-4642-84a7-d398e14c0047/volumes" Dec 03 08:50:07 crc kubenswrapper[4947]: I1203 08:50:07.712127 4947 generic.go:334] "Generic (PLEG): container finished" podID="2c021ddc-a6f5-44c3-9d0d-6184a6f459b4" containerID="60f2f09bdfac3ec3139a5ea45f17ee8b52d3025c54e0200afcde4cdd0ca5caca" exitCode=0 Dec 03 08:50:07 crc kubenswrapper[4947]: I1203 08:50:07.712191 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-cell2" event={"ID":"2c021ddc-a6f5-44c3-9d0d-6184a6f459b4","Type":"ContainerDied","Data":"60f2f09bdfac3ec3139a5ea45f17ee8b52d3025c54e0200afcde4cdd0ca5caca"} Dec 03 08:50:07 crc 
kubenswrapper[4947]: I1203 08:50:07.712576 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-cell2" event={"ID":"2c021ddc-a6f5-44c3-9d0d-6184a6f459b4","Type":"ContainerStarted","Data":"8d65d52c8e17aeec51d57323e0a8d3bf1b9904ce3dc0e7aa98969cca260e147f"} Dec 03 08:50:08 crc kubenswrapper[4947]: I1203 08:50:08.294938 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:50:08 crc kubenswrapper[4947]: I1203 08:50:08.295254 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:50:08 crc kubenswrapper[4947]: I1203 08:50:08.382750 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:50:08 crc kubenswrapper[4947]: I1203 08:50:08.815280 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.166386 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-cell2" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.184141 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-cell2_2c021ddc-a6f5-44c3-9d0d-6184a6f459b4/mariadb-client-1-cell2/0.log" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.214346 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-cell2"] Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.220148 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-cell2"] Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.263672 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m59jf\" (UniqueName: \"kubernetes.io/projected/2c021ddc-a6f5-44c3-9d0d-6184a6f459b4-kube-api-access-m59jf\") pod \"2c021ddc-a6f5-44c3-9d0d-6184a6f459b4\" (UID: \"2c021ddc-a6f5-44c3-9d0d-6184a6f459b4\") " Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.272107 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c021ddc-a6f5-44c3-9d0d-6184a6f459b4-kube-api-access-m59jf" (OuterVolumeSpecName: "kube-api-access-m59jf") pod "2c021ddc-a6f5-44c3-9d0d-6184a6f459b4" (UID: "2c021ddc-a6f5-44c3-9d0d-6184a6f459b4"). InnerVolumeSpecName "kube-api-access-m59jf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.366042 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m59jf\" (UniqueName: \"kubernetes.io/projected/2c021ddc-a6f5-44c3-9d0d-6184a6f459b4-kube-api-access-m59jf\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.732705 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d65d52c8e17aeec51d57323e0a8d3bf1b9904ce3dc0e7aa98969cca260e147f" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.732724 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-cell2" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.788192 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 08:50:09 crc kubenswrapper[4947]: E1203 08:50:09.788696 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c021ddc-a6f5-44c3-9d0d-6184a6f459b4" containerName="mariadb-client-1-cell2" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.788726 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c021ddc-a6f5-44c3-9d0d-6184a6f459b4" containerName="mariadb-client-1-cell2" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.788966 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c021ddc-a6f5-44c3-9d0d-6184a6f459b4" containerName="mariadb-client-1-cell2" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.789831 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.792337 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kk56l" Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.807851 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.935001 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2sfr"] Dec 03 08:50:09 crc kubenswrapper[4947]: I1203 08:50:09.977899 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc4gj\" (UniqueName: \"kubernetes.io/projected/85a03670-cea5-47a1-b778-890ce2083dd9-kube-api-access-dc4gj\") pod \"mariadb-client-2-default\" (UID: \"85a03670-cea5-47a1-b778-890ce2083dd9\") " pod="openstack/mariadb-client-2-default" Dec 03 08:50:10 crc kubenswrapper[4947]: I1203 08:50:10.079209 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc4gj\" (UniqueName: \"kubernetes.io/projected/85a03670-cea5-47a1-b778-890ce2083dd9-kube-api-access-dc4gj\") pod \"mariadb-client-2-default\" (UID: \"85a03670-cea5-47a1-b778-890ce2083dd9\") " pod="openstack/mariadb-client-2-default" Dec 03 08:50:10 crc kubenswrapper[4947]: I1203 08:50:10.102081 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc4gj\" (UniqueName: \"kubernetes.io/projected/85a03670-cea5-47a1-b778-890ce2083dd9-kube-api-access-dc4gj\") pod \"mariadb-client-2-default\" (UID: \"85a03670-cea5-47a1-b778-890ce2083dd9\") " pod="openstack/mariadb-client-2-default" Dec 03 08:50:10 crc kubenswrapper[4947]: I1203 08:50:10.114448 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 03 08:50:10 crc kubenswrapper[4947]: I1203 08:50:10.461902 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 08:50:10 crc kubenswrapper[4947]: I1203 08:50:10.740750 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"85a03670-cea5-47a1-b778-890ce2083dd9","Type":"ContainerStarted","Data":"0a2196ec9e60bf5895093ad6d1067123d66b16940956e9cefdd92250bba01631"} Dec 03 08:50:10 crc kubenswrapper[4947]: I1203 08:50:10.740804 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"85a03670-cea5-47a1-b778-890ce2083dd9","Type":"ContainerStarted","Data":"4568664c838ef0daae802708f42e49f3b0b71a109049c0d41486508089a1021a"} Dec 03 08:50:10 crc kubenswrapper[4947]: I1203 08:50:10.762698 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.762677031 podStartE2EDuration="1.762677031s" podCreationTimestamp="2025-12-03 08:50:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:50:10.756585026 +0000 UTC m=+7272.017539452" watchObservedRunningTime="2025-12-03 08:50:10.762677031 +0000 UTC m=+7272.023631457" Dec 03 08:50:11 crc kubenswrapper[4947]: I1203 08:50:11.114449 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c021ddc-a6f5-44c3-9d0d-6184a6f459b4" path="/var/lib/kubelet/pods/2c021ddc-a6f5-44c3-9d0d-6184a6f459b4/volumes" Dec 03 08:50:11 crc kubenswrapper[4947]: I1203 08:50:11.750246 4947 generic.go:334] "Generic (PLEG): container finished" podID="85a03670-cea5-47a1-b778-890ce2083dd9" containerID="0a2196ec9e60bf5895093ad6d1067123d66b16940956e9cefdd92250bba01631" exitCode=1 Dec 03 08:50:11 crc kubenswrapper[4947]: I1203 08:50:11.750320 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"85a03670-cea5-47a1-b778-890ce2083dd9","Type":"ContainerDied","Data":"0a2196ec9e60bf5895093ad6d1067123d66b16940956e9cefdd92250bba01631"} Dec 03 08:50:11 crc kubenswrapper[4947]: I1203 08:50:11.750522 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z2sfr" podUID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerName="registry-server" containerID="cri-o://5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e" gracePeriod=2 Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.195947 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.324176 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-utilities\") pod \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.324554 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-catalog-content\") pod \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.324586 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p4n9\" (UniqueName: \"kubernetes.io/projected/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-kube-api-access-6p4n9\") pod \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\" (UID: \"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d\") " Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.325081 4947 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-utilities" (OuterVolumeSpecName: "utilities") pod "1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" (UID: "1db9b190-6dad-476a-9b3d-c6ba48c6bb3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.331969 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-kube-api-access-6p4n9" (OuterVolumeSpecName: "kube-api-access-6p4n9") pod "1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" (UID: "1db9b190-6dad-476a-9b3d-c6ba48c6bb3d"). InnerVolumeSpecName "kube-api-access-6p4n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.341814 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" (UID: "1db9b190-6dad-476a-9b3d-c6ba48c6bb3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.426232 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.426274 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.426290 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p4n9\" (UniqueName: \"kubernetes.io/projected/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d-kube-api-access-6p4n9\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.761062 4947 generic.go:334] "Generic (PLEG): container finished" podID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerID="5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e" exitCode=0 Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.761136 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2sfr" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.761186 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2sfr" event={"ID":"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d","Type":"ContainerDied","Data":"5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e"} Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.761225 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2sfr" event={"ID":"1db9b190-6dad-476a-9b3d-c6ba48c6bb3d","Type":"ContainerDied","Data":"3f48755f9070072d1dafe760df168e2d4c17e601a639318d92efa26146ce8d26"} Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.761255 4947 scope.go:117] "RemoveContainer" containerID="5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.781887 4947 scope.go:117] "RemoveContainer" containerID="3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.804958 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2sfr"] Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.812504 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2sfr"] Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.818953 4947 scope.go:117] "RemoveContainer" containerID="cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.847992 4947 scope.go:117] "RemoveContainer" containerID="5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e" Dec 03 08:50:12 crc kubenswrapper[4947]: E1203 08:50:12.848448 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e\": container with ID starting with 5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e not found: ID does not exist" containerID="5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.848485 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e"} err="failed to get container status \"5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e\": rpc error: code = NotFound desc = could not find container \"5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e\": container with ID starting with 5b737ee517368df0c75f740ff729bc0758f54ed0ee60a32ee17614db29482b0e not found: ID does not exist" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.848522 4947 scope.go:117] "RemoveContainer" containerID="3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb" Dec 03 08:50:12 crc kubenswrapper[4947]: E1203 08:50:12.848785 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb\": container with ID starting with 3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb not found: ID does not exist" containerID="3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.848805 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb"} err="failed to get container status \"3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb\": rpc error: code = NotFound desc = could not find container \"3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb\": container with ID 
starting with 3572521c3dae4a79790465e88ac1d43911059a8a5560cadd926db68cf5d3e6bb not found: ID does not exist" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.848816 4947 scope.go:117] "RemoveContainer" containerID="cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da" Dec 03 08:50:12 crc kubenswrapper[4947]: E1203 08:50:12.849203 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da\": container with ID starting with cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da not found: ID does not exist" containerID="cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da" Dec 03 08:50:12 crc kubenswrapper[4947]: I1203 08:50:12.849290 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da"} err="failed to get container status \"cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da\": rpc error: code = NotFound desc = could not find container \"cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da\": container with ID starting with cafe56a9a16dd8d0ed621a426a8e46823826ac0e735047dc41e2d7ed7047d7da not found: ID does not exist" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.092657 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" path="/var/lib/kubelet/pods/1db9b190-6dad-476a-9b3d-c6ba48c6bb3d/volumes" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.127503 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.170727 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.177141 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.238066 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc4gj\" (UniqueName: \"kubernetes.io/projected/85a03670-cea5-47a1-b778-890ce2083dd9-kube-api-access-dc4gj\") pod \"85a03670-cea5-47a1-b778-890ce2083dd9\" (UID: \"85a03670-cea5-47a1-b778-890ce2083dd9\") " Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.243501 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a03670-cea5-47a1-b778-890ce2083dd9-kube-api-access-dc4gj" (OuterVolumeSpecName: "kube-api-access-dc4gj") pod "85a03670-cea5-47a1-b778-890ce2083dd9" (UID: "85a03670-cea5-47a1-b778-890ce2083dd9"). InnerVolumeSpecName "kube-api-access-dc4gj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.340405 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc4gj\" (UniqueName: \"kubernetes.io/projected/85a03670-cea5-47a1-b778-890ce2083dd9-kube-api-access-dc4gj\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.733641 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 03 08:50:13 crc kubenswrapper[4947]: E1203 08:50:13.733940 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerName="extract-content" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.733952 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerName="extract-content" Dec 03 08:50:13 crc kubenswrapper[4947]: E1203 08:50:13.733968 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a03670-cea5-47a1-b778-890ce2083dd9" containerName="mariadb-client-2-default" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.733975 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a03670-cea5-47a1-b778-890ce2083dd9" containerName="mariadb-client-2-default" Dec 03 08:50:13 crc kubenswrapper[4947]: E1203 08:50:13.734001 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerName="registry-server" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.734008 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerName="registry-server" Dec 03 08:50:13 crc kubenswrapper[4947]: E1203 08:50:13.734019 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerName="extract-utilities" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.734210 4947 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerName="extract-utilities" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.734363 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db9b190-6dad-476a-9b3d-c6ba48c6bb3d" containerName="registry-server" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.734392 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a03670-cea5-47a1-b778-890ce2083dd9" containerName="mariadb-client-2-default" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.735025 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.746980 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9jhb\" (UniqueName: \"kubernetes.io/projected/2719f367-14a2-4c18-ac82-4c1cfc09b804-kube-api-access-k9jhb\") pod \"mariadb-client-1\" (UID: \"2719f367-14a2-4c18-ac82-4c1cfc09b804\") " pod="openstack/mariadb-client-1" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.755357 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.806448 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.806743 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4568664c838ef0daae802708f42e49f3b0b71a109049c0d41486508089a1021a" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.848119 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9jhb\" (UniqueName: \"kubernetes.io/projected/2719f367-14a2-4c18-ac82-4c1cfc09b804-kube-api-access-k9jhb\") pod \"mariadb-client-1\" (UID: \"2719f367-14a2-4c18-ac82-4c1cfc09b804\") " pod="openstack/mariadb-client-1" Dec 03 08:50:13 crc kubenswrapper[4947]: I1203 08:50:13.867925 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9jhb\" (UniqueName: \"kubernetes.io/projected/2719f367-14a2-4c18-ac82-4c1cfc09b804-kube-api-access-k9jhb\") pod \"mariadb-client-1\" (UID: \"2719f367-14a2-4c18-ac82-4c1cfc09b804\") " pod="openstack/mariadb-client-1" Dec 03 08:50:14 crc kubenswrapper[4947]: I1203 08:50:14.111655 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 03 08:50:14 crc kubenswrapper[4947]: I1203 08:50:14.465824 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 03 08:50:14 crc kubenswrapper[4947]: W1203 08:50:14.473835 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2719f367_14a2_4c18_ac82_4c1cfc09b804.slice/crio-f45f6165ab339181b5933f7e5877e7c6b5428fd1ac6e83cdd16ec2ffb7fb443c WatchSource:0}: Error finding container f45f6165ab339181b5933f7e5877e7c6b5428fd1ac6e83cdd16ec2ffb7fb443c: Status 404 returned error can't find the container with id f45f6165ab339181b5933f7e5877e7c6b5428fd1ac6e83cdd16ec2ffb7fb443c Dec 03 08:50:14 crc kubenswrapper[4947]: I1203 08:50:14.819898 4947 generic.go:334] "Generic (PLEG): container finished" podID="2719f367-14a2-4c18-ac82-4c1cfc09b804" containerID="a1d9ab3a7fe12226bc19d68aa5d32254cff1250ab5cbfabeb7e6625b4bda711f" exitCode=0 Dec 03 08:50:14 crc kubenswrapper[4947]: I1203 08:50:14.819986 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2719f367-14a2-4c18-ac82-4c1cfc09b804","Type":"ContainerDied","Data":"a1d9ab3a7fe12226bc19d68aa5d32254cff1250ab5cbfabeb7e6625b4bda711f"} Dec 03 08:50:14 crc kubenswrapper[4947]: I1203 08:50:14.820205 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"2719f367-14a2-4c18-ac82-4c1cfc09b804","Type":"ContainerStarted","Data":"f45f6165ab339181b5933f7e5877e7c6b5428fd1ac6e83cdd16ec2ffb7fb443c"} Dec 03 08:50:15 crc kubenswrapper[4947]: I1203 08:50:15.100891 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a03670-cea5-47a1-b778-890ce2083dd9" path="/var/lib/kubelet/pods/85a03670-cea5-47a1-b778-890ce2083dd9/volumes" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.331268 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.354416 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_2719f367-14a2-4c18-ac82-4c1cfc09b804/mariadb-client-1/0.log" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.384571 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.393021 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.502062 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9jhb\" (UniqueName: \"kubernetes.io/projected/2719f367-14a2-4c18-ac82-4c1cfc09b804-kube-api-access-k9jhb\") pod \"2719f367-14a2-4c18-ac82-4c1cfc09b804\" (UID: \"2719f367-14a2-4c18-ac82-4c1cfc09b804\") " Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.507617 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2719f367-14a2-4c18-ac82-4c1cfc09b804-kube-api-access-k9jhb" (OuterVolumeSpecName: "kube-api-access-k9jhb") pod "2719f367-14a2-4c18-ac82-4c1cfc09b804" (UID: "2719f367-14a2-4c18-ac82-4c1cfc09b804"). InnerVolumeSpecName "kube-api-access-k9jhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.605021 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9jhb\" (UniqueName: \"kubernetes.io/projected/2719f367-14a2-4c18-ac82-4c1cfc09b804-kube-api-access-k9jhb\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.839794 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f45f6165ab339181b5933f7e5877e7c6b5428fd1ac6e83cdd16ec2ffb7fb443c" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.839876 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.949341 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 08:50:16 crc kubenswrapper[4947]: E1203 08:50:16.950011 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2719f367-14a2-4c18-ac82-4c1cfc09b804" containerName="mariadb-client-1" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.950042 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2719f367-14a2-4c18-ac82-4c1cfc09b804" containerName="mariadb-client-1" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.950366 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2719f367-14a2-4c18-ac82-4c1cfc09b804" containerName="mariadb-client-1" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.951580 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.954166 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kk56l" Dec 03 08:50:16 crc kubenswrapper[4947]: I1203 08:50:16.959113 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 08:50:17 crc kubenswrapper[4947]: I1203 08:50:17.096423 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2719f367-14a2-4c18-ac82-4c1cfc09b804" path="/var/lib/kubelet/pods/2719f367-14a2-4c18-ac82-4c1cfc09b804/volumes" Dec 03 08:50:17 crc kubenswrapper[4947]: I1203 08:50:17.112842 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt8wv\" (UniqueName: \"kubernetes.io/projected/52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1-kube-api-access-nt8wv\") pod \"mariadb-client-4-default\" (UID: \"52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1\") " pod="openstack/mariadb-client-4-default" Dec 03 08:50:17 crc kubenswrapper[4947]: I1203 08:50:17.213969 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt8wv\" (UniqueName: \"kubernetes.io/projected/52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1-kube-api-access-nt8wv\") pod \"mariadb-client-4-default\" (UID: \"52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1\") " pod="openstack/mariadb-client-4-default" Dec 03 08:50:17 crc kubenswrapper[4947]: I1203 08:50:17.237160 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt8wv\" (UniqueName: \"kubernetes.io/projected/52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1-kube-api-access-nt8wv\") pod \"mariadb-client-4-default\" (UID: \"52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1\") " pod="openstack/mariadb-client-4-default" Dec 03 08:50:17 crc kubenswrapper[4947]: I1203 08:50:17.283163 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 03 08:50:17 crc kubenswrapper[4947]: I1203 08:50:17.809180 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 08:50:17 crc kubenswrapper[4947]: W1203 08:50:17.819283 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52afeb22_6e43_4fdd_bb4d_a8b1d2f581d1.slice/crio-743eee29a77b8f54e164d175f5e6970bfa685a57dd51e4e0e639843f84765bc1 WatchSource:0}: Error finding container 743eee29a77b8f54e164d175f5e6970bfa685a57dd51e4e0e639843f84765bc1: Status 404 returned error can't find the container with id 743eee29a77b8f54e164d175f5e6970bfa685a57dd51e4e0e639843f84765bc1 Dec 03 08:50:17 crc kubenswrapper[4947]: I1203 08:50:17.849510 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1","Type":"ContainerStarted","Data":"743eee29a77b8f54e164d175f5e6970bfa685a57dd51e4e0e639843f84765bc1"} Dec 03 08:50:18 crc kubenswrapper[4947]: I1203 08:50:18.862509 4947 generic.go:334] "Generic (PLEG): container finished" podID="52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1" containerID="12d5fb4d0d17324686bafe940ec60bc215ac462272bddcc18d29c67c939ddb8b" exitCode=0 Dec 03 08:50:18 crc kubenswrapper[4947]: I1203 08:50:18.862557 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1","Type":"ContainerDied","Data":"12d5fb4d0d17324686bafe940ec60bc215ac462272bddcc18d29c67c939ddb8b"} Dec 03 08:50:19 crc kubenswrapper[4947]: I1203 08:50:19.089912 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:50:19 crc kubenswrapper[4947]: E1203 08:50:19.090705 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.261235 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.281921 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1/mariadb-client-4-default/0.log" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.315306 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.320096 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.367768 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt8wv\" (UniqueName: \"kubernetes.io/projected/52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1-kube-api-access-nt8wv\") pod \"52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1\" (UID: \"52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1\") " Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.375290 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1-kube-api-access-nt8wv" (OuterVolumeSpecName: "kube-api-access-nt8wv") pod "52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1" (UID: "52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1"). InnerVolumeSpecName "kube-api-access-nt8wv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.469857 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt8wv\" (UniqueName: \"kubernetes.io/projected/52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1-kube-api-access-nt8wv\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.498780 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-cell1"] Dec 03 08:50:20 crc kubenswrapper[4947]: E1203 08:50:20.499310 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1" containerName="mariadb-client-4-default" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.499338 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1" containerName="mariadb-client-4-default" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.499609 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1" containerName="mariadb-client-4-default" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.500559 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-cell1" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.507425 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-cell1"] Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.671772 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6xzz\" (UniqueName: \"kubernetes.io/projected/7b3b6874-ffe9-41c6-a860-3700ffab61a6-kube-api-access-n6xzz\") pod \"mariadb-client-4-cell1\" (UID: \"7b3b6874-ffe9-41c6-a860-3700ffab61a6\") " pod="openstack/mariadb-client-4-cell1" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.773821 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6xzz\" (UniqueName: \"kubernetes.io/projected/7b3b6874-ffe9-41c6-a860-3700ffab61a6-kube-api-access-n6xzz\") pod \"mariadb-client-4-cell1\" (UID: \"7b3b6874-ffe9-41c6-a860-3700ffab61a6\") " pod="openstack/mariadb-client-4-cell1" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.792792 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6xzz\" (UniqueName: \"kubernetes.io/projected/7b3b6874-ffe9-41c6-a860-3700ffab61a6-kube-api-access-n6xzz\") pod \"mariadb-client-4-cell1\" (UID: \"7b3b6874-ffe9-41c6-a860-3700ffab61a6\") " pod="openstack/mariadb-client-4-cell1" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.827352 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-cell1" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.895556 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="743eee29a77b8f54e164d175f5e6970bfa685a57dd51e4e0e639843f84765bc1" Dec 03 08:50:20 crc kubenswrapper[4947]: I1203 08:50:20.895618 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 03 08:50:21 crc kubenswrapper[4947]: I1203 08:50:21.096339 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1" path="/var/lib/kubelet/pods/52afeb22-6e43-4fdd-bb4d-a8b1d2f581d1/volumes" Dec 03 08:50:21 crc kubenswrapper[4947]: I1203 08:50:21.200749 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-cell1"] Dec 03 08:50:21 crc kubenswrapper[4947]: I1203 08:50:21.907349 4947 generic.go:334] "Generic (PLEG): container finished" podID="7b3b6874-ffe9-41c6-a860-3700ffab61a6" containerID="d60a6b0f4fd2cacb027a1325e1b0d1867e9e1a1a7ebf9e7d677d970b1e79effc" exitCode=0 Dec 03 08:50:21 crc kubenswrapper[4947]: I1203 08:50:21.907450 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-cell1" event={"ID":"7b3b6874-ffe9-41c6-a860-3700ffab61a6","Type":"ContainerDied","Data":"d60a6b0f4fd2cacb027a1325e1b0d1867e9e1a1a7ebf9e7d677d970b1e79effc"} Dec 03 08:50:21 crc kubenswrapper[4947]: I1203 08:50:21.907797 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-cell1" event={"ID":"7b3b6874-ffe9-41c6-a860-3700ffab61a6","Type":"ContainerStarted","Data":"c3f95a6b410ef7ec260766f077e918cc3f8bc41ead2af254340f0910598ba653"} Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.280033 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-cell1" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.298905 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-cell1_7b3b6874-ffe9-41c6-a860-3700ffab61a6/mariadb-client-4-cell1/0.log" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.334549 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-cell1"] Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.344570 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-cell1"] Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.422675 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6xzz\" (UniqueName: \"kubernetes.io/projected/7b3b6874-ffe9-41c6-a860-3700ffab61a6-kube-api-access-n6xzz\") pod \"7b3b6874-ffe9-41c6-a860-3700ffab61a6\" (UID: \"7b3b6874-ffe9-41c6-a860-3700ffab61a6\") " Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.427799 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3b6874-ffe9-41c6-a860-3700ffab61a6-kube-api-access-n6xzz" (OuterVolumeSpecName: "kube-api-access-n6xzz") pod "7b3b6874-ffe9-41c6-a860-3700ffab61a6" (UID: "7b3b6874-ffe9-41c6-a860-3700ffab61a6"). InnerVolumeSpecName "kube-api-access-n6xzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.465923 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-cell2"] Dec 03 08:50:23 crc kubenswrapper[4947]: E1203 08:50:23.466601 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3b6874-ffe9-41c6-a860-3700ffab61a6" containerName="mariadb-client-4-cell1" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.466625 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3b6874-ffe9-41c6-a860-3700ffab61a6" containerName="mariadb-client-4-cell1" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.466871 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3b6874-ffe9-41c6-a860-3700ffab61a6" containerName="mariadb-client-4-cell1" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.467676 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-cell2" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.472217 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-cell2"] Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.524298 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6xzz\" (UniqueName: \"kubernetes.io/projected/7b3b6874-ffe9-41c6-a860-3700ffab61a6-kube-api-access-n6xzz\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.626155 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njzkr\" (UniqueName: \"kubernetes.io/projected/c9ac361b-f30c-46e5-a95b-3aa275841db7-kube-api-access-njzkr\") pod \"mariadb-client-4-cell2\" (UID: \"c9ac361b-f30c-46e5-a95b-3aa275841db7\") " pod="openstack/mariadb-client-4-cell2" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.727248 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-njzkr\" (UniqueName: \"kubernetes.io/projected/c9ac361b-f30c-46e5-a95b-3aa275841db7-kube-api-access-njzkr\") pod \"mariadb-client-4-cell2\" (UID: \"c9ac361b-f30c-46e5-a95b-3aa275841db7\") " pod="openstack/mariadb-client-4-cell2" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.756317 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njzkr\" (UniqueName: \"kubernetes.io/projected/c9ac361b-f30c-46e5-a95b-3aa275841db7-kube-api-access-njzkr\") pod \"mariadb-client-4-cell2\" (UID: \"c9ac361b-f30c-46e5-a95b-3aa275841db7\") " pod="openstack/mariadb-client-4-cell2" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.798822 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-cell2" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.925407 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3f95a6b410ef7ec260766f077e918cc3f8bc41ead2af254340f0910598ba653" Dec 03 08:50:23 crc kubenswrapper[4947]: I1203 08:50:23.925456 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-cell1" Dec 03 08:50:24 crc kubenswrapper[4947]: I1203 08:50:24.104369 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-cell2"] Dec 03 08:50:24 crc kubenswrapper[4947]: I1203 08:50:24.939544 4947 generic.go:334] "Generic (PLEG): container finished" podID="c9ac361b-f30c-46e5-a95b-3aa275841db7" containerID="201b2b6fe81ba45cfa6bf1529aa9b77dda9b34ad66898da6965f011ddd565f6f" exitCode=0 Dec 03 08:50:24 crc kubenswrapper[4947]: I1203 08:50:24.939648 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-cell2" event={"ID":"c9ac361b-f30c-46e5-a95b-3aa275841db7","Type":"ContainerDied","Data":"201b2b6fe81ba45cfa6bf1529aa9b77dda9b34ad66898da6965f011ddd565f6f"} Dec 03 08:50:24 crc kubenswrapper[4947]: I1203 08:50:24.939989 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-cell2" event={"ID":"c9ac361b-f30c-46e5-a95b-3aa275841db7","Type":"ContainerStarted","Data":"ae4fbcb259f4421642835c62d7276d7f38911a3a505803208ed127771c014e19"} Dec 03 08:50:25 crc kubenswrapper[4947]: I1203 08:50:25.099643 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3b6874-ffe9-41c6-a860-3700ffab61a6" path="/var/lib/kubelet/pods/7b3b6874-ffe9-41c6-a860-3700ffab61a6/volumes" Dec 03 08:50:26 crc kubenswrapper[4947]: I1203 08:50:26.402019 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-cell2" Dec 03 08:50:26 crc kubenswrapper[4947]: I1203 08:50:26.421702 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-cell2_c9ac361b-f30c-46e5-a95b-3aa275841db7/mariadb-client-4-cell2/0.log" Dec 03 08:50:26 crc kubenswrapper[4947]: I1203 08:50:26.456806 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-cell2"] Dec 03 08:50:26 crc kubenswrapper[4947]: I1203 08:50:26.463869 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-cell2"] Dec 03 08:50:26 crc kubenswrapper[4947]: I1203 08:50:26.573659 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njzkr\" (UniqueName: \"kubernetes.io/projected/c9ac361b-f30c-46e5-a95b-3aa275841db7-kube-api-access-njzkr\") pod \"c9ac361b-f30c-46e5-a95b-3aa275841db7\" (UID: \"c9ac361b-f30c-46e5-a95b-3aa275841db7\") " Dec 03 08:50:26 crc kubenswrapper[4947]: I1203 08:50:26.581960 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ac361b-f30c-46e5-a95b-3aa275841db7-kube-api-access-njzkr" (OuterVolumeSpecName: "kube-api-access-njzkr") pod "c9ac361b-f30c-46e5-a95b-3aa275841db7" (UID: "c9ac361b-f30c-46e5-a95b-3aa275841db7"). InnerVolumeSpecName "kube-api-access-njzkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:26 crc kubenswrapper[4947]: I1203 08:50:26.676476 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njzkr\" (UniqueName: \"kubernetes.io/projected/c9ac361b-f30c-46e5-a95b-3aa275841db7-kube-api-access-njzkr\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:26 crc kubenswrapper[4947]: I1203 08:50:26.969312 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae4fbcb259f4421642835c62d7276d7f38911a3a505803208ed127771c014e19" Dec 03 08:50:26 crc kubenswrapper[4947]: I1203 08:50:26.969410 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-cell2" Dec 03 08:50:27 crc kubenswrapper[4947]: I1203 08:50:27.095764 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ac361b-f30c-46e5-a95b-3aa275841db7" path="/var/lib/kubelet/pods/c9ac361b-f30c-46e5-a95b-3aa275841db7/volumes" Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.288027 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 08:50:30 crc kubenswrapper[4947]: E1203 08:50:30.288744 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ac361b-f30c-46e5-a95b-3aa275841db7" containerName="mariadb-client-4-cell2" Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.288763 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ac361b-f30c-46e5-a95b-3aa275841db7" containerName="mariadb-client-4-cell2" Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.288956 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ac361b-f30c-46e5-a95b-3aa275841db7" containerName="mariadb-client-4-cell2" Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.289626 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.293211 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kk56l" Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.311414 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.349564 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh4jk\" (UniqueName: \"kubernetes.io/projected/c744d093-0bd8-4928-8d1b-55ebe18f2a93-kube-api-access-vh4jk\") pod \"mariadb-client-5-default\" (UID: \"c744d093-0bd8-4928-8d1b-55ebe18f2a93\") " pod="openstack/mariadb-client-5-default" Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.451284 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh4jk\" (UniqueName: \"kubernetes.io/projected/c744d093-0bd8-4928-8d1b-55ebe18f2a93-kube-api-access-vh4jk\") pod \"mariadb-client-5-default\" (UID: \"c744d093-0bd8-4928-8d1b-55ebe18f2a93\") " pod="openstack/mariadb-client-5-default" Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.474941 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh4jk\" (UniqueName: \"kubernetes.io/projected/c744d093-0bd8-4928-8d1b-55ebe18f2a93-kube-api-access-vh4jk\") pod \"mariadb-client-5-default\" (UID: \"c744d093-0bd8-4928-8d1b-55ebe18f2a93\") " pod="openstack/mariadb-client-5-default" Dec 03 08:50:30 crc kubenswrapper[4947]: I1203 08:50:30.646065 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 03 08:50:31 crc kubenswrapper[4947]: I1203 08:50:31.228615 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 08:50:32 crc kubenswrapper[4947]: I1203 08:50:32.021437 4947 generic.go:334] "Generic (PLEG): container finished" podID="c744d093-0bd8-4928-8d1b-55ebe18f2a93" containerID="31805a99d09b86ab643aec11b3d97303a8dc8de42576ea91cc5a89ae12e463f9" exitCode=0 Dec 03 08:50:32 crc kubenswrapper[4947]: I1203 08:50:32.021480 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"c744d093-0bd8-4928-8d1b-55ebe18f2a93","Type":"ContainerDied","Data":"31805a99d09b86ab643aec11b3d97303a8dc8de42576ea91cc5a89ae12e463f9"} Dec 03 08:50:32 crc kubenswrapper[4947]: I1203 08:50:32.021523 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"c744d093-0bd8-4928-8d1b-55ebe18f2a93","Type":"ContainerStarted","Data":"f9dc6fb57a41b24cafd26a12d9e2308530be3b87467b625c070965728c29cc58"} Dec 03 08:50:32 crc kubenswrapper[4947]: I1203 08:50:32.082420 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:50:32 crc kubenswrapper[4947]: E1203 08:50:32.082661 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.479278 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.502034 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_c744d093-0bd8-4928-8d1b-55ebe18f2a93/mariadb-client-5-default/0.log" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.511908 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh4jk\" (UniqueName: \"kubernetes.io/projected/c744d093-0bd8-4928-8d1b-55ebe18f2a93-kube-api-access-vh4jk\") pod \"c744d093-0bd8-4928-8d1b-55ebe18f2a93\" (UID: \"c744d093-0bd8-4928-8d1b-55ebe18f2a93\") " Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.518965 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c744d093-0bd8-4928-8d1b-55ebe18f2a93-kube-api-access-vh4jk" (OuterVolumeSpecName: "kube-api-access-vh4jk") pod "c744d093-0bd8-4928-8d1b-55ebe18f2a93" (UID: "c744d093-0bd8-4928-8d1b-55ebe18f2a93"). InnerVolumeSpecName "kube-api-access-vh4jk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.537711 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.550096 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.614346 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh4jk\" (UniqueName: \"kubernetes.io/projected/c744d093-0bd8-4928-8d1b-55ebe18f2a93-kube-api-access-vh4jk\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.728423 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 08:50:33 crc kubenswrapper[4947]: E1203 08:50:33.729115 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c744d093-0bd8-4928-8d1b-55ebe18f2a93" containerName="mariadb-client-5-default" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.729162 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c744d093-0bd8-4928-8d1b-55ebe18f2a93" containerName="mariadb-client-5-default" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.729642 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c744d093-0bd8-4928-8d1b-55ebe18f2a93" containerName="mariadb-client-5-default" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.730829 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.738659 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.817142 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gpqg\" (UniqueName: \"kubernetes.io/projected/a3b04197-babc-4447-8a1a-003388202251-kube-api-access-4gpqg\") pod \"mariadb-client-6-default\" (UID: \"a3b04197-babc-4447-8a1a-003388202251\") " pod="openstack/mariadb-client-6-default" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.919093 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gpqg\" (UniqueName: \"kubernetes.io/projected/a3b04197-babc-4447-8a1a-003388202251-kube-api-access-4gpqg\") pod \"mariadb-client-6-default\" (UID: \"a3b04197-babc-4447-8a1a-003388202251\") " pod="openstack/mariadb-client-6-default" Dec 03 08:50:33 crc kubenswrapper[4947]: I1203 08:50:33.945349 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gpqg\" (UniqueName: \"kubernetes.io/projected/a3b04197-babc-4447-8a1a-003388202251-kube-api-access-4gpqg\") pod \"mariadb-client-6-default\" (UID: \"a3b04197-babc-4447-8a1a-003388202251\") " pod="openstack/mariadb-client-6-default" Dec 03 08:50:34 crc kubenswrapper[4947]: I1203 08:50:34.048426 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9dc6fb57a41b24cafd26a12d9e2308530be3b87467b625c070965728c29cc58" Dec 03 08:50:34 crc kubenswrapper[4947]: I1203 08:50:34.048577 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 03 08:50:34 crc kubenswrapper[4947]: I1203 08:50:34.055984 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 03 08:50:34 crc kubenswrapper[4947]: I1203 08:50:34.383056 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 08:50:34 crc kubenswrapper[4947]: W1203 08:50:34.387616 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3b04197_babc_4447_8a1a_003388202251.slice/crio-4d88f0aafd8b7c6f3f99f61b3d18ba81c3c1b554ca8ae01a2c0857d77b5ed8c6 WatchSource:0}: Error finding container 4d88f0aafd8b7c6f3f99f61b3d18ba81c3c1b554ca8ae01a2c0857d77b5ed8c6: Status 404 returned error can't find the container with id 4d88f0aafd8b7c6f3f99f61b3d18ba81c3c1b554ca8ae01a2c0857d77b5ed8c6 Dec 03 08:50:35 crc kubenswrapper[4947]: I1203 08:50:35.059812 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"a3b04197-babc-4447-8a1a-003388202251","Type":"ContainerStarted","Data":"97583157f998f22e0d10629240546ffdbb85fb8b99dfb94ecbd8bd7b3d8ad86c"} Dec 03 08:50:35 crc kubenswrapper[4947]: I1203 08:50:35.060180 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"a3b04197-babc-4447-8a1a-003388202251","Type":"ContainerStarted","Data":"4d88f0aafd8b7c6f3f99f61b3d18ba81c3c1b554ca8ae01a2c0857d77b5ed8c6"} Dec 03 08:50:35 crc kubenswrapper[4947]: I1203 08:50:35.084820 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=2.084786284 podStartE2EDuration="2.084786284s" podCreationTimestamp="2025-12-03 08:50:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:50:35.073402597 +0000 UTC m=+7296.334357023" watchObservedRunningTime="2025-12-03 08:50:35.084786284 +0000 UTC m=+7296.345740710" Dec 03 08:50:35 crc kubenswrapper[4947]: 
I1203 08:50:35.098829 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c744d093-0bd8-4928-8d1b-55ebe18f2a93" path="/var/lib/kubelet/pods/c744d093-0bd8-4928-8d1b-55ebe18f2a93/volumes" Dec 03 08:50:35 crc kubenswrapper[4947]: I1203 08:50:35.122446 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-default_a3b04197-babc-4447-8a1a-003388202251/mariadb-client-6-default/0.log" Dec 03 08:50:36 crc kubenswrapper[4947]: I1203 08:50:36.069894 4947 generic.go:334] "Generic (PLEG): container finished" podID="a3b04197-babc-4447-8a1a-003388202251" containerID="97583157f998f22e0d10629240546ffdbb85fb8b99dfb94ecbd8bd7b3d8ad86c" exitCode=1 Dec 03 08:50:36 crc kubenswrapper[4947]: I1203 08:50:36.069984 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"a3b04197-babc-4447-8a1a-003388202251","Type":"ContainerDied","Data":"97583157f998f22e0d10629240546ffdbb85fb8b99dfb94ecbd8bd7b3d8ad86c"} Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.503674 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.553752 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.563696 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.579274 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gpqg\" (UniqueName: \"kubernetes.io/projected/a3b04197-babc-4447-8a1a-003388202251-kube-api-access-4gpqg\") pod \"a3b04197-babc-4447-8a1a-003388202251\" (UID: \"a3b04197-babc-4447-8a1a-003388202251\") " Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.585761 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b04197-babc-4447-8a1a-003388202251-kube-api-access-4gpqg" (OuterVolumeSpecName: "kube-api-access-4gpqg") pod "a3b04197-babc-4447-8a1a-003388202251" (UID: "a3b04197-babc-4447-8a1a-003388202251"). InnerVolumeSpecName "kube-api-access-4gpqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.680616 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gpqg\" (UniqueName: \"kubernetes.io/projected/a3b04197-babc-4447-8a1a-003388202251-kube-api-access-4gpqg\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.692640 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 08:50:37 crc kubenswrapper[4947]: E1203 08:50:37.693005 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b04197-babc-4447-8a1a-003388202251" containerName="mariadb-client-6-default" Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.693027 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b04197-babc-4447-8a1a-003388202251" containerName="mariadb-client-6-default" Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.693243 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b04197-babc-4447-8a1a-003388202251" containerName="mariadb-client-6-default" Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.693886 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.700389 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.884216 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cv52\" (UniqueName: \"kubernetes.io/projected/88bca2d5-ae97-4e3e-8610-105847aad4a9-kube-api-access-4cv52\") pod \"mariadb-client-7-default\" (UID: \"88bca2d5-ae97-4e3e-8610-105847aad4a9\") " pod="openstack/mariadb-client-7-default" Dec 03 08:50:37 crc kubenswrapper[4947]: I1203 08:50:37.986268 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cv52\" (UniqueName: \"kubernetes.io/projected/88bca2d5-ae97-4e3e-8610-105847aad4a9-kube-api-access-4cv52\") pod \"mariadb-client-7-default\" (UID: \"88bca2d5-ae97-4e3e-8610-105847aad4a9\") " pod="openstack/mariadb-client-7-default" Dec 03 08:50:38 crc kubenswrapper[4947]: I1203 08:50:38.008654 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cv52\" (UniqueName: \"kubernetes.io/projected/88bca2d5-ae97-4e3e-8610-105847aad4a9-kube-api-access-4cv52\") pod \"mariadb-client-7-default\" (UID: \"88bca2d5-ae97-4e3e-8610-105847aad4a9\") " pod="openstack/mariadb-client-7-default" Dec 03 08:50:38 crc kubenswrapper[4947]: I1203 08:50:38.054392 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 03 08:50:38 crc kubenswrapper[4947]: I1203 08:50:38.092808 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d88f0aafd8b7c6f3f99f61b3d18ba81c3c1b554ca8ae01a2c0857d77b5ed8c6" Dec 03 08:50:38 crc kubenswrapper[4947]: I1203 08:50:38.093063 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 03 08:50:38 crc kubenswrapper[4947]: I1203 08:50:38.546240 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 08:50:39 crc kubenswrapper[4947]: I1203 08:50:39.098913 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b04197-babc-4447-8a1a-003388202251" path="/var/lib/kubelet/pods/a3b04197-babc-4447-8a1a-003388202251/volumes" Dec 03 08:50:39 crc kubenswrapper[4947]: I1203 08:50:39.108055 4947 generic.go:334] "Generic (PLEG): container finished" podID="88bca2d5-ae97-4e3e-8610-105847aad4a9" containerID="5eca3217def3ecd5b2d75fd1dc435a56499c15bda38682c73b3ba2c7de295d51" exitCode=0 Dec 03 08:50:39 crc kubenswrapper[4947]: I1203 08:50:39.108090 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"88bca2d5-ae97-4e3e-8610-105847aad4a9","Type":"ContainerDied","Data":"5eca3217def3ecd5b2d75fd1dc435a56499c15bda38682c73b3ba2c7de295d51"} Dec 03 08:50:39 crc kubenswrapper[4947]: I1203 08:50:39.108112 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"88bca2d5-ae97-4e3e-8610-105847aad4a9","Type":"ContainerStarted","Data":"34d01655424d6c87ff8f3b33858f729d66ba5af6fceb5dc0d4d04e6c3266da28"} Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.590818 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.611747 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_88bca2d5-ae97-4e3e-8610-105847aad4a9/mariadb-client-7-default/0.log" Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.635412 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.652784 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.736472 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cv52\" (UniqueName: \"kubernetes.io/projected/88bca2d5-ae97-4e3e-8610-105847aad4a9-kube-api-access-4cv52\") pod \"88bca2d5-ae97-4e3e-8610-105847aad4a9\" (UID: \"88bca2d5-ae97-4e3e-8610-105847aad4a9\") " Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.745325 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88bca2d5-ae97-4e3e-8610-105847aad4a9-kube-api-access-4cv52" (OuterVolumeSpecName: "kube-api-access-4cv52") pod "88bca2d5-ae97-4e3e-8610-105847aad4a9" (UID: "88bca2d5-ae97-4e3e-8610-105847aad4a9"). InnerVolumeSpecName "kube-api-access-4cv52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.839102 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cv52\" (UniqueName: \"kubernetes.io/projected/88bca2d5-ae97-4e3e-8610-105847aad4a9-kube-api-access-4cv52\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.841123 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:40 crc kubenswrapper[4947]: E1203 08:50:40.841940 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88bca2d5-ae97-4e3e-8610-105847aad4a9" containerName="mariadb-client-7-default" Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.842111 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="88bca2d5-ae97-4e3e-8610-105847aad4a9" containerName="mariadb-client-7-default" Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.842595 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="88bca2d5-ae97-4e3e-8610-105847aad4a9" containerName="mariadb-client-7-default" Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.843616 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:50:40 crc kubenswrapper[4947]: I1203 08:50:40.874240 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:41 crc kubenswrapper[4947]: I1203 08:50:41.041693 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq9jb\" (UniqueName: \"kubernetes.io/projected/1db8ec1b-82a2-4328-92dd-e0c47af016f8-kube-api-access-dq9jb\") pod \"mariadb-client-2\" (UID: \"1db8ec1b-82a2-4328-92dd-e0c47af016f8\") " pod="openstack/mariadb-client-2" Dec 03 08:50:41 crc kubenswrapper[4947]: I1203 08:50:41.096528 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88bca2d5-ae97-4e3e-8610-105847aad4a9" path="/var/lib/kubelet/pods/88bca2d5-ae97-4e3e-8610-105847aad4a9/volumes" Dec 03 08:50:41 crc kubenswrapper[4947]: I1203 08:50:41.126050 4947 scope.go:117] "RemoveContainer" containerID="5eca3217def3ecd5b2d75fd1dc435a56499c15bda38682c73b3ba2c7de295d51" Dec 03 08:50:41 crc kubenswrapper[4947]: I1203 08:50:41.126125 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 03 08:50:41 crc kubenswrapper[4947]: I1203 08:50:41.144126 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq9jb\" (UniqueName: \"kubernetes.io/projected/1db8ec1b-82a2-4328-92dd-e0c47af016f8-kube-api-access-dq9jb\") pod \"mariadb-client-2\" (UID: \"1db8ec1b-82a2-4328-92dd-e0c47af016f8\") " pod="openstack/mariadb-client-2" Dec 03 08:50:41 crc kubenswrapper[4947]: I1203 08:50:41.162662 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq9jb\" (UniqueName: \"kubernetes.io/projected/1db8ec1b-82a2-4328-92dd-e0c47af016f8-kube-api-access-dq9jb\") pod \"mariadb-client-2\" (UID: \"1db8ec1b-82a2-4328-92dd-e0c47af016f8\") " pod="openstack/mariadb-client-2" Dec 03 08:50:41 crc kubenswrapper[4947]: I1203 08:50:41.188434 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:50:41 crc kubenswrapper[4947]: I1203 08:50:41.698008 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:41 crc kubenswrapper[4947]: W1203 08:50:41.701679 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1db8ec1b_82a2_4328_92dd_e0c47af016f8.slice/crio-144bdfeea71c96a76ac8536f2e5ac5f64ed1546aa478fc186877fb64a0743527 WatchSource:0}: Error finding container 144bdfeea71c96a76ac8536f2e5ac5f64ed1546aa478fc186877fb64a0743527: Status 404 returned error can't find the container with id 144bdfeea71c96a76ac8536f2e5ac5f64ed1546aa478fc186877fb64a0743527 Dec 03 08:50:42 crc kubenswrapper[4947]: I1203 08:50:42.138449 4947 generic.go:334] "Generic (PLEG): container finished" podID="1db8ec1b-82a2-4328-92dd-e0c47af016f8" containerID="c92325f730a67ad15cfb8942610456d0455ee59290aa6358571ac1873b9b7c5b" exitCode=0 Dec 03 08:50:42 crc kubenswrapper[4947]: I1203 08:50:42.138547 
4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"1db8ec1b-82a2-4328-92dd-e0c47af016f8","Type":"ContainerDied","Data":"c92325f730a67ad15cfb8942610456d0455ee59290aa6358571ac1873b9b7c5b"} Dec 03 08:50:42 crc kubenswrapper[4947]: I1203 08:50:42.138577 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"1db8ec1b-82a2-4328-92dd-e0c47af016f8","Type":"ContainerStarted","Data":"144bdfeea71c96a76ac8536f2e5ac5f64ed1546aa478fc186877fb64a0743527"} Dec 03 08:50:43 crc kubenswrapper[4947]: I1203 08:50:43.528194 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:50:43 crc kubenswrapper[4947]: I1203 08:50:43.549355 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_1db8ec1b-82a2-4328-92dd-e0c47af016f8/mariadb-client-2/0.log" Dec 03 08:50:43 crc kubenswrapper[4947]: I1203 08:50:43.577199 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:43 crc kubenswrapper[4947]: I1203 08:50:43.582388 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:43 crc kubenswrapper[4947]: I1203 08:50:43.684190 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq9jb\" (UniqueName: \"kubernetes.io/projected/1db8ec1b-82a2-4328-92dd-e0c47af016f8-kube-api-access-dq9jb\") pod \"1db8ec1b-82a2-4328-92dd-e0c47af016f8\" (UID: \"1db8ec1b-82a2-4328-92dd-e0c47af016f8\") " Dec 03 08:50:43 crc kubenswrapper[4947]: I1203 08:50:43.692748 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1db8ec1b-82a2-4328-92dd-e0c47af016f8-kube-api-access-dq9jb" (OuterVolumeSpecName: "kube-api-access-dq9jb") pod "1db8ec1b-82a2-4328-92dd-e0c47af016f8" (UID: "1db8ec1b-82a2-4328-92dd-e0c47af016f8"). 
InnerVolumeSpecName "kube-api-access-dq9jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:43 crc kubenswrapper[4947]: I1203 08:50:43.787566 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq9jb\" (UniqueName: \"kubernetes.io/projected/1db8ec1b-82a2-4328-92dd-e0c47af016f8-kube-api-access-dq9jb\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:44 crc kubenswrapper[4947]: I1203 08:50:44.083402 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:50:44 crc kubenswrapper[4947]: E1203 08:50:44.083934 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:50:44 crc kubenswrapper[4947]: I1203 08:50:44.166438 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="144bdfeea71c96a76ac8536f2e5ac5f64ed1546aa478fc186877fb64a0743527" Dec 03 08:50:44 crc kubenswrapper[4947]: I1203 08:50:44.166596 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:50:45 crc kubenswrapper[4947]: I1203 08:50:45.099904 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1db8ec1b-82a2-4328-92dd-e0c47af016f8" path="/var/lib/kubelet/pods/1db8ec1b-82a2-4328-92dd-e0c47af016f8/volumes" Dec 03 08:50:46 crc kubenswrapper[4947]: I1203 08:50:46.731883 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-cell1"] Dec 03 08:50:46 crc kubenswrapper[4947]: E1203 08:50:46.733808 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1db8ec1b-82a2-4328-92dd-e0c47af016f8" containerName="mariadb-client-2" Dec 03 08:50:46 crc kubenswrapper[4947]: I1203 08:50:46.734028 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1db8ec1b-82a2-4328-92dd-e0c47af016f8" containerName="mariadb-client-2" Dec 03 08:50:46 crc kubenswrapper[4947]: I1203 08:50:46.734619 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1db8ec1b-82a2-4328-92dd-e0c47af016f8" containerName="mariadb-client-2" Dec 03 08:50:46 crc kubenswrapper[4947]: I1203 08:50:46.736006 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-cell1" Dec 03 08:50:46 crc kubenswrapper[4947]: I1203 08:50:46.741620 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-cell1"] Dec 03 08:50:46 crc kubenswrapper[4947]: I1203 08:50:46.743675 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tksfr\" (UniqueName: \"kubernetes.io/projected/cba1ca3e-2b0e-4a26-ac14-181d6836dbf1-kube-api-access-tksfr\") pod \"mariadb-client-5-cell1\" (UID: \"cba1ca3e-2b0e-4a26-ac14-181d6836dbf1\") " pod="openstack/mariadb-client-5-cell1" Dec 03 08:50:46 crc kubenswrapper[4947]: I1203 08:50:46.745578 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kk56l" Dec 03 08:50:46 crc kubenswrapper[4947]: I1203 08:50:46.845339 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tksfr\" (UniqueName: \"kubernetes.io/projected/cba1ca3e-2b0e-4a26-ac14-181d6836dbf1-kube-api-access-tksfr\") pod \"mariadb-client-5-cell1\" (UID: \"cba1ca3e-2b0e-4a26-ac14-181d6836dbf1\") " pod="openstack/mariadb-client-5-cell1" Dec 03 08:50:46 crc kubenswrapper[4947]: I1203 08:50:46.868912 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tksfr\" (UniqueName: \"kubernetes.io/projected/cba1ca3e-2b0e-4a26-ac14-181d6836dbf1-kube-api-access-tksfr\") pod \"mariadb-client-5-cell1\" (UID: \"cba1ca3e-2b0e-4a26-ac14-181d6836dbf1\") " pod="openstack/mariadb-client-5-cell1" Dec 03 08:50:47 crc kubenswrapper[4947]: I1203 08:50:47.067110 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-cell1" Dec 03 08:50:47 crc kubenswrapper[4947]: I1203 08:50:47.328766 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-cell1"] Dec 03 08:50:48 crc kubenswrapper[4947]: I1203 08:50:48.226556 4947 generic.go:334] "Generic (PLEG): container finished" podID="cba1ca3e-2b0e-4a26-ac14-181d6836dbf1" containerID="72286d5fc3da6a7cee0d17a487c2de3f827439254bf27a02dc8fb07e908d5b04" exitCode=0 Dec 03 08:50:48 crc kubenswrapper[4947]: I1203 08:50:48.226620 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-cell1" event={"ID":"cba1ca3e-2b0e-4a26-ac14-181d6836dbf1","Type":"ContainerDied","Data":"72286d5fc3da6a7cee0d17a487c2de3f827439254bf27a02dc8fb07e908d5b04"} Dec 03 08:50:48 crc kubenswrapper[4947]: I1203 08:50:48.226891 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-cell1" event={"ID":"cba1ca3e-2b0e-4a26-ac14-181d6836dbf1","Type":"ContainerStarted","Data":"099c3ed2eab6fb151b6341613490bbc7a462d25afc9427bb47bb41573a480263"} Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.618005 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-cell1" Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.637189 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-cell1_cba1ca3e-2b0e-4a26-ac14-181d6836dbf1/mariadb-client-5-cell1/0.log" Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.683293 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-cell1"] Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.690845 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-cell1"] Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.788327 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tksfr\" (UniqueName: \"kubernetes.io/projected/cba1ca3e-2b0e-4a26-ac14-181d6836dbf1-kube-api-access-tksfr\") pod \"cba1ca3e-2b0e-4a26-ac14-181d6836dbf1\" (UID: \"cba1ca3e-2b0e-4a26-ac14-181d6836dbf1\") " Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.798620 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba1ca3e-2b0e-4a26-ac14-181d6836dbf1-kube-api-access-tksfr" (OuterVolumeSpecName: "kube-api-access-tksfr") pod "cba1ca3e-2b0e-4a26-ac14-181d6836dbf1" (UID: "cba1ca3e-2b0e-4a26-ac14-181d6836dbf1"). InnerVolumeSpecName "kube-api-access-tksfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.849916 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-cell1"] Dec 03 08:50:49 crc kubenswrapper[4947]: E1203 08:50:49.850221 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba1ca3e-2b0e-4a26-ac14-181d6836dbf1" containerName="mariadb-client-5-cell1" Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.850233 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba1ca3e-2b0e-4a26-ac14-181d6836dbf1" containerName="mariadb-client-5-cell1" Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.850383 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba1ca3e-2b0e-4a26-ac14-181d6836dbf1" containerName="mariadb-client-5-cell1" Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.850916 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-cell1" Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.869027 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-cell1"] Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.889851 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tksfr\" (UniqueName: \"kubernetes.io/projected/cba1ca3e-2b0e-4a26-ac14-181d6836dbf1-kube-api-access-tksfr\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:49 crc kubenswrapper[4947]: I1203 08:50:49.991392 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vch2z\" (UniqueName: \"kubernetes.io/projected/222840a5-da5f-4559-8885-d2e99ff40a7c-kube-api-access-vch2z\") pod \"mariadb-client-6-cell1\" (UID: \"222840a5-da5f-4559-8885-d2e99ff40a7c\") " pod="openstack/mariadb-client-6-cell1" Dec 03 08:50:50 crc kubenswrapper[4947]: I1203 08:50:50.093097 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-vch2z\" (UniqueName: \"kubernetes.io/projected/222840a5-da5f-4559-8885-d2e99ff40a7c-kube-api-access-vch2z\") pod \"mariadb-client-6-cell1\" (UID: \"222840a5-da5f-4559-8885-d2e99ff40a7c\") " pod="openstack/mariadb-client-6-cell1" Dec 03 08:50:50 crc kubenswrapper[4947]: I1203 08:50:50.110098 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vch2z\" (UniqueName: \"kubernetes.io/projected/222840a5-da5f-4559-8885-d2e99ff40a7c-kube-api-access-vch2z\") pod \"mariadb-client-6-cell1\" (UID: \"222840a5-da5f-4559-8885-d2e99ff40a7c\") " pod="openstack/mariadb-client-6-cell1" Dec 03 08:50:50 crc kubenswrapper[4947]: I1203 08:50:50.183031 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-cell1" Dec 03 08:50:50 crc kubenswrapper[4947]: I1203 08:50:50.249999 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="099c3ed2eab6fb151b6341613490bbc7a462d25afc9427bb47bb41573a480263" Dec 03 08:50:50 crc kubenswrapper[4947]: I1203 08:50:50.250074 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-cell1" Dec 03 08:50:50 crc kubenswrapper[4947]: I1203 08:50:50.671221 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-cell1"] Dec 03 08:50:51 crc kubenswrapper[4947]: I1203 08:50:51.099122 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba1ca3e-2b0e-4a26-ac14-181d6836dbf1" path="/var/lib/kubelet/pods/cba1ca3e-2b0e-4a26-ac14-181d6836dbf1/volumes" Dec 03 08:50:51 crc kubenswrapper[4947]: I1203 08:50:51.269406 4947 generic.go:334] "Generic (PLEG): container finished" podID="222840a5-da5f-4559-8885-d2e99ff40a7c" containerID="98f1bd90d8ed5873a807b62265ffa3d85a6f53701298b8a4b5a14d55ab6251af" exitCode=1 Dec 03 08:50:51 crc kubenswrapper[4947]: I1203 08:50:51.269448 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-cell1" event={"ID":"222840a5-da5f-4559-8885-d2e99ff40a7c","Type":"ContainerDied","Data":"98f1bd90d8ed5873a807b62265ffa3d85a6f53701298b8a4b5a14d55ab6251af"} Dec 03 08:50:51 crc kubenswrapper[4947]: I1203 08:50:51.269910 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-cell1" event={"ID":"222840a5-da5f-4559-8885-d2e99ff40a7c","Type":"ContainerStarted","Data":"86a686b7b67901496656eb227adcf739955cba1c5d885eede536d543167e13d4"} Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.716133 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-cell1" Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.739419 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-cell1_222840a5-da5f-4559-8885-d2e99ff40a7c/mariadb-client-6-cell1/0.log" Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.777214 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-cell1"] Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.785141 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-cell1"] Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.840139 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vch2z\" (UniqueName: \"kubernetes.io/projected/222840a5-da5f-4559-8885-d2e99ff40a7c-kube-api-access-vch2z\") pod \"222840a5-da5f-4559-8885-d2e99ff40a7c\" (UID: \"222840a5-da5f-4559-8885-d2e99ff40a7c\") " Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.848796 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222840a5-da5f-4559-8885-d2e99ff40a7c-kube-api-access-vch2z" (OuterVolumeSpecName: "kube-api-access-vch2z") pod "222840a5-da5f-4559-8885-d2e99ff40a7c" (UID: "222840a5-da5f-4559-8885-d2e99ff40a7c"). InnerVolumeSpecName "kube-api-access-vch2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.925365 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-cell1"] Dec 03 08:50:52 crc kubenswrapper[4947]: E1203 08:50:52.925790 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222840a5-da5f-4559-8885-d2e99ff40a7c" containerName="mariadb-client-6-cell1" Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.925808 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="222840a5-da5f-4559-8885-d2e99ff40a7c" containerName="mariadb-client-6-cell1" Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.926009 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="222840a5-da5f-4559-8885-d2e99ff40a7c" containerName="mariadb-client-6-cell1" Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.926741 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-cell1" Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.933672 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-cell1"] Dec 03 08:50:52 crc kubenswrapper[4947]: I1203 08:50:52.950310 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vch2z\" (UniqueName: \"kubernetes.io/projected/222840a5-da5f-4559-8885-d2e99ff40a7c-kube-api-access-vch2z\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:53 crc kubenswrapper[4947]: I1203 08:50:53.052131 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54cqb\" (UniqueName: \"kubernetes.io/projected/f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6-kube-api-access-54cqb\") pod \"mariadb-client-7-cell1\" (UID: \"f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6\") " pod="openstack/mariadb-client-7-cell1" Dec 03 08:50:53 crc kubenswrapper[4947]: I1203 08:50:53.092301 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="222840a5-da5f-4559-8885-d2e99ff40a7c" path="/var/lib/kubelet/pods/222840a5-da5f-4559-8885-d2e99ff40a7c/volumes" Dec 03 08:50:53 crc kubenswrapper[4947]: I1203 08:50:53.154286 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54cqb\" (UniqueName: \"kubernetes.io/projected/f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6-kube-api-access-54cqb\") pod \"mariadb-client-7-cell1\" (UID: \"f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6\") " pod="openstack/mariadb-client-7-cell1" Dec 03 08:50:53 crc kubenswrapper[4947]: I1203 08:50:53.171270 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54cqb\" (UniqueName: \"kubernetes.io/projected/f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6-kube-api-access-54cqb\") pod \"mariadb-client-7-cell1\" (UID: \"f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6\") " pod="openstack/mariadb-client-7-cell1" Dec 03 08:50:53 crc kubenswrapper[4947]: I1203 08:50:53.251558 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-cell1" Dec 03 08:50:53 crc kubenswrapper[4947]: I1203 08:50:53.290705 4947 scope.go:117] "RemoveContainer" containerID="98f1bd90d8ed5873a807b62265ffa3d85a6f53701298b8a4b5a14d55ab6251af" Dec 03 08:50:53 crc kubenswrapper[4947]: I1203 08:50:53.290717 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-cell1" Dec 03 08:50:53 crc kubenswrapper[4947]: I1203 08:50:53.767094 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-cell1"] Dec 03 08:50:53 crc kubenswrapper[4947]: W1203 08:50:53.771398 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0c3ba3e_76db_4846_9b60_4d6b1ca02ec6.slice/crio-f0bac98b00282437e0d2de5466854f835e28c09eedb3181363d484d1636273bc WatchSource:0}: Error finding container f0bac98b00282437e0d2de5466854f835e28c09eedb3181363d484d1636273bc: Status 404 returned error can't find the container with id f0bac98b00282437e0d2de5466854f835e28c09eedb3181363d484d1636273bc Dec 03 08:50:54 crc kubenswrapper[4947]: I1203 08:50:54.301653 4947 generic.go:334] "Generic (PLEG): container finished" podID="f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6" containerID="d72547e13fb41480acf1a76e6ab15b941d2700c0ece24141730f4a84e01ea235" exitCode=0 Dec 03 08:50:54 crc kubenswrapper[4947]: I1203 08:50:54.301693 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-cell1" event={"ID":"f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6","Type":"ContainerDied","Data":"d72547e13fb41480acf1a76e6ab15b941d2700c0ece24141730f4a84e01ea235"} Dec 03 08:50:54 crc kubenswrapper[4947]: I1203 08:50:54.301719 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-cell1" event={"ID":"f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6","Type":"ContainerStarted","Data":"f0bac98b00282437e0d2de5466854f835e28c09eedb3181363d484d1636273bc"} Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.733814 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-cell1" Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.756411 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-cell1_f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6/mariadb-client-7-cell1/0.log" Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.791383 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-cell1"] Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.799628 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-cell1"] Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.897090 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54cqb\" (UniqueName: \"kubernetes.io/projected/f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6-kube-api-access-54cqb\") pod \"f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6\" (UID: \"f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6\") " Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.903766 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6-kube-api-access-54cqb" (OuterVolumeSpecName: "kube-api-access-54cqb") pod "f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6" (UID: "f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6"). InnerVolumeSpecName "kube-api-access-54cqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.965369 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:55 crc kubenswrapper[4947]: E1203 08:50:55.965880 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6" containerName="mariadb-client-7-cell1" Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.965909 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6" containerName="mariadb-client-7-cell1" Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.966148 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6" containerName="mariadb-client-7-cell1" Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.966968 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.979870 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:55 crc kubenswrapper[4947]: I1203 08:50:55.999756 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54cqb\" (UniqueName: \"kubernetes.io/projected/f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6-kube-api-access-54cqb\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:56 crc kubenswrapper[4947]: I1203 08:50:56.101579 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8fv\" (UniqueName: \"kubernetes.io/projected/d71c2964-5385-45ce-9e60-8c2d8c89e284-kube-api-access-mb8fv\") pod \"mariadb-client-2\" (UID: \"d71c2964-5385-45ce-9e60-8c2d8c89e284\") " pod="openstack/mariadb-client-2" Dec 03 08:50:56 crc kubenswrapper[4947]: I1203 08:50:56.203232 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8fv\" 
(UniqueName: \"kubernetes.io/projected/d71c2964-5385-45ce-9e60-8c2d8c89e284-kube-api-access-mb8fv\") pod \"mariadb-client-2\" (UID: \"d71c2964-5385-45ce-9e60-8c2d8c89e284\") " pod="openstack/mariadb-client-2" Dec 03 08:50:56 crc kubenswrapper[4947]: I1203 08:50:56.223246 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8fv\" (UniqueName: \"kubernetes.io/projected/d71c2964-5385-45ce-9e60-8c2d8c89e284-kube-api-access-mb8fv\") pod \"mariadb-client-2\" (UID: \"d71c2964-5385-45ce-9e60-8c2d8c89e284\") " pod="openstack/mariadb-client-2" Dec 03 08:50:56 crc kubenswrapper[4947]: I1203 08:50:56.288833 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:50:56 crc kubenswrapper[4947]: I1203 08:50:56.330895 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0bac98b00282437e0d2de5466854f835e28c09eedb3181363d484d1636273bc" Dec 03 08:50:56 crc kubenswrapper[4947]: I1203 08:50:56.330986 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-cell1" Dec 03 08:50:56 crc kubenswrapper[4947]: I1203 08:50:56.851235 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:57 crc kubenswrapper[4947]: I1203 08:50:57.095039 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6" path="/var/lib/kubelet/pods/f0c3ba3e-76db-4846-9b60-4d6b1ca02ec6/volumes" Dec 03 08:50:57 crc kubenswrapper[4947]: I1203 08:50:57.356477 4947 generic.go:334] "Generic (PLEG): container finished" podID="d71c2964-5385-45ce-9e60-8c2d8c89e284" containerID="94e6aef8f4bc1c05db11b152e81dd1f103a52fab0f88f0b356715763aaeb40f5" exitCode=1 Dec 03 08:50:57 crc kubenswrapper[4947]: I1203 08:50:57.356550 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"d71c2964-5385-45ce-9e60-8c2d8c89e284","Type":"ContainerDied","Data":"94e6aef8f4bc1c05db11b152e81dd1f103a52fab0f88f0b356715763aaeb40f5"} Dec 03 08:50:57 crc kubenswrapper[4947]: I1203 08:50:57.356582 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"d71c2964-5385-45ce-9e60-8c2d8c89e284","Type":"ContainerStarted","Data":"4af38f76e1ebd3c08ea42dc10f417730a3820aed02771cea0f2c0fa00034028d"} Dec 03 08:50:58 crc kubenswrapper[4947]: I1203 08:50:58.082771 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:50:58 crc kubenswrapper[4947]: E1203 08:50:58.083600 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" 
Dec 03 08:50:58 crc kubenswrapper[4947]: I1203 08:50:58.780626 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:50:58 crc kubenswrapper[4947]: I1203 08:50:58.800561 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_d71c2964-5385-45ce-9e60-8c2d8c89e284/mariadb-client-2/0.log" Dec 03 08:50:58 crc kubenswrapper[4947]: I1203 08:50:58.830083 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:58 crc kubenswrapper[4947]: I1203 08:50:58.830739 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:50:58 crc kubenswrapper[4947]: I1203 08:50:58.952693 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb8fv\" (UniqueName: \"kubernetes.io/projected/d71c2964-5385-45ce-9e60-8c2d8c89e284-kube-api-access-mb8fv\") pod \"d71c2964-5385-45ce-9e60-8c2d8c89e284\" (UID: \"d71c2964-5385-45ce-9e60-8c2d8c89e284\") " Dec 03 08:50:58 crc kubenswrapper[4947]: I1203 08:50:58.958008 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71c2964-5385-45ce-9e60-8c2d8c89e284-kube-api-access-mb8fv" (OuterVolumeSpecName: "kube-api-access-mb8fv") pod "d71c2964-5385-45ce-9e60-8c2d8c89e284" (UID: "d71c2964-5385-45ce-9e60-8c2d8c89e284"). InnerVolumeSpecName "kube-api-access-mb8fv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:50:59 crc kubenswrapper[4947]: I1203 08:50:59.054250 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb8fv\" (UniqueName: \"kubernetes.io/projected/d71c2964-5385-45ce-9e60-8c2d8c89e284-kube-api-access-mb8fv\") on node \"crc\" DevicePath \"\"" Dec 03 08:50:59 crc kubenswrapper[4947]: I1203 08:50:59.092284 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71c2964-5385-45ce-9e60-8c2d8c89e284" path="/var/lib/kubelet/pods/d71c2964-5385-45ce-9e60-8c2d8c89e284/volumes" Dec 03 08:50:59 crc kubenswrapper[4947]: I1203 08:50:59.377522 4947 scope.go:117] "RemoveContainer" containerID="94e6aef8f4bc1c05db11b152e81dd1f103a52fab0f88f0b356715763aaeb40f5" Dec 03 08:50:59 crc kubenswrapper[4947]: I1203 08:50:59.377563 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:51:01 crc kubenswrapper[4947]: I1203 08:51:01.787380 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-cell2"] Dec 03 08:51:01 crc kubenswrapper[4947]: E1203 08:51:01.788178 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71c2964-5385-45ce-9e60-8c2d8c89e284" containerName="mariadb-client-2" Dec 03 08:51:01 crc kubenswrapper[4947]: I1203 08:51:01.788195 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71c2964-5385-45ce-9e60-8c2d8c89e284" containerName="mariadb-client-2" Dec 03 08:51:01 crc kubenswrapper[4947]: I1203 08:51:01.788443 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71c2964-5385-45ce-9e60-8c2d8c89e284" containerName="mariadb-client-2" Dec 03 08:51:01 crc kubenswrapper[4947]: I1203 08:51:01.789089 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-cell2" Dec 03 08:51:01 crc kubenswrapper[4947]: I1203 08:51:01.793580 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kk56l" Dec 03 08:51:01 crc kubenswrapper[4947]: I1203 08:51:01.796558 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-cell2"] Dec 03 08:51:01 crc kubenswrapper[4947]: I1203 08:51:01.910267 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl2qr\" (UniqueName: \"kubernetes.io/projected/78eb0898-c810-40b3-9814-c128ecf5fe95-kube-api-access-tl2qr\") pod \"mariadb-client-5-cell2\" (UID: \"78eb0898-c810-40b3-9814-c128ecf5fe95\") " pod="openstack/mariadb-client-5-cell2" Dec 03 08:51:02 crc kubenswrapper[4947]: I1203 08:51:02.011726 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl2qr\" (UniqueName: \"kubernetes.io/projected/78eb0898-c810-40b3-9814-c128ecf5fe95-kube-api-access-tl2qr\") pod \"mariadb-client-5-cell2\" (UID: \"78eb0898-c810-40b3-9814-c128ecf5fe95\") " pod="openstack/mariadb-client-5-cell2" Dec 03 08:51:02 crc kubenswrapper[4947]: I1203 08:51:02.032103 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl2qr\" (UniqueName: \"kubernetes.io/projected/78eb0898-c810-40b3-9814-c128ecf5fe95-kube-api-access-tl2qr\") pod \"mariadb-client-5-cell2\" (UID: \"78eb0898-c810-40b3-9814-c128ecf5fe95\") " pod="openstack/mariadb-client-5-cell2" Dec 03 08:51:02 crc kubenswrapper[4947]: I1203 08:51:02.118798 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-cell2" Dec 03 08:51:02 crc kubenswrapper[4947]: I1203 08:51:02.627336 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-cell2"] Dec 03 08:51:03 crc kubenswrapper[4947]: I1203 08:51:03.414875 4947 generic.go:334] "Generic (PLEG): container finished" podID="78eb0898-c810-40b3-9814-c128ecf5fe95" containerID="b97c18a5cdac726e8700f6878cdec964e134c18a4e1ced65013c9bf32ba9b8aa" exitCode=0 Dec 03 08:51:03 crc kubenswrapper[4947]: I1203 08:51:03.414960 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-cell2" event={"ID":"78eb0898-c810-40b3-9814-c128ecf5fe95","Type":"ContainerDied","Data":"b97c18a5cdac726e8700f6878cdec964e134c18a4e1ced65013c9bf32ba9b8aa"} Dec 03 08:51:03 crc kubenswrapper[4947]: I1203 08:51:03.417003 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-cell2" event={"ID":"78eb0898-c810-40b3-9814-c128ecf5fe95","Type":"ContainerStarted","Data":"a831d2b1f0d92d15166056c057d2a3c3710711ab27427330a89ef1341ecefe0d"} Dec 03 08:51:04 crc kubenswrapper[4947]: I1203 08:51:04.808622 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-cell2" Dec 03 08:51:04 crc kubenswrapper[4947]: I1203 08:51:04.829443 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-cell2_78eb0898-c810-40b3-9814-c128ecf5fe95/mariadb-client-5-cell2/0.log" Dec 03 08:51:04 crc kubenswrapper[4947]: I1203 08:51:04.857136 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-cell2"] Dec 03 08:51:04 crc kubenswrapper[4947]: I1203 08:51:04.862279 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-cell2"] Dec 03 08:51:04 crc kubenswrapper[4947]: I1203 08:51:04.954526 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl2qr\" (UniqueName: \"kubernetes.io/projected/78eb0898-c810-40b3-9814-c128ecf5fe95-kube-api-access-tl2qr\") pod \"78eb0898-c810-40b3-9814-c128ecf5fe95\" (UID: \"78eb0898-c810-40b3-9814-c128ecf5fe95\") " Dec 03 08:51:04 crc kubenswrapper[4947]: I1203 08:51:04.962917 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78eb0898-c810-40b3-9814-c128ecf5fe95-kube-api-access-tl2qr" (OuterVolumeSpecName: "kube-api-access-tl2qr") pod "78eb0898-c810-40b3-9814-c128ecf5fe95" (UID: "78eb0898-c810-40b3-9814-c128ecf5fe95"). InnerVolumeSpecName "kube-api-access-tl2qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.006365 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-cell2"] Dec 03 08:51:05 crc kubenswrapper[4947]: E1203 08:51:05.006773 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78eb0898-c810-40b3-9814-c128ecf5fe95" containerName="mariadb-client-5-cell2" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.006795 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="78eb0898-c810-40b3-9814-c128ecf5fe95" containerName="mariadb-client-5-cell2" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.007020 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="78eb0898-c810-40b3-9814-c128ecf5fe95" containerName="mariadb-client-5-cell2" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.007672 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-cell2" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.014162 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-cell2"] Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.056230 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl2qr\" (UniqueName: \"kubernetes.io/projected/78eb0898-c810-40b3-9814-c128ecf5fe95-kube-api-access-tl2qr\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.091375 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78eb0898-c810-40b3-9814-c128ecf5fe95" path="/var/lib/kubelet/pods/78eb0898-c810-40b3-9814-c128ecf5fe95/volumes" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.158103 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6ppd\" (UniqueName: \"kubernetes.io/projected/043d9285-6d44-4e20-ae6b-44aeb525a71c-kube-api-access-w6ppd\") pod 
\"mariadb-client-6-cell2\" (UID: \"043d9285-6d44-4e20-ae6b-44aeb525a71c\") " pod="openstack/mariadb-client-6-cell2" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.259265 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6ppd\" (UniqueName: \"kubernetes.io/projected/043d9285-6d44-4e20-ae6b-44aeb525a71c-kube-api-access-w6ppd\") pod \"mariadb-client-6-cell2\" (UID: \"043d9285-6d44-4e20-ae6b-44aeb525a71c\") " pod="openstack/mariadb-client-6-cell2" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.282530 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6ppd\" (UniqueName: \"kubernetes.io/projected/043d9285-6d44-4e20-ae6b-44aeb525a71c-kube-api-access-w6ppd\") pod \"mariadb-client-6-cell2\" (UID: \"043d9285-6d44-4e20-ae6b-44aeb525a71c\") " pod="openstack/mariadb-client-6-cell2" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.332219 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-cell2" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.442298 4947 scope.go:117] "RemoveContainer" containerID="b97c18a5cdac726e8700f6878cdec964e134c18a4e1ced65013c9bf32ba9b8aa" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.442400 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-cell2" Dec 03 08:51:05 crc kubenswrapper[4947]: I1203 08:51:05.902743 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-cell2"] Dec 03 08:51:05 crc kubenswrapper[4947]: W1203 08:51:05.902779 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod043d9285_6d44_4e20_ae6b_44aeb525a71c.slice/crio-789e4955fd59aacbf5ab525793a760604a424b1d3c9026b50cc60a072afb154c WatchSource:0}: Error finding container 789e4955fd59aacbf5ab525793a760604a424b1d3c9026b50cc60a072afb154c: Status 404 returned error can't find the container with id 789e4955fd59aacbf5ab525793a760604a424b1d3c9026b50cc60a072afb154c Dec 03 08:51:06 crc kubenswrapper[4947]: I1203 08:51:06.453570 4947 generic.go:334] "Generic (PLEG): container finished" podID="043d9285-6d44-4e20-ae6b-44aeb525a71c" containerID="1c64e271a3348574b5db5e47c0dcece52c8e11b2ac61cc97d5a33d6d96bfd8a0" exitCode=1 Dec 03 08:51:06 crc kubenswrapper[4947]: I1203 08:51:06.453701 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-cell2" event={"ID":"043d9285-6d44-4e20-ae6b-44aeb525a71c","Type":"ContainerDied","Data":"1c64e271a3348574b5db5e47c0dcece52c8e11b2ac61cc97d5a33d6d96bfd8a0"} Dec 03 08:51:06 crc kubenswrapper[4947]: I1203 08:51:06.454801 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-cell2" event={"ID":"043d9285-6d44-4e20-ae6b-44aeb525a71c","Type":"ContainerStarted","Data":"789e4955fd59aacbf5ab525793a760604a424b1d3c9026b50cc60a072afb154c"} Dec 03 08:51:07 crc kubenswrapper[4947]: I1203 08:51:07.815462 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-cell2" Dec 03 08:51:07 crc kubenswrapper[4947]: I1203 08:51:07.835056 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-6-cell2_043d9285-6d44-4e20-ae6b-44aeb525a71c/mariadb-client-6-cell2/0.log" Dec 03 08:51:07 crc kubenswrapper[4947]: I1203 08:51:07.860721 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-cell2"] Dec 03 08:51:07 crc kubenswrapper[4947]: I1203 08:51:07.866026 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-cell2"] Dec 03 08:51:07 crc kubenswrapper[4947]: I1203 08:51:07.899705 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6ppd\" (UniqueName: \"kubernetes.io/projected/043d9285-6d44-4e20-ae6b-44aeb525a71c-kube-api-access-w6ppd\") pod \"043d9285-6d44-4e20-ae6b-44aeb525a71c\" (UID: \"043d9285-6d44-4e20-ae6b-44aeb525a71c\") " Dec 03 08:51:07 crc kubenswrapper[4947]: I1203 08:51:07.908332 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/043d9285-6d44-4e20-ae6b-44aeb525a71c-kube-api-access-w6ppd" (OuterVolumeSpecName: "kube-api-access-w6ppd") pod "043d9285-6d44-4e20-ae6b-44aeb525a71c" (UID: "043d9285-6d44-4e20-ae6b-44aeb525a71c"). InnerVolumeSpecName "kube-api-access-w6ppd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.002742 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6ppd\" (UniqueName: \"kubernetes.io/projected/043d9285-6d44-4e20-ae6b-44aeb525a71c-kube-api-access-w6ppd\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.028027 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-cell2"] Dec 03 08:51:08 crc kubenswrapper[4947]: E1203 08:51:08.028623 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043d9285-6d44-4e20-ae6b-44aeb525a71c" containerName="mariadb-client-6-cell2" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.028649 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="043d9285-6d44-4e20-ae6b-44aeb525a71c" containerName="mariadb-client-6-cell2" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.028904 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="043d9285-6d44-4e20-ae6b-44aeb525a71c" containerName="mariadb-client-6-cell2" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.029757 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-cell2" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.036228 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-cell2"] Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.103985 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zms7\" (UniqueName: \"kubernetes.io/projected/2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f-kube-api-access-5zms7\") pod \"mariadb-client-7-cell2\" (UID: \"2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f\") " pod="openstack/mariadb-client-7-cell2" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.205074 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zms7\" (UniqueName: \"kubernetes.io/projected/2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f-kube-api-access-5zms7\") pod \"mariadb-client-7-cell2\" (UID: \"2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f\") " pod="openstack/mariadb-client-7-cell2" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.234156 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zms7\" (UniqueName: \"kubernetes.io/projected/2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f-kube-api-access-5zms7\") pod \"mariadb-client-7-cell2\" (UID: \"2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f\") " pod="openstack/mariadb-client-7-cell2" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.349318 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-cell2" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.482967 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="789e4955fd59aacbf5ab525793a760604a424b1d3c9026b50cc60a072afb154c" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.483043 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-cell2" Dec 03 08:51:08 crc kubenswrapper[4947]: I1203 08:51:08.888577 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-cell2"] Dec 03 08:51:09 crc kubenswrapper[4947]: I1203 08:51:09.096471 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043d9285-6d44-4e20-ae6b-44aeb525a71c" path="/var/lib/kubelet/pods/043d9285-6d44-4e20-ae6b-44aeb525a71c/volumes" Dec 03 08:51:09 crc kubenswrapper[4947]: I1203 08:51:09.494535 4947 generic.go:334] "Generic (PLEG): container finished" podID="2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f" containerID="772ac72474d9e4ae9ac05b5af16ee85f74aad4e3bc156e2e37b21ad78e7ab20d" exitCode=0 Dec 03 08:51:09 crc kubenswrapper[4947]: I1203 08:51:09.494594 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-cell2" event={"ID":"2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f","Type":"ContainerDied","Data":"772ac72474d9e4ae9ac05b5af16ee85f74aad4e3bc156e2e37b21ad78e7ab20d"} Dec 03 08:51:09 crc kubenswrapper[4947]: I1203 08:51:09.494857 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-cell2" event={"ID":"2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f","Type":"ContainerStarted","Data":"0252a0f293591b3a8a14f45de35c7c8add6a410da2809e51291081b4d93226c2"} Dec 03 08:51:10 crc kubenswrapper[4947]: I1203 08:51:10.839629 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-cell2" Dec 03 08:51:10 crc kubenswrapper[4947]: I1203 08:51:10.858128 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-cell2_2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f/mariadb-client-7-cell2/0.log" Dec 03 08:51:10 crc kubenswrapper[4947]: I1203 08:51:10.886848 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-cell2"] Dec 03 08:51:10 crc kubenswrapper[4947]: I1203 08:51:10.892131 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-cell2"] Dec 03 08:51:10 crc kubenswrapper[4947]: I1203 08:51:10.956129 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zms7\" (UniqueName: \"kubernetes.io/projected/2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f-kube-api-access-5zms7\") pod \"2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f\" (UID: \"2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f\") " Dec 03 08:51:10 crc kubenswrapper[4947]: I1203 08:51:10.964387 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f-kube-api-access-5zms7" (OuterVolumeSpecName: "kube-api-access-5zms7") pod "2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f" (UID: "2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f"). InnerVolumeSpecName "kube-api-access-5zms7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.057915 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zms7\" (UniqueName: \"kubernetes.io/projected/2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f-kube-api-access-5zms7\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.067654 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:51:11 crc kubenswrapper[4947]: E1203 08:51:11.068146 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f" containerName="mariadb-client-7-cell2" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.068173 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f" containerName="mariadb-client-7-cell2" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.068477 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f" containerName="mariadb-client-7-cell2" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.069358 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.077091 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.110353 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:51:11 crc kubenswrapper[4947]: E1203 08:51:11.110681 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.123813 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f" path="/var/lib/kubelet/pods/2b5f5ec2-69b7-4fa4-905b-b7c47ffc690f/volumes" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.159598 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2zs\" (UniqueName: \"kubernetes.io/projected/75465520-be59-4c49-8f90-334dfded3bf6-kube-api-access-pz2zs\") pod \"mariadb-client-2\" (UID: \"75465520-be59-4c49-8f90-334dfded3bf6\") " pod="openstack/mariadb-client-2" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.261019 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz2zs\" (UniqueName: \"kubernetes.io/projected/75465520-be59-4c49-8f90-334dfded3bf6-kube-api-access-pz2zs\") pod \"mariadb-client-2\" (UID: \"75465520-be59-4c49-8f90-334dfded3bf6\") " pod="openstack/mariadb-client-2" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.284133 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz2zs\" (UniqueName: \"kubernetes.io/projected/75465520-be59-4c49-8f90-334dfded3bf6-kube-api-access-pz2zs\") pod \"mariadb-client-2\" (UID: \"75465520-be59-4c49-8f90-334dfded3bf6\") " pod="openstack/mariadb-client-2" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.426800 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.521318 4947 scope.go:117] "RemoveContainer" containerID="772ac72474d9e4ae9ac05b5af16ee85f74aad4e3bc156e2e37b21ad78e7ab20d" Dec 03 08:51:11 crc kubenswrapper[4947]: I1203 08:51:11.521416 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-cell2" Dec 03 08:51:12 crc kubenswrapper[4947]: I1203 08:51:12.017579 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:51:12 crc kubenswrapper[4947]: W1203 08:51:12.022811 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75465520_be59_4c49_8f90_334dfded3bf6.slice/crio-ae1fb4298d9fe253ac1c41dd45c9d813368ee743c5b2e0ea87756772ac8bcaa7 WatchSource:0}: Error finding container ae1fb4298d9fe253ac1c41dd45c9d813368ee743c5b2e0ea87756772ac8bcaa7: Status 404 returned error can't find the container with id ae1fb4298d9fe253ac1c41dd45c9d813368ee743c5b2e0ea87756772ac8bcaa7 Dec 03 08:51:12 crc kubenswrapper[4947]: I1203 08:51:12.531543 4947 generic.go:334] "Generic (PLEG): container finished" podID="75465520-be59-4c49-8f90-334dfded3bf6" containerID="568dd128d3c5a85c8d8e040b77e4d60aeab273e0d3808024e643d0e831aae1fd" exitCode=1 Dec 03 08:51:12 crc kubenswrapper[4947]: I1203 08:51:12.531601 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" 
event={"ID":"75465520-be59-4c49-8f90-334dfded3bf6","Type":"ContainerDied","Data":"568dd128d3c5a85c8d8e040b77e4d60aeab273e0d3808024e643d0e831aae1fd"} Dec 03 08:51:12 crc kubenswrapper[4947]: I1203 08:51:12.531695 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"75465520-be59-4c49-8f90-334dfded3bf6","Type":"ContainerStarted","Data":"ae1fb4298d9fe253ac1c41dd45c9d813368ee743c5b2e0ea87756772ac8bcaa7"} Dec 03 08:51:13 crc kubenswrapper[4947]: I1203 08:51:13.906704 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:51:13 crc kubenswrapper[4947]: I1203 08:51:13.925806 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_75465520-be59-4c49-8f90-334dfded3bf6/mariadb-client-2/0.log" Dec 03 08:51:13 crc kubenswrapper[4947]: I1203 08:51:13.958392 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:51:13 crc kubenswrapper[4947]: I1203 08:51:13.965996 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 08:51:14 crc kubenswrapper[4947]: I1203 08:51:14.007194 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz2zs\" (UniqueName: \"kubernetes.io/projected/75465520-be59-4c49-8f90-334dfded3bf6-kube-api-access-pz2zs\") pod \"75465520-be59-4c49-8f90-334dfded3bf6\" (UID: \"75465520-be59-4c49-8f90-334dfded3bf6\") " Dec 03 08:51:14 crc kubenswrapper[4947]: I1203 08:51:14.014807 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75465520-be59-4c49-8f90-334dfded3bf6-kube-api-access-pz2zs" (OuterVolumeSpecName: "kube-api-access-pz2zs") pod "75465520-be59-4c49-8f90-334dfded3bf6" (UID: "75465520-be59-4c49-8f90-334dfded3bf6"). InnerVolumeSpecName "kube-api-access-pz2zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:51:14 crc kubenswrapper[4947]: I1203 08:51:14.109100 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz2zs\" (UniqueName: \"kubernetes.io/projected/75465520-be59-4c49-8f90-334dfded3bf6-kube-api-access-pz2zs\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:14 crc kubenswrapper[4947]: I1203 08:51:14.555895 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae1fb4298d9fe253ac1c41dd45c9d813368ee743c5b2e0ea87756772ac8bcaa7" Dec 03 08:51:14 crc kubenswrapper[4947]: I1203 08:51:14.555949 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 08:51:15 crc kubenswrapper[4947]: I1203 08:51:15.099981 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75465520-be59-4c49-8f90-334dfded3bf6" path="/var/lib/kubelet/pods/75465520-be59-4c49-8f90-334dfded3bf6/volumes" Dec 03 08:51:24 crc kubenswrapper[4947]: I1203 08:51:24.083533 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:51:24 crc kubenswrapper[4947]: E1203 08:51:24.084328 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:51:37 crc kubenswrapper[4947]: I1203 08:51:37.083568 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:51:37 crc kubenswrapper[4947]: I1203 08:51:37.801204 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"775b6bc62996b75547fbefabe838c42ae4e96572661adec2381430b5b658aa8a"} Dec 03 08:52:07 crc kubenswrapper[4947]: I1203 08:52:07.049315 4947 scope.go:117] "RemoveContainer" containerID="6661ac4931f278258e2e252037c046f7b142547ed725d9d5274ffe73fb15e5f0" Dec 03 08:54:00 crc kubenswrapper[4947]: I1203 08:54:00.086285 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:54:00 crc kubenswrapper[4947]: I1203 08:54:00.087086 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:54:30 crc kubenswrapper[4947]: I1203 08:54:30.086297 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:54:30 crc kubenswrapper[4947]: I1203 08:54:30.087052 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:55:00 crc kubenswrapper[4947]: I1203 08:55:00.086842 4947 patch_prober.go:28] interesting 
pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:55:00 crc kubenswrapper[4947]: I1203 08:55:00.087588 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:55:00 crc kubenswrapper[4947]: I1203 08:55:00.087647 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:55:00 crc kubenswrapper[4947]: I1203 08:55:00.088274 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"775b6bc62996b75547fbefabe838c42ae4e96572661adec2381430b5b658aa8a"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:55:00 crc kubenswrapper[4947]: I1203 08:55:00.088360 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://775b6bc62996b75547fbefabe838c42ae4e96572661adec2381430b5b658aa8a" gracePeriod=600 Dec 03 08:55:00 crc kubenswrapper[4947]: I1203 08:55:00.745737 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="775b6bc62996b75547fbefabe838c42ae4e96572661adec2381430b5b658aa8a" exitCode=0 Dec 03 08:55:00 crc kubenswrapper[4947]: I1203 08:55:00.745842 
4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"775b6bc62996b75547fbefabe838c42ae4e96572661adec2381430b5b658aa8a"} Dec 03 08:55:00 crc kubenswrapper[4947]: I1203 08:55:00.746094 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce"} Dec 03 08:55:00 crc kubenswrapper[4947]: I1203 08:55:00.746127 4947 scope.go:117] "RemoveContainer" containerID="3fec9cece12c1ea9aecd259a07654f01f22712ec60121c1a7754f3f72952ad38" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.437083 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-khh49"] Dec 03 08:55:28 crc kubenswrapper[4947]: E1203 08:55:28.437870 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75465520-be59-4c49-8f90-334dfded3bf6" containerName="mariadb-client-2" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.437884 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="75465520-be59-4c49-8f90-334dfded3bf6" containerName="mariadb-client-2" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.438040 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="75465520-be59-4c49-8f90-334dfded3bf6" containerName="mariadb-client-2" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.441047 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.451325 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khh49"] Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.547780 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-utilities\") pod \"certified-operators-khh49\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.547861 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-catalog-content\") pod \"certified-operators-khh49\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.547931 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sswst\" (UniqueName: \"kubernetes.io/projected/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-kube-api-access-sswst\") pod \"certified-operators-khh49\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.649901 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-catalog-content\") pod \"certified-operators-khh49\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.650042 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sswst\" (UniqueName: \"kubernetes.io/projected/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-kube-api-access-sswst\") pod \"certified-operators-khh49\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.650091 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-utilities\") pod \"certified-operators-khh49\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.650582 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-catalog-content\") pod \"certified-operators-khh49\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.650647 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-utilities\") pod \"certified-operators-khh49\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.680418 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sswst\" (UniqueName: \"kubernetes.io/projected/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-kube-api-access-sswst\") pod \"certified-operators-khh49\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:28 crc kubenswrapper[4947]: I1203 08:55:28.773695 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:29 crc kubenswrapper[4947]: I1203 08:55:29.312964 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-khh49"] Dec 03 08:55:29 crc kubenswrapper[4947]: W1203 08:55:29.325695 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c8233f1_4d4e_47f2_b5bc_71d05564a57f.slice/crio-91d144eed7ad6f5deb2e65ba079825e81a9f331a7ec96fd9563847ee31fa23dd WatchSource:0}: Error finding container 91d144eed7ad6f5deb2e65ba079825e81a9f331a7ec96fd9563847ee31fa23dd: Status 404 returned error can't find the container with id 91d144eed7ad6f5deb2e65ba079825e81a9f331a7ec96fd9563847ee31fa23dd Dec 03 08:55:30 crc kubenswrapper[4947]: I1203 08:55:30.023648 4947 generic.go:334] "Generic (PLEG): container finished" podID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerID="1044646d2f45d4c4e9715b2557199e946462c6c0ae2fc6e363877ead3d51b9ce" exitCode=0 Dec 03 08:55:30 crc kubenswrapper[4947]: I1203 08:55:30.023736 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khh49" event={"ID":"3c8233f1-4d4e-47f2-b5bc-71d05564a57f","Type":"ContainerDied","Data":"1044646d2f45d4c4e9715b2557199e946462c6c0ae2fc6e363877ead3d51b9ce"} Dec 03 08:55:30 crc kubenswrapper[4947]: I1203 08:55:30.024233 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khh49" event={"ID":"3c8233f1-4d4e-47f2-b5bc-71d05564a57f","Type":"ContainerStarted","Data":"91d144eed7ad6f5deb2e65ba079825e81a9f331a7ec96fd9563847ee31fa23dd"} Dec 03 08:55:30 crc kubenswrapper[4947]: I1203 08:55:30.028384 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:55:31 crc kubenswrapper[4947]: I1203 08:55:31.034977 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-khh49" event={"ID":"3c8233f1-4d4e-47f2-b5bc-71d05564a57f","Type":"ContainerStarted","Data":"9a5d3afe3b6a28138c9c517caf2ba6f5791b6593158262adbed23e24704cbd3a"} Dec 03 08:55:32 crc kubenswrapper[4947]: I1203 08:55:32.047726 4947 generic.go:334] "Generic (PLEG): container finished" podID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerID="9a5d3afe3b6a28138c9c517caf2ba6f5791b6593158262adbed23e24704cbd3a" exitCode=0 Dec 03 08:55:32 crc kubenswrapper[4947]: I1203 08:55:32.047873 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khh49" event={"ID":"3c8233f1-4d4e-47f2-b5bc-71d05564a57f","Type":"ContainerDied","Data":"9a5d3afe3b6a28138c9c517caf2ba6f5791b6593158262adbed23e24704cbd3a"} Dec 03 08:55:33 crc kubenswrapper[4947]: I1203 08:55:33.063602 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khh49" event={"ID":"3c8233f1-4d4e-47f2-b5bc-71d05564a57f","Type":"ContainerStarted","Data":"8af41c6b1f32e105faafa9e885ef6c33cf9bbf4010e9f6fac60435aa129e083a"} Dec 03 08:55:33 crc kubenswrapper[4947]: I1203 08:55:33.081213 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-khh49" podStartSLOduration=2.589414364 podStartE2EDuration="5.081188979s" podCreationTimestamp="2025-12-03 08:55:28 +0000 UTC" firstStartedPulling="2025-12-03 08:55:30.027417609 +0000 UTC m=+7591.288372075" lastFinishedPulling="2025-12-03 08:55:32.519192234 +0000 UTC m=+7593.780146690" observedRunningTime="2025-12-03 08:55:33.078203168 +0000 UTC m=+7594.339157604" watchObservedRunningTime="2025-12-03 08:55:33.081188979 +0000 UTC m=+7594.342143425" Dec 03 08:55:38 crc kubenswrapper[4947]: I1203 08:55:38.774590 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:38 crc kubenswrapper[4947]: I1203 08:55:38.775107 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:38 crc kubenswrapper[4947]: I1203 08:55:38.819196 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:39 crc kubenswrapper[4947]: I1203 08:55:39.177912 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:39 crc kubenswrapper[4947]: I1203 08:55:39.225471 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khh49"] Dec 03 08:55:41 crc kubenswrapper[4947]: I1203 08:55:41.135162 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-khh49" podUID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerName="registry-server" containerID="cri-o://8af41c6b1f32e105faafa9e885ef6c33cf9bbf4010e9f6fac60435aa129e083a" gracePeriod=2 Dec 03 08:55:42 crc kubenswrapper[4947]: I1203 08:55:42.142810 4947 generic.go:334] "Generic (PLEG): container finished" podID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerID="8af41c6b1f32e105faafa9e885ef6c33cf9bbf4010e9f6fac60435aa129e083a" exitCode=0 Dec 03 08:55:42 crc kubenswrapper[4947]: I1203 08:55:42.142850 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khh49" event={"ID":"3c8233f1-4d4e-47f2-b5bc-71d05564a57f","Type":"ContainerDied","Data":"8af41c6b1f32e105faafa9e885ef6c33cf9bbf4010e9f6fac60435aa129e083a"} Dec 03 08:55:42 crc kubenswrapper[4947]: I1203 08:55:42.791428 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:42 crc kubenswrapper[4947]: I1203 08:55:42.898959 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-catalog-content\") pod \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " Dec 03 08:55:42 crc kubenswrapper[4947]: I1203 08:55:42.899040 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sswst\" (UniqueName: \"kubernetes.io/projected/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-kube-api-access-sswst\") pod \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " Dec 03 08:55:42 crc kubenswrapper[4947]: I1203 08:55:42.899191 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-utilities\") pod \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\" (UID: \"3c8233f1-4d4e-47f2-b5bc-71d05564a57f\") " Dec 03 08:55:42 crc kubenswrapper[4947]: I1203 08:55:42.900085 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-utilities" (OuterVolumeSpecName: "utilities") pod "3c8233f1-4d4e-47f2-b5bc-71d05564a57f" (UID: "3c8233f1-4d4e-47f2-b5bc-71d05564a57f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:55:42 crc kubenswrapper[4947]: I1203 08:55:42.905641 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-kube-api-access-sswst" (OuterVolumeSpecName: "kube-api-access-sswst") pod "3c8233f1-4d4e-47f2-b5bc-71d05564a57f" (UID: "3c8233f1-4d4e-47f2-b5bc-71d05564a57f"). InnerVolumeSpecName "kube-api-access-sswst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:55:42 crc kubenswrapper[4947]: I1203 08:55:42.947713 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c8233f1-4d4e-47f2-b5bc-71d05564a57f" (UID: "3c8233f1-4d4e-47f2-b5bc-71d05564a57f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.000535 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.000568 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sswst\" (UniqueName: \"kubernetes.io/projected/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-kube-api-access-sswst\") on node \"crc\" DevicePath \"\"" Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.000582 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c8233f1-4d4e-47f2-b5bc-71d05564a57f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.152410 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-khh49" event={"ID":"3c8233f1-4d4e-47f2-b5bc-71d05564a57f","Type":"ContainerDied","Data":"91d144eed7ad6f5deb2e65ba079825e81a9f331a7ec96fd9563847ee31fa23dd"} Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.152463 4947 scope.go:117] "RemoveContainer" containerID="8af41c6b1f32e105faafa9e885ef6c33cf9bbf4010e9f6fac60435aa129e083a" Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.152602 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-khh49" Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.172801 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-khh49"] Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.178707 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-khh49"] Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.181356 4947 scope.go:117] "RemoveContainer" containerID="9a5d3afe3b6a28138c9c517caf2ba6f5791b6593158262adbed23e24704cbd3a" Dec 03 08:55:43 crc kubenswrapper[4947]: I1203 08:55:43.197533 4947 scope.go:117] "RemoveContainer" containerID="1044646d2f45d4c4e9715b2557199e946462c6c0ae2fc6e363877ead3d51b9ce" Dec 03 08:55:45 crc kubenswrapper[4947]: I1203 08:55:45.107408 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" path="/var/lib/kubelet/pods/3c8233f1-4d4e-47f2-b5bc-71d05564a57f/volumes" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.748843 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xt6hc"] Dec 03 08:56:02 crc kubenswrapper[4947]: E1203 08:56:02.750233 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerName="extract-utilities" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.750253 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerName="extract-utilities" Dec 03 08:56:02 crc kubenswrapper[4947]: E1203 08:56:02.750294 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerName="extract-content" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.750305 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerName="extract-content" 
Dec 03 08:56:02 crc kubenswrapper[4947]: E1203 08:56:02.750322 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerName="registry-server" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.750336 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerName="registry-server" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.750610 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c8233f1-4d4e-47f2-b5bc-71d05564a57f" containerName="registry-server" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.752882 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.771103 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xt6hc"] Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.843917 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-utilities\") pod \"redhat-operators-xt6hc\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.843980 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-catalog-content\") pod \"redhat-operators-xt6hc\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.844074 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9qt9\" (UniqueName: 
\"kubernetes.io/projected/0037fd95-a113-4af8-b6ee-77057139e617-kube-api-access-v9qt9\") pod \"redhat-operators-xt6hc\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.946090 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-utilities\") pod \"redhat-operators-xt6hc\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.946162 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-catalog-content\") pod \"redhat-operators-xt6hc\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.946232 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9qt9\" (UniqueName: \"kubernetes.io/projected/0037fd95-a113-4af8-b6ee-77057139e617-kube-api-access-v9qt9\") pod \"redhat-operators-xt6hc\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.946664 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-utilities\") pod \"redhat-operators-xt6hc\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.946776 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-catalog-content\") pod \"redhat-operators-xt6hc\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:02 crc kubenswrapper[4947]: I1203 08:56:02.970915 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9qt9\" (UniqueName: \"kubernetes.io/projected/0037fd95-a113-4af8-b6ee-77057139e617-kube-api-access-v9qt9\") pod \"redhat-operators-xt6hc\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:03 crc kubenswrapper[4947]: I1203 08:56:03.112387 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:03 crc kubenswrapper[4947]: I1203 08:56:03.583762 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xt6hc"] Dec 03 08:56:04 crc kubenswrapper[4947]: I1203 08:56:04.322641 4947 generic.go:334] "Generic (PLEG): container finished" podID="0037fd95-a113-4af8-b6ee-77057139e617" containerID="1f5b30189edd05ce771ea925caf41ecc8e2611bb04a3b4b56d7a8bf66e127766" exitCode=0 Dec 03 08:56:04 crc kubenswrapper[4947]: I1203 08:56:04.322733 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt6hc" event={"ID":"0037fd95-a113-4af8-b6ee-77057139e617","Type":"ContainerDied","Data":"1f5b30189edd05ce771ea925caf41ecc8e2611bb04a3b4b56d7a8bf66e127766"} Dec 03 08:56:04 crc kubenswrapper[4947]: I1203 08:56:04.322897 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt6hc" event={"ID":"0037fd95-a113-4af8-b6ee-77057139e617","Type":"ContainerStarted","Data":"8988b24e2287cca8db40ea578fb6bd648421cfebe5c73cde00878b8a964a871b"} Dec 03 08:56:05 crc kubenswrapper[4947]: I1203 08:56:05.334319 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xt6hc" event={"ID":"0037fd95-a113-4af8-b6ee-77057139e617","Type":"ContainerStarted","Data":"796c2eaf79f892cfb04d26fddf2abd9024e2bf4729fd3e166c0e90586b26bb5e"} Dec 03 08:56:06 crc kubenswrapper[4947]: I1203 08:56:06.350617 4947 generic.go:334] "Generic (PLEG): container finished" podID="0037fd95-a113-4af8-b6ee-77057139e617" containerID="796c2eaf79f892cfb04d26fddf2abd9024e2bf4729fd3e166c0e90586b26bb5e" exitCode=0 Dec 03 08:56:06 crc kubenswrapper[4947]: I1203 08:56:06.350913 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt6hc" event={"ID":"0037fd95-a113-4af8-b6ee-77057139e617","Type":"ContainerDied","Data":"796c2eaf79f892cfb04d26fddf2abd9024e2bf4729fd3e166c0e90586b26bb5e"} Dec 03 08:56:07 crc kubenswrapper[4947]: I1203 08:56:07.182117 4947 scope.go:117] "RemoveContainer" containerID="83f621afc8477264eaae8dd259684e83b30f848ae6a10eca3f64dda53974f8f2" Dec 03 08:56:07 crc kubenswrapper[4947]: I1203 08:56:07.208963 4947 scope.go:117] "RemoveContainer" containerID="2ca2399af38763ea022d7d8ce682596d3074eb3a5c6f1ac77d87208699e60d19" Dec 03 08:56:07 crc kubenswrapper[4947]: I1203 08:56:07.240664 4947 scope.go:117] "RemoveContainer" containerID="60f2f09bdfac3ec3139a5ea45f17ee8b52d3025c54e0200afcde4cdd0ca5caca" Dec 03 08:56:07 crc kubenswrapper[4947]: I1203 08:56:07.362716 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt6hc" event={"ID":"0037fd95-a113-4af8-b6ee-77057139e617","Type":"ContainerStarted","Data":"387adc63a5e9d7f0c60bef5b3115edfb286e3f3ceab75558eee52824bc3aa2f8"} Dec 03 08:56:07 crc kubenswrapper[4947]: I1203 08:56:07.384626 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xt6hc" podStartSLOduration=2.964834095 podStartE2EDuration="5.384604235s" podCreationTimestamp="2025-12-03 08:56:02 +0000 UTC" firstStartedPulling="2025-12-03 08:56:04.324675358 +0000 
UTC m=+7625.585629804" lastFinishedPulling="2025-12-03 08:56:06.744445478 +0000 UTC m=+7628.005399944" observedRunningTime="2025-12-03 08:56:07.383215027 +0000 UTC m=+7628.644169483" watchObservedRunningTime="2025-12-03 08:56:07.384604235 +0000 UTC m=+7628.645558661" Dec 03 08:56:13 crc kubenswrapper[4947]: I1203 08:56:13.113347 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:13 crc kubenswrapper[4947]: I1203 08:56:13.115130 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:13 crc kubenswrapper[4947]: I1203 08:56:13.156752 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:13 crc kubenswrapper[4947]: I1203 08:56:13.472565 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:13 crc kubenswrapper[4947]: I1203 08:56:13.520985 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xt6hc"] Dec 03 08:56:15 crc kubenswrapper[4947]: I1203 08:56:15.437912 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xt6hc" podUID="0037fd95-a113-4af8-b6ee-77057139e617" containerName="registry-server" containerID="cri-o://387adc63a5e9d7f0c60bef5b3115edfb286e3f3ceab75558eee52824bc3aa2f8" gracePeriod=2 Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.487982 4947 generic.go:334] "Generic (PLEG): container finished" podID="0037fd95-a113-4af8-b6ee-77057139e617" containerID="387adc63a5e9d7f0c60bef5b3115edfb286e3f3ceab75558eee52824bc3aa2f8" exitCode=0 Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.488084 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt6hc" 
event={"ID":"0037fd95-a113-4af8-b6ee-77057139e617","Type":"ContainerDied","Data":"387adc63a5e9d7f0c60bef5b3115edfb286e3f3ceab75558eee52824bc3aa2f8"} Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.680194 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.759862 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-catalog-content\") pod \"0037fd95-a113-4af8-b6ee-77057139e617\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.759962 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9qt9\" (UniqueName: \"kubernetes.io/projected/0037fd95-a113-4af8-b6ee-77057139e617-kube-api-access-v9qt9\") pod \"0037fd95-a113-4af8-b6ee-77057139e617\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.760142 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-utilities\") pod \"0037fd95-a113-4af8-b6ee-77057139e617\" (UID: \"0037fd95-a113-4af8-b6ee-77057139e617\") " Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.761758 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-utilities" (OuterVolumeSpecName: "utilities") pod "0037fd95-a113-4af8-b6ee-77057139e617" (UID: "0037fd95-a113-4af8-b6ee-77057139e617"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.770404 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0037fd95-a113-4af8-b6ee-77057139e617-kube-api-access-v9qt9" (OuterVolumeSpecName: "kube-api-access-v9qt9") pod "0037fd95-a113-4af8-b6ee-77057139e617" (UID: "0037fd95-a113-4af8-b6ee-77057139e617"). InnerVolumeSpecName "kube-api-access-v9qt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.862683 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.862719 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9qt9\" (UniqueName: \"kubernetes.io/projected/0037fd95-a113-4af8-b6ee-77057139e617-kube-api-access-v9qt9\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.910894 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0037fd95-a113-4af8-b6ee-77057139e617" (UID: "0037fd95-a113-4af8-b6ee-77057139e617"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:56:19 crc kubenswrapper[4947]: I1203 08:56:19.964773 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0037fd95-a113-4af8-b6ee-77057139e617-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:20 crc kubenswrapper[4947]: I1203 08:56:20.503283 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xt6hc" event={"ID":"0037fd95-a113-4af8-b6ee-77057139e617","Type":"ContainerDied","Data":"8988b24e2287cca8db40ea578fb6bd648421cfebe5c73cde00878b8a964a871b"} Dec 03 08:56:20 crc kubenswrapper[4947]: I1203 08:56:20.503386 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xt6hc" Dec 03 08:56:20 crc kubenswrapper[4947]: I1203 08:56:20.503880 4947 scope.go:117] "RemoveContainer" containerID="387adc63a5e9d7f0c60bef5b3115edfb286e3f3ceab75558eee52824bc3aa2f8" Dec 03 08:56:20 crc kubenswrapper[4947]: I1203 08:56:20.539115 4947 scope.go:117] "RemoveContainer" containerID="796c2eaf79f892cfb04d26fddf2abd9024e2bf4729fd3e166c0e90586b26bb5e" Dec 03 08:56:20 crc kubenswrapper[4947]: I1203 08:56:20.557450 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xt6hc"] Dec 03 08:56:20 crc kubenswrapper[4947]: I1203 08:56:20.571762 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xt6hc"] Dec 03 08:56:20 crc kubenswrapper[4947]: I1203 08:56:20.587483 4947 scope.go:117] "RemoveContainer" containerID="1f5b30189edd05ce771ea925caf41ecc8e2611bb04a3b4b56d7a8bf66e127766" Dec 03 08:56:21 crc kubenswrapper[4947]: I1203 08:56:21.096348 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0037fd95-a113-4af8-b6ee-77057139e617" path="/var/lib/kubelet/pods/0037fd95-a113-4af8-b6ee-77057139e617/volumes" Dec 03 08:56:25 crc 
kubenswrapper[4947]: I1203 08:56:25.556016 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 08:56:25 crc kubenswrapper[4947]: E1203 08:56:25.557018 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0037fd95-a113-4af8-b6ee-77057139e617" containerName="extract-utilities" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.557039 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0037fd95-a113-4af8-b6ee-77057139e617" containerName="extract-utilities" Dec 03 08:56:25 crc kubenswrapper[4947]: E1203 08:56:25.557079 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0037fd95-a113-4af8-b6ee-77057139e617" containerName="extract-content" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.557092 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0037fd95-a113-4af8-b6ee-77057139e617" containerName="extract-content" Dec 03 08:56:25 crc kubenswrapper[4947]: E1203 08:56:25.557112 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0037fd95-a113-4af8-b6ee-77057139e617" containerName="registry-server" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.557125 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0037fd95-a113-4af8-b6ee-77057139e617" containerName="registry-server" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.557383 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="0037fd95-a113-4af8-b6ee-77057139e617" containerName="registry-server" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.558222 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.561712 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kk56l" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.578983 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.663097 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\") pod \"mariadb-copy-data\" (UID: \"a213096c-8020-47a7-b5b8-8d18edb3562e\") " pod="openstack/mariadb-copy-data" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.663216 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvk7r\" (UniqueName: \"kubernetes.io/projected/a213096c-8020-47a7-b5b8-8d18edb3562e-kube-api-access-cvk7r\") pod \"mariadb-copy-data\" (UID: \"a213096c-8020-47a7-b5b8-8d18edb3562e\") " pod="openstack/mariadb-copy-data" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.764842 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\") pod \"mariadb-copy-data\" (UID: \"a213096c-8020-47a7-b5b8-8d18edb3562e\") " pod="openstack/mariadb-copy-data" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.765058 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvk7r\" (UniqueName: \"kubernetes.io/projected/a213096c-8020-47a7-b5b8-8d18edb3562e-kube-api-access-cvk7r\") pod \"mariadb-copy-data\" (UID: \"a213096c-8020-47a7-b5b8-8d18edb3562e\") " pod="openstack/mariadb-copy-data" 
Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.770318 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.770392 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\") pod \"mariadb-copy-data\" (UID: \"a213096c-8020-47a7-b5b8-8d18edb3562e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81795b7e23b723aa42966afdc3d720c4e99a729bca17afd1b8991298ee39e877/globalmount\"" pod="openstack/mariadb-copy-data" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.791925 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvk7r\" (UniqueName: \"kubernetes.io/projected/a213096c-8020-47a7-b5b8-8d18edb3562e-kube-api-access-cvk7r\") pod \"mariadb-copy-data\" (UID: \"a213096c-8020-47a7-b5b8-8d18edb3562e\") " pod="openstack/mariadb-copy-data" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.812990 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\") pod \"mariadb-copy-data\" (UID: \"a213096c-8020-47a7-b5b8-8d18edb3562e\") " pod="openstack/mariadb-copy-data" Dec 03 08:56:25 crc kubenswrapper[4947]: I1203 08:56:25.876821 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 03 08:56:26 crc kubenswrapper[4947]: I1203 08:56:26.364902 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 08:56:26 crc kubenswrapper[4947]: I1203 08:56:26.563647 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a213096c-8020-47a7-b5b8-8d18edb3562e","Type":"ContainerStarted","Data":"a774104fb59861765ccaabd157d8f847500fdc4ded570d2a2ab01b9a6ebf1c08"} Dec 03 08:56:26 crc kubenswrapper[4947]: I1203 08:56:26.564599 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a213096c-8020-47a7-b5b8-8d18edb3562e","Type":"ContainerStarted","Data":"993cd659883a0a791beb8c481e3da86ce7b53963c3aa78400a85c322886de094"} Dec 03 08:56:26 crc kubenswrapper[4947]: I1203 08:56:26.589724 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.58970296 podStartE2EDuration="2.58970296s" podCreationTimestamp="2025-12-03 08:56:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:56:26.582939567 +0000 UTC m=+7647.843894003" watchObservedRunningTime="2025-12-03 08:56:26.58970296 +0000 UTC m=+7647.850657396" Dec 03 08:56:30 crc kubenswrapper[4947]: I1203 08:56:30.502782 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:30 crc kubenswrapper[4947]: I1203 08:56:30.504266 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:30 crc kubenswrapper[4947]: I1203 08:56:30.518399 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:30 crc kubenswrapper[4947]: I1203 08:56:30.555624 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t96n\" (UniqueName: \"kubernetes.io/projected/3bdbb7ba-8579-4bba-9282-72fea2df941d-kube-api-access-4t96n\") pod \"mariadb-client\" (UID: \"3bdbb7ba-8579-4bba-9282-72fea2df941d\") " pod="openstack/mariadb-client" Dec 03 08:56:30 crc kubenswrapper[4947]: I1203 08:56:30.657725 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t96n\" (UniqueName: \"kubernetes.io/projected/3bdbb7ba-8579-4bba-9282-72fea2df941d-kube-api-access-4t96n\") pod \"mariadb-client\" (UID: \"3bdbb7ba-8579-4bba-9282-72fea2df941d\") " pod="openstack/mariadb-client" Dec 03 08:56:30 crc kubenswrapper[4947]: I1203 08:56:30.678884 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t96n\" (UniqueName: \"kubernetes.io/projected/3bdbb7ba-8579-4bba-9282-72fea2df941d-kube-api-access-4t96n\") pod \"mariadb-client\" (UID: \"3bdbb7ba-8579-4bba-9282-72fea2df941d\") " pod="openstack/mariadb-client" Dec 03 08:56:30 crc kubenswrapper[4947]: I1203 08:56:30.836171 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:31 crc kubenswrapper[4947]: I1203 08:56:31.254238 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:31 crc kubenswrapper[4947]: W1203 08:56:31.259053 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bdbb7ba_8579_4bba_9282_72fea2df941d.slice/crio-a6d80a33fb1a0ceb34f7816d1cb0af4a1aa0fc12182a86ec9869821220ea1d18 WatchSource:0}: Error finding container a6d80a33fb1a0ceb34f7816d1cb0af4a1aa0fc12182a86ec9869821220ea1d18: Status 404 returned error can't find the container with id a6d80a33fb1a0ceb34f7816d1cb0af4a1aa0fc12182a86ec9869821220ea1d18 Dec 03 08:56:31 crc kubenswrapper[4947]: I1203 08:56:31.609804 4947 generic.go:334] "Generic (PLEG): container finished" podID="3bdbb7ba-8579-4bba-9282-72fea2df941d" containerID="b7e5220e376e445cb5c0539a007dad481d8a330665f4caaf9623a1397719ceb4" exitCode=0 Dec 03 08:56:31 crc kubenswrapper[4947]: I1203 08:56:31.609849 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"3bdbb7ba-8579-4bba-9282-72fea2df941d","Type":"ContainerDied","Data":"b7e5220e376e445cb5c0539a007dad481d8a330665f4caaf9623a1397719ceb4"} Dec 03 08:56:31 crc kubenswrapper[4947]: I1203 08:56:31.610094 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"3bdbb7ba-8579-4bba-9282-72fea2df941d","Type":"ContainerStarted","Data":"a6d80a33fb1a0ceb34f7816d1cb0af4a1aa0fc12182a86ec9869821220ea1d18"} Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.016817 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.041323 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_3bdbb7ba-8579-4bba-9282-72fea2df941d/mariadb-client/0.log" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.080616 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.094152 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.097699 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t96n\" (UniqueName: \"kubernetes.io/projected/3bdbb7ba-8579-4bba-9282-72fea2df941d-kube-api-access-4t96n\") pod \"3bdbb7ba-8579-4bba-9282-72fea2df941d\" (UID: \"3bdbb7ba-8579-4bba-9282-72fea2df941d\") " Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.103937 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bdbb7ba-8579-4bba-9282-72fea2df941d-kube-api-access-4t96n" (OuterVolumeSpecName: "kube-api-access-4t96n") pod "3bdbb7ba-8579-4bba-9282-72fea2df941d" (UID: "3bdbb7ba-8579-4bba-9282-72fea2df941d"). InnerVolumeSpecName "kube-api-access-4t96n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.199366 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t96n\" (UniqueName: \"kubernetes.io/projected/3bdbb7ba-8579-4bba-9282-72fea2df941d-kube-api-access-4t96n\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.256218 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:33 crc kubenswrapper[4947]: E1203 08:56:33.257521 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdbb7ba-8579-4bba-9282-72fea2df941d" containerName="mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.257674 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdbb7ba-8579-4bba-9282-72fea2df941d" containerName="mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.258085 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdbb7ba-8579-4bba-9282-72fea2df941d" containerName="mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.259071 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.264624 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.401294 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg2b2\" (UniqueName: \"kubernetes.io/projected/dd5acfda-aeac-48b0-9c40-ffe717a1a7df-kube-api-access-bg2b2\") pod \"mariadb-client\" (UID: \"dd5acfda-aeac-48b0-9c40-ffe717a1a7df\") " pod="openstack/mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.502516 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg2b2\" (UniqueName: \"kubernetes.io/projected/dd5acfda-aeac-48b0-9c40-ffe717a1a7df-kube-api-access-bg2b2\") pod \"mariadb-client\" (UID: \"dd5acfda-aeac-48b0-9c40-ffe717a1a7df\") " pod="openstack/mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.527154 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg2b2\" (UniqueName: \"kubernetes.io/projected/dd5acfda-aeac-48b0-9c40-ffe717a1a7df-kube-api-access-bg2b2\") pod \"mariadb-client\" (UID: \"dd5acfda-aeac-48b0-9c40-ffe717a1a7df\") " pod="openstack/mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.574918 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.630999 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6d80a33fb1a0ceb34f7816d1cb0af4a1aa0fc12182a86ec9869821220ea1d18" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.631067 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:33 crc kubenswrapper[4947]: I1203 08:56:33.661095 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="3bdbb7ba-8579-4bba-9282-72fea2df941d" podUID="dd5acfda-aeac-48b0-9c40-ffe717a1a7df" Dec 03 08:56:34 crc kubenswrapper[4947]: I1203 08:56:34.037799 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:34 crc kubenswrapper[4947]: I1203 08:56:34.642174 4947 generic.go:334] "Generic (PLEG): container finished" podID="dd5acfda-aeac-48b0-9c40-ffe717a1a7df" containerID="8312780366e14dabe54cbc01f62c65fb2b382984e7edadef950f5b9fead43ed3" exitCode=0 Dec 03 08:56:34 crc kubenswrapper[4947]: I1203 08:56:34.642234 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"dd5acfda-aeac-48b0-9c40-ffe717a1a7df","Type":"ContainerDied","Data":"8312780366e14dabe54cbc01f62c65fb2b382984e7edadef950f5b9fead43ed3"} Dec 03 08:56:34 crc kubenswrapper[4947]: I1203 08:56:34.642419 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"dd5acfda-aeac-48b0-9c40-ffe717a1a7df","Type":"ContainerStarted","Data":"b3c8e376cfd96db08fd564590ad32a46b334b336e8d73a5859c49c1e9a13a3e8"} Dec 03 08:56:35 crc kubenswrapper[4947]: I1203 08:56:35.097677 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bdbb7ba-8579-4bba-9282-72fea2df941d" path="/var/lib/kubelet/pods/3bdbb7ba-8579-4bba-9282-72fea2df941d/volumes" Dec 03 08:56:35 crc kubenswrapper[4947]: I1203 08:56:35.969551 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:35 crc kubenswrapper[4947]: I1203 08:56:35.989947 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_dd5acfda-aeac-48b0-9c40-ffe717a1a7df/mariadb-client/0.log" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.014061 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.020260 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.044275 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg2b2\" (UniqueName: \"kubernetes.io/projected/dd5acfda-aeac-48b0-9c40-ffe717a1a7df-kube-api-access-bg2b2\") pod \"dd5acfda-aeac-48b0-9c40-ffe717a1a7df\" (UID: \"dd5acfda-aeac-48b0-9c40-ffe717a1a7df\") " Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.049964 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5acfda-aeac-48b0-9c40-ffe717a1a7df-kube-api-access-bg2b2" (OuterVolumeSpecName: "kube-api-access-bg2b2") pod "dd5acfda-aeac-48b0-9c40-ffe717a1a7df" (UID: "dd5acfda-aeac-48b0-9c40-ffe717a1a7df"). InnerVolumeSpecName "kube-api-access-bg2b2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.137047 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:36 crc kubenswrapper[4947]: E1203 08:56:36.137346 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5acfda-aeac-48b0-9c40-ffe717a1a7df" containerName="mariadb-client" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.137357 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5acfda-aeac-48b0-9c40-ffe717a1a7df" containerName="mariadb-client" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.137550 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd5acfda-aeac-48b0-9c40-ffe717a1a7df" containerName="mariadb-client" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.138172 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.146303 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg2b2\" (UniqueName: \"kubernetes.io/projected/dd5acfda-aeac-48b0-9c40-ffe717a1a7df-kube-api-access-bg2b2\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.153937 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.247112 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsdxm\" (UniqueName: \"kubernetes.io/projected/20833025-1355-469f-81ea-d908d4858a45-kube-api-access-dsdxm\") pod \"mariadb-client\" (UID: \"20833025-1355-469f-81ea-d908d4858a45\") " pod="openstack/mariadb-client" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.348631 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsdxm\" (UniqueName: 
\"kubernetes.io/projected/20833025-1355-469f-81ea-d908d4858a45-kube-api-access-dsdxm\") pod \"mariadb-client\" (UID: \"20833025-1355-469f-81ea-d908d4858a45\") " pod="openstack/mariadb-client" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.376110 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsdxm\" (UniqueName: \"kubernetes.io/projected/20833025-1355-469f-81ea-d908d4858a45-kube-api-access-dsdxm\") pod \"mariadb-client\" (UID: \"20833025-1355-469f-81ea-d908d4858a45\") " pod="openstack/mariadb-client" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.460711 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.657916 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3c8e376cfd96db08fd564590ad32a46b334b336e8d73a5859c49c1e9a13a3e8" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.658021 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.674725 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="dd5acfda-aeac-48b0-9c40-ffe717a1a7df" podUID="20833025-1355-469f-81ea-d908d4858a45" Dec 03 08:56:36 crc kubenswrapper[4947]: I1203 08:56:36.915927 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:36 crc kubenswrapper[4947]: W1203 08:56:36.924316 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20833025_1355_469f_81ea_d908d4858a45.slice/crio-f9f288c0116ad75d20feec66f65d916216da48ce5cc0c5f371053cce7f54e796 WatchSource:0}: Error finding container f9f288c0116ad75d20feec66f65d916216da48ce5cc0c5f371053cce7f54e796: Status 404 returned error can't find the container with id f9f288c0116ad75d20feec66f65d916216da48ce5cc0c5f371053cce7f54e796 Dec 03 08:56:37 crc kubenswrapper[4947]: I1203 08:56:37.091114 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5acfda-aeac-48b0-9c40-ffe717a1a7df" path="/var/lib/kubelet/pods/dd5acfda-aeac-48b0-9c40-ffe717a1a7df/volumes" Dec 03 08:56:37 crc kubenswrapper[4947]: I1203 08:56:37.669091 4947 generic.go:334] "Generic (PLEG): container finished" podID="20833025-1355-469f-81ea-d908d4858a45" containerID="7cfb3dadbed4c99cdef3765c4aa1651e016194728817eef87f18a8e4fcbb2528" exitCode=0 Dec 03 08:56:37 crc kubenswrapper[4947]: I1203 08:56:37.669161 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"20833025-1355-469f-81ea-d908d4858a45","Type":"ContainerDied","Data":"7cfb3dadbed4c99cdef3765c4aa1651e016194728817eef87f18a8e4fcbb2528"} Dec 03 08:56:37 crc kubenswrapper[4947]: I1203 08:56:37.671631 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"20833025-1355-469f-81ea-d908d4858a45","Type":"ContainerStarted","Data":"f9f288c0116ad75d20feec66f65d916216da48ce5cc0c5f371053cce7f54e796"} Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.034187 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.053135 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_20833025-1355-469f-81ea-d908d4858a45/mariadb-client/0.log" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.092430 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsdxm\" (UniqueName: \"kubernetes.io/projected/20833025-1355-469f-81ea-d908d4858a45-kube-api-access-dsdxm\") pod \"20833025-1355-469f-81ea-d908d4858a45\" (UID: \"20833025-1355-469f-81ea-d908d4858a45\") " Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.099643 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.099677 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.108689 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20833025-1355-469f-81ea-d908d4858a45-kube-api-access-dsdxm" (OuterVolumeSpecName: "kube-api-access-dsdxm") pod "20833025-1355-469f-81ea-d908d4858a45" (UID: "20833025-1355-469f-81ea-d908d4858a45"). InnerVolumeSpecName "kube-api-access-dsdxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.194327 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsdxm\" (UniqueName: \"kubernetes.io/projected/20833025-1355-469f-81ea-d908d4858a45-kube-api-access-dsdxm\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.690483 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9f288c0116ad75d20feec66f65d916216da48ce5cc0c5f371053cce7f54e796" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.690586 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.835192 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:39 crc kubenswrapper[4947]: E1203 08:56:39.835523 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20833025-1355-469f-81ea-d908d4858a45" containerName="mariadb-client" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.835537 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="20833025-1355-469f-81ea-d908d4858a45" containerName="mariadb-client" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.835721 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="20833025-1355-469f-81ea-d908d4858a45" containerName="mariadb-client" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.836324 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.843298 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="20833025-1355-469f-81ea-d908d4858a45" podUID="4b0853a5-0484-4595-a3a9-2c0e38440a90" Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.865727 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:39 crc kubenswrapper[4947]: I1203 08:56:39.906219 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqjjm\" (UniqueName: \"kubernetes.io/projected/4b0853a5-0484-4595-a3a9-2c0e38440a90-kube-api-access-zqjjm\") pod \"mariadb-client\" (UID: \"4b0853a5-0484-4595-a3a9-2c0e38440a90\") " pod="openstack/mariadb-client" Dec 03 08:56:40 crc kubenswrapper[4947]: I1203 08:56:40.007969 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqjjm\" (UniqueName: \"kubernetes.io/projected/4b0853a5-0484-4595-a3a9-2c0e38440a90-kube-api-access-zqjjm\") pod \"mariadb-client\" (UID: \"4b0853a5-0484-4595-a3a9-2c0e38440a90\") " pod="openstack/mariadb-client" Dec 03 08:56:40 crc kubenswrapper[4947]: I1203 08:56:40.029751 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqjjm\" (UniqueName: \"kubernetes.io/projected/4b0853a5-0484-4595-a3a9-2c0e38440a90-kube-api-access-zqjjm\") pod \"mariadb-client\" (UID: \"4b0853a5-0484-4595-a3a9-2c0e38440a90\") " pod="openstack/mariadb-client" Dec 03 08:56:40 crc kubenswrapper[4947]: I1203 08:56:40.155133 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:40 crc kubenswrapper[4947]: I1203 08:56:40.595513 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:40 crc kubenswrapper[4947]: W1203 08:56:40.601142 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b0853a5_0484_4595_a3a9_2c0e38440a90.slice/crio-f5d943fb194f63ac16bfe7c69b8eae55896e6179f65c6f53548989240683c8f5 WatchSource:0}: Error finding container f5d943fb194f63ac16bfe7c69b8eae55896e6179f65c6f53548989240683c8f5: Status 404 returned error can't find the container with id f5d943fb194f63ac16bfe7c69b8eae55896e6179f65c6f53548989240683c8f5 Dec 03 08:56:40 crc kubenswrapper[4947]: I1203 08:56:40.698214 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4b0853a5-0484-4595-a3a9-2c0e38440a90","Type":"ContainerStarted","Data":"f5d943fb194f63ac16bfe7c69b8eae55896e6179f65c6f53548989240683c8f5"} Dec 03 08:56:41 crc kubenswrapper[4947]: I1203 08:56:41.102422 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20833025-1355-469f-81ea-d908d4858a45" path="/var/lib/kubelet/pods/20833025-1355-469f-81ea-d908d4858a45/volumes" Dec 03 08:56:41 crc kubenswrapper[4947]: I1203 08:56:41.707582 4947 generic.go:334] "Generic (PLEG): container finished" podID="4b0853a5-0484-4595-a3a9-2c0e38440a90" containerID="be209a575278e2f465cbf6a02c21dfd76a5a2085fa6ff333faa202b7bcaca4d8" exitCode=0 Dec 03 08:56:41 crc kubenswrapper[4947]: I1203 08:56:41.707676 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4b0853a5-0484-4595-a3a9-2c0e38440a90","Type":"ContainerDied","Data":"be209a575278e2f465cbf6a02c21dfd76a5a2085fa6ff333faa202b7bcaca4d8"} Dec 03 08:56:43 crc kubenswrapper[4947]: I1203 08:56:43.064641 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:43 crc kubenswrapper[4947]: I1203 08:56:43.082927 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_4b0853a5-0484-4595-a3a9-2c0e38440a90/mariadb-client/0.log" Dec 03 08:56:43 crc kubenswrapper[4947]: I1203 08:56:43.127593 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:43 crc kubenswrapper[4947]: I1203 08:56:43.146069 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 03 08:56:43 crc kubenswrapper[4947]: I1203 08:56:43.163032 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqjjm\" (UniqueName: \"kubernetes.io/projected/4b0853a5-0484-4595-a3a9-2c0e38440a90-kube-api-access-zqjjm\") pod \"4b0853a5-0484-4595-a3a9-2c0e38440a90\" (UID: \"4b0853a5-0484-4595-a3a9-2c0e38440a90\") " Dec 03 08:56:43 crc kubenswrapper[4947]: I1203 08:56:43.168670 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b0853a5-0484-4595-a3a9-2c0e38440a90-kube-api-access-zqjjm" (OuterVolumeSpecName: "kube-api-access-zqjjm") pod "4b0853a5-0484-4595-a3a9-2c0e38440a90" (UID: "4b0853a5-0484-4595-a3a9-2c0e38440a90"). InnerVolumeSpecName "kube-api-access-zqjjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:56:43 crc kubenswrapper[4947]: I1203 08:56:43.264822 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqjjm\" (UniqueName: \"kubernetes.io/projected/4b0853a5-0484-4595-a3a9-2c0e38440a90-kube-api-access-zqjjm\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:43 crc kubenswrapper[4947]: I1203 08:56:43.748997 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5d943fb194f63ac16bfe7c69b8eae55896e6179f65c6f53548989240683c8f5" Dec 03 08:56:43 crc kubenswrapper[4947]: I1203 08:56:43.749078 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 08:56:45 crc kubenswrapper[4947]: I1203 08:56:45.103213 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b0853a5-0484-4595-a3a9-2c0e38440a90" path="/var/lib/kubelet/pods/4b0853a5-0484-4595-a3a9-2c0e38440a90/volumes" Dec 03 08:57:00 crc kubenswrapper[4947]: I1203 08:57:00.086804 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:57:00 crc kubenswrapper[4947]: I1203 08:57:00.087322 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.345709 4947 scope.go:117] "RemoveContainer" containerID="201b2b6fe81ba45cfa6bf1529aa9b77dda9b34ad66898da6965f011ddd565f6f" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.369647 4947 scope.go:117] 
"RemoveContainer" containerID="c92325f730a67ad15cfb8942610456d0455ee59290aa6358571ac1873b9b7c5b" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.416244 4947 scope.go:117] "RemoveContainer" containerID="97583157f998f22e0d10629240546ffdbb85fb8b99dfb94ecbd8bd7b3d8ad86c" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.456408 4947 scope.go:117] "RemoveContainer" containerID="1c64e271a3348574b5db5e47c0dcece52c8e11b2ac61cc97d5a33d6d96bfd8a0" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.493026 4947 scope.go:117] "RemoveContainer" containerID="d72547e13fb41480acf1a76e6ab15b941d2700c0ece24141730f4a84e01ea235" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.534630 4947 scope.go:117] "RemoveContainer" containerID="0a2196ec9e60bf5895093ad6d1067123d66b16940956e9cefdd92250bba01631" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.559897 4947 scope.go:117] "RemoveContainer" containerID="d60a6b0f4fd2cacb027a1325e1b0d1867e9e1a1a7ebf9e7d677d970b1e79effc" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.579094 4947 scope.go:117] "RemoveContainer" containerID="12d5fb4d0d17324686bafe940ec60bc215ac462272bddcc18d29c67c939ddb8b" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.597262 4947 scope.go:117] "RemoveContainer" containerID="72286d5fc3da6a7cee0d17a487c2de3f827439254bf27a02dc8fb07e908d5b04" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.626208 4947 scope.go:117] "RemoveContainer" containerID="31805a99d09b86ab643aec11b3d97303a8dc8de42576ea91cc5a89ae12e463f9" Dec 03 08:57:07 crc kubenswrapper[4947]: I1203 08:57:07.640908 4947 scope.go:117] "RemoveContainer" containerID="a1d9ab3a7fe12226bc19d68aa5d32254cff1250ab5cbfabeb7e6625b4bda711f" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.911051 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 08:57:25 crc kubenswrapper[4947]: E1203 08:57:25.911970 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b0853a5-0484-4595-a3a9-2c0e38440a90" containerName="mariadb-client" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.911987 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0853a5-0484-4595-a3a9-2c0e38440a90" containerName="mariadb-client" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.912133 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b0853a5-0484-4595-a3a9-2c0e38440a90" containerName="mariadb-client" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.913002 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.914987 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.915436 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.916203 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-n5lrr" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.918806 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.937325 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.938641 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.959436 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.962062 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.973740 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 03 08:57:25 crc kubenswrapper[4947]: I1203 08:57:25.991708 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047432 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-18b877ac-f62a-4d5d-a54e-34dc6fe5b463\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-18b877ac-f62a-4d5d-a54e-34dc6fe5b463\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047474 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2l4l\" (UniqueName: \"kubernetes.io/projected/8fb8263d-b49d-43b1-8133-88972776e3a1-kube-api-access-z2l4l\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047525 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/621edb84-a88e-4aa5-9be0-1c50edbe0c80-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047543 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621edb84-a88e-4aa5-9be0-1c50edbe0c80-config\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047558 
4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/621edb84-a88e-4aa5-9be0-1c50edbe0c80-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047605 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621edb84-a88e-4aa5-9be0-1c50edbe0c80-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047622 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb8263d-b49d-43b1-8133-88972776e3a1-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047642 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9372c287-0c89-4690-8d86-74825aa7e960-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047667 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9372c287-0c89-4690-8d86-74825aa7e960-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047699 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"pvc-b879dcd0-711e-4c21-9557-ac17bda4567e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b879dcd0-711e-4c21-9557-ac17bda4567e\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047737 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9372c287-0c89-4690-8d86-74825aa7e960-config\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047765 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d478\" (UniqueName: \"kubernetes.io/projected/621edb84-a88e-4aa5-9be0-1c50edbe0c80-kube-api-access-5d478\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.047796 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzhhx\" (UniqueName: \"kubernetes.io/projected/9372c287-0c89-4690-8d86-74825aa7e960-kube-api-access-hzhhx\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.048057 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb8263d-b49d-43b1-8133-88972776e3a1-config\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.048152 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-54ecea41-19da-4b07-99e3-69cbbe05d5c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54ecea41-19da-4b07-99e3-69cbbe05d5c6\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.048189 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb8263d-b49d-43b1-8133-88972776e3a1-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.048208 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fb8263d-b49d-43b1-8133-88972776e3a1-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.048225 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9372c287-0c89-4690-8d86-74825aa7e960-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.124159 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.125858 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.128625 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.128821 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.133255 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qs4rh" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.140002 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.148905 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.148967 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8fb8263d-b49d-43b1-8133-88972776e3a1-config\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149000 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-54ecea41-19da-4b07-99e3-69cbbe05d5c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54ecea41-19da-4b07-99e3-69cbbe05d5c6\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149025 4947 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb8263d-b49d-43b1-8133-88972776e3a1-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149047 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fb8263d-b49d-43b1-8133-88972776e3a1-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149073 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9372c287-0c89-4690-8d86-74825aa7e960-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149118 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-681e404a-f2e4-4c8b-9425-81ae857a9c34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-681e404a-f2e4-4c8b-9425-81ae857a9c34\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149145 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-18b877ac-f62a-4d5d-a54e-34dc6fe5b463\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-18b877ac-f62a-4d5d-a54e-34dc6fe5b463\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149173 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2l4l\" (UniqueName: 
\"kubernetes.io/projected/8fb8263d-b49d-43b1-8133-88972776e3a1-kube-api-access-z2l4l\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149198 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9pbl\" (UniqueName: \"kubernetes.io/projected/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-kube-api-access-t9pbl\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149227 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/621edb84-a88e-4aa5-9be0-1c50edbe0c80-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149251 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621edb84-a88e-4aa5-9be0-1c50edbe0c80-config\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149273 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/621edb84-a88e-4aa5-9be0-1c50edbe0c80-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149297 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621edb84-a88e-4aa5-9be0-1c50edbe0c80-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " 
pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149321 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb8263d-b49d-43b1-8133-88972776e3a1-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149346 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9372c287-0c89-4690-8d86-74825aa7e960-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149369 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9372c287-0c89-4690-8d86-74825aa7e960-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149406 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b879dcd0-711e-4c21-9557-ac17bda4567e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b879dcd0-711e-4c21-9557-ac17bda4567e\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149445 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9372c287-0c89-4690-8d86-74825aa7e960-config\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149473 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-config\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149515 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d478\" (UniqueName: \"kubernetes.io/projected/621edb84-a88e-4aa5-9be0-1c50edbe0c80-kube-api-access-5d478\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149539 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149567 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzhhx\" (UniqueName: \"kubernetes.io/projected/9372c287-0c89-4690-8d86-74825aa7e960-kube-api-access-hzhhx\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.149589 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.150761 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8fb8263d-b49d-43b1-8133-88972776e3a1-config\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.153729 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.155966 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.156191 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8fb8263d-b49d-43b1-8133-88972776e3a1-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.156555 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9372c287-0c89-4690-8d86-74825aa7e960-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.156792 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9372c287-0c89-4690-8d86-74825aa7e960-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.157100 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9372c287-0c89-4690-8d86-74825aa7e960-config\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.157888 4947 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.158070 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/621edb84-a88e-4aa5-9be0-1c50edbe0c80-config\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.157898 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/621edb84-a88e-4aa5-9be0-1c50edbe0c80-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.158507 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/621edb84-a88e-4aa5-9be0-1c50edbe0c80-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.159753 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.161820 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.161856 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-18b877ac-f62a-4d5d-a54e-34dc6fe5b463\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-18b877ac-f62a-4d5d-a54e-34dc6fe5b463\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/52756a404adf1e115dc51e2806d0f9234033d9ace935600518ef5a57dfd739d1/globalmount\"" pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.162022 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.162054 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b879dcd0-711e-4c21-9557-ac17bda4567e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b879dcd0-711e-4c21-9557-ac17bda4567e\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/37811d436b6f757ce4b2c256ba2becb3d0893cf0785a79de97ab5fdacb5f5fe0/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.162386 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb8263d-b49d-43b1-8133-88972776e3a1-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.163557 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9372c287-0c89-4690-8d86-74825aa7e960-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.170032 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb8263d-b49d-43b1-8133-88972776e3a1-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.177580 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.177753 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/621edb84-a88e-4aa5-9be0-1c50edbe0c80-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.178861 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.179794 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-54ecea41-19da-4b07-99e3-69cbbe05d5c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54ecea41-19da-4b07-99e3-69cbbe05d5c6\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e9760647cd03f47ed5cd300d9a8c9f4eada5b212c24556350b4b43047cdf9ceb/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.186033 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.187386 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d478\" (UniqueName: \"kubernetes.io/projected/621edb84-a88e-4aa5-9be0-1c50edbe0c80-kube-api-access-5d478\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.189134 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzhhx\" (UniqueName: \"kubernetes.io/projected/9372c287-0c89-4690-8d86-74825aa7e960-kube-api-access-hzhhx\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.191738 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2l4l\" (UniqueName: \"kubernetes.io/projected/8fb8263d-b49d-43b1-8133-88972776e3a1-kube-api-access-z2l4l\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.205252 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-b879dcd0-711e-4c21-9557-ac17bda4567e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b879dcd0-711e-4c21-9557-ac17bda4567e\") pod \"ovsdbserver-nb-0\" (UID: \"9372c287-0c89-4690-8d86-74825aa7e960\") " pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.209960 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-18b877ac-f62a-4d5d-a54e-34dc6fe5b463\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-18b877ac-f62a-4d5d-a54e-34dc6fe5b463\") pod \"ovsdbserver-nb-2\" (UID: \"621edb84-a88e-4aa5-9be0-1c50edbe0c80\") " pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.229103 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-54ecea41-19da-4b07-99e3-69cbbe05d5c6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-54ecea41-19da-4b07-99e3-69cbbe05d5c6\") pod \"ovsdbserver-nb-1\" (UID: \"8fb8263d-b49d-43b1-8133-88972776e3a1\") " pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.238561 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.252284 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9pbl\" (UniqueName: \"kubernetes.io/projected/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-kube-api-access-t9pbl\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.252382 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-config\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.252408 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.252430 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.252455 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.252532 4947 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-681e404a-f2e4-4c8b-9425-81ae857a9c34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-681e404a-f2e4-4c8b-9425-81ae857a9c34\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.253821 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.254385 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-config\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.257008 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.257094 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-681e404a-f2e4-4c8b-9425-81ae857a9c34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-681e404a-f2e4-4c8b-9425-81ae857a9c34\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1cfbd96a89135c6cb436137bdde48fc3b255a8e2a339cf46d905eb3c55288b4b/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.257565 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.265754 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.265972 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.269103 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9pbl\" (UniqueName: \"kubernetes.io/projected/b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf-kube-api-access-t9pbl\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.291559 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.302025 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-681e404a-f2e4-4c8b-9425-81ae857a9c34\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-681e404a-f2e4-4c8b-9425-81ae857a9c34\") pod \"ovsdbserver-sb-0\" (UID: \"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf\") " pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.353779 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b935af14-28eb-4ae6-b66a-c7405f1f55f5-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354067 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3c2ab8dc-1af4-4d33-912e-95a86e33ab3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c2ab8dc-1af4-4d33-912e-95a86e33ab3d\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354104 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fjtf\" (UniqueName: \"kubernetes.io/projected/b935af14-28eb-4ae6-b66a-c7405f1f55f5-kube-api-access-5fjtf\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354124 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23e1eaf-4865-4a29-b5db-00de006317dc-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: 
\"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354151 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b935af14-28eb-4ae6-b66a-c7405f1f55f5-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354188 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e23e1eaf-4865-4a29-b5db-00de006317dc-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354207 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c4qr\" (UniqueName: \"kubernetes.io/projected/e23e1eaf-4865-4a29-b5db-00de006317dc-kube-api-access-6c4qr\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354226 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b935af14-28eb-4ae6-b66a-c7405f1f55f5-config\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354252 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b935af14-28eb-4ae6-b66a-c7405f1f55f5-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 
crc kubenswrapper[4947]: I1203 08:57:26.354268 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6eafd4dc-9f21-4999-b190-a32c070360ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eafd4dc-9f21-4999-b190-a32c070360ed\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354284 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23e1eaf-4865-4a29-b5db-00de006317dc-config\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.354308 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e23e1eaf-4865-4a29-b5db-00de006317dc-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.447501 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.456483 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b935af14-28eb-4ae6-b66a-c7405f1f55f5-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461283 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b935af14-28eb-4ae6-b66a-c7405f1f55f5-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461423 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e23e1eaf-4865-4a29-b5db-00de006317dc-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461461 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c4qr\" (UniqueName: \"kubernetes.io/projected/e23e1eaf-4865-4a29-b5db-00de006317dc-kube-api-access-6c4qr\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461514 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b935af14-28eb-4ae6-b66a-c7405f1f55f5-config\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461566 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b935af14-28eb-4ae6-b66a-c7405f1f55f5-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461584 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6eafd4dc-9f21-4999-b190-a32c070360ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eafd4dc-9f21-4999-b190-a32c070360ed\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461604 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23e1eaf-4865-4a29-b5db-00de006317dc-config\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461647 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e23e1eaf-4865-4a29-b5db-00de006317dc-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461746 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b935af14-28eb-4ae6-b66a-c7405f1f55f5-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461770 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3c2ab8dc-1af4-4d33-912e-95a86e33ab3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c2ab8dc-1af4-4d33-912e-95a86e33ab3d\") pod 
\"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461817 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fjtf\" (UniqueName: \"kubernetes.io/projected/b935af14-28eb-4ae6-b66a-c7405f1f55f5-kube-api-access-5fjtf\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.461836 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23e1eaf-4865-4a29-b5db-00de006317dc-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.462117 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e23e1eaf-4865-4a29-b5db-00de006317dc-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.462303 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b935af14-28eb-4ae6-b66a-c7405f1f55f5-config\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.462697 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e23e1eaf-4865-4a29-b5db-00de006317dc-config\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.463237 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b935af14-28eb-4ae6-b66a-c7405f1f55f5-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.467736 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b935af14-28eb-4ae6-b66a-c7405f1f55f5-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.468085 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e23e1eaf-4865-4a29-b5db-00de006317dc-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.468400 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.468423 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3c2ab8dc-1af4-4d33-912e-95a86e33ab3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c2ab8dc-1af4-4d33-912e-95a86e33ab3d\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/831ebf6173e8d822f9c669587f9257fc9a774506d3266d8595e0a964969fe1e1/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.473797 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.473842 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6eafd4dc-9f21-4999-b190-a32c070360ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eafd4dc-9f21-4999-b190-a32c070360ed\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5ad2a6a62817322dbfa1d31f79e56cdf247f6b0de69059b46c1e9a960c752948/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.478340 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e23e1eaf-4865-4a29-b5db-00de006317dc-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2"
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.480996 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fjtf\" (UniqueName: \"kubernetes.io/projected/b935af14-28eb-4ae6-b66a-c7405f1f55f5-kube-api-access-5fjtf\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1"
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.483254 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c4qr\" (UniqueName: \"kubernetes.io/projected/e23e1eaf-4865-4a29-b5db-00de006317dc-kube-api-access-6c4qr\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2"
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.502434 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3c2ab8dc-1af4-4d33-912e-95a86e33ab3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3c2ab8dc-1af4-4d33-912e-95a86e33ab3d\") pod \"ovsdbserver-sb-1\" (UID: \"b935af14-28eb-4ae6-b66a-c7405f1f55f5\") " pod="openstack/ovsdbserver-sb-1"
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.514014 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6eafd4dc-9f21-4999-b190-a32c070360ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6eafd4dc-9f21-4999-b190-a32c070360ed\") pod \"ovsdbserver-sb-2\" (UID: \"e23e1eaf-4865-4a29-b5db-00de006317dc\") " pod="openstack/ovsdbserver-sb-2"
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.564974 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.689843 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.776103 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 03 08:57:26 crc kubenswrapper[4947]: W1203 08:57:26.862740 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fb8263d_b49d_43b1_8133_88972776e3a1.slice/crio-1414a5a34102f65bd47375c6ba35a4a67e06cdfeeba65f392fadf0da21101c35 WatchSource:0}: Error finding container 1414a5a34102f65bd47375c6ba35a4a67e06cdfeeba65f392fadf0da21101c35: Status 404 returned error can't find the container with id 1414a5a34102f65bd47375c6ba35a4a67e06cdfeeba65f392fadf0da21101c35
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.863717 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Dec 03 08:57:26 crc kubenswrapper[4947]: I1203 08:57:26.959825 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Dec 03 08:57:26 crc kubenswrapper[4947]: W1203 08:57:26.964677 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod621edb84_a88e_4aa5_9be0_1c50edbe0c80.slice/crio-85accf52a9d50779358e7cb4b363e969841b96696f76197274823a6b1ef67fc9 WatchSource:0}: Error finding container 85accf52a9d50779358e7cb4b363e969841b96696f76197274823a6b1ef67fc9: Status 404 returned error can't find the container with id 85accf52a9d50779358e7cb4b363e969841b96696f76197274823a6b1ef67fc9
Dec 03 08:57:27 crc kubenswrapper[4947]: I1203 08:57:27.074868 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Dec 03 08:57:27 crc kubenswrapper[4947]: W1203 08:57:27.082096 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb935af14_28eb_4ae6_b66a_c7405f1f55f5.slice/crio-bfb90be918a1cf0fdbf7b920a477b67e8cfb051c2a61ff1b5e54e5c83800760d WatchSource:0}: Error finding container bfb90be918a1cf0fdbf7b920a477b67e8cfb051c2a61ff1b5e54e5c83800760d: Status 404 returned error can't find the container with id bfb90be918a1cf0fdbf7b920a477b67e8cfb051c2a61ff1b5e54e5c83800760d
Dec 03 08:57:27 crc kubenswrapper[4947]: I1203 08:57:27.168894 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"8fb8263d-b49d-43b1-8133-88972776e3a1","Type":"ContainerStarted","Data":"1414a5a34102f65bd47375c6ba35a4a67e06cdfeeba65f392fadf0da21101c35"}
Dec 03 08:57:27 crc kubenswrapper[4947]: I1203 08:57:27.170225 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"621edb84-a88e-4aa5-9be0-1c50edbe0c80","Type":"ContainerStarted","Data":"85accf52a9d50779358e7cb4b363e969841b96696f76197274823a6b1ef67fc9"}
Dec 03 08:57:27 crc kubenswrapper[4947]: I1203 08:57:27.171184 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9372c287-0c89-4690-8d86-74825aa7e960","Type":"ContainerStarted","Data":"bc2e52554f2285a3ac928fc64639eaedc902c5278fb9ccb424bd0af28676e1eb"}
Dec 03 08:57:27 crc kubenswrapper[4947]: I1203 08:57:27.172380 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"b935af14-28eb-4ae6-b66a-c7405f1f55f5","Type":"ContainerStarted","Data":"bfb90be918a1cf0fdbf7b920a477b67e8cfb051c2a61ff1b5e54e5c83800760d"}
Dec 03 08:57:27 crc kubenswrapper[4947]: I1203 08:57:27.205522 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Dec 03 08:57:27 crc kubenswrapper[4947]: I1203 08:57:27.856675 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 03 08:57:28 crc kubenswrapper[4947]: I1203 08:57:28.187017 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf","Type":"ContainerStarted","Data":"ed25e6c0b0e9a703e3ed09e485a57632d19a7261ec1c52ffab518e56f130aeed"}
Dec 03 08:57:28 crc kubenswrapper[4947]: I1203 08:57:28.190030 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e23e1eaf-4865-4a29-b5db-00de006317dc","Type":"ContainerStarted","Data":"47337c445225a967808c1f2319f7d96eb242255f5c9425d285235b9556390866"}
Dec 03 08:57:30 crc kubenswrapper[4947]: I1203 08:57:30.086874 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 08:57:30 crc kubenswrapper[4947]: I1203 08:57:30.087177 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.224782 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e23e1eaf-4865-4a29-b5db-00de006317dc","Type":"ContainerStarted","Data":"30df59404e2bb0e9aba18d248d1a99615f17cc32bfbbea96c5dad9f46af1a8f7"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.225342 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"e23e1eaf-4865-4a29-b5db-00de006317dc","Type":"ContainerStarted","Data":"297352396b0b1c4464c86ceb8f01735c2e208c309e3116090a7a7f3c136a6cc5"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.226655 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9372c287-0c89-4690-8d86-74825aa7e960","Type":"ContainerStarted","Data":"12dc8d11a09ceafc1183436c53145f53af83e8faee6ca0af986a31d2496a334a"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.226688 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9372c287-0c89-4690-8d86-74825aa7e960","Type":"ContainerStarted","Data":"dddc7a9f7f7e9df0f906258bafda99e0b19d97c9f6bb631ee26722afe947ab5b"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.228940 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"b935af14-28eb-4ae6-b66a-c7405f1f55f5","Type":"ContainerStarted","Data":"6a42cb11b3dcd9cffaa141b753f21a460299ba6e5e9df5801bced8409c9ba15b"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.229004 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"b935af14-28eb-4ae6-b66a-c7405f1f55f5","Type":"ContainerStarted","Data":"13b4e3537ca9e2ef0e722e4af691482e593a304affe100da1bb5468cd4d0d47d"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.231858 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf","Type":"ContainerStarted","Data":"9206f2c7acbb6c5860a634ee59e24b0484c386e7e3ebcc799b34a59456dc3969"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.231920 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf","Type":"ContainerStarted","Data":"f6ff3bf9270cf1b7c1689365a222a53c859fbced09624bc79443f128eb7816c8"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.234547 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"8fb8263d-b49d-43b1-8133-88972776e3a1","Type":"ContainerStarted","Data":"37e70db30bfd875cdf4b19cf39c52f0b328117929b173fa9ae765f7be653f7c6"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.234594 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"8fb8263d-b49d-43b1-8133-88972776e3a1","Type":"ContainerStarted","Data":"64e144ce213497ad2deda9cc48ad8111fd42849050e3f8d9092b17b3790ea70b"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.237458 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"621edb84-a88e-4aa5-9be0-1c50edbe0c80","Type":"ContainerStarted","Data":"cc4acb7dab958c4b156a7472b2c1b683aaf6a84aab255ffa1c874a77f19a5201"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.237504 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"621edb84-a88e-4aa5-9be0-1c50edbe0c80","Type":"ContainerStarted","Data":"6fba90c337ff7c4716961cee5ba86b6889125135afbdbb62c217e5b4e02fa681"}
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.238875 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.246898 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.167062861 podStartE2EDuration="7.246877463s" podCreationTimestamp="2025-12-03 08:57:25 +0000 UTC" firstStartedPulling="2025-12-03 08:57:27.213643962 +0000 UTC m=+7708.474598388" lastFinishedPulling="2025-12-03 08:57:31.293458524 +0000 UTC m=+7712.554412990" observedRunningTime="2025-12-03 08:57:32.243886722 +0000 UTC m=+7713.504841148" watchObservedRunningTime="2025-12-03 08:57:32.246877463 +0000 UTC m=+7713.507831889"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.266130 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.267072 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.837173567 podStartE2EDuration="8.267048899s" podCreationTimestamp="2025-12-03 08:57:24 +0000 UTC" firstStartedPulling="2025-12-03 08:57:26.864315823 +0000 UTC m=+7708.125270249" lastFinishedPulling="2025-12-03 08:57:31.294191115 +0000 UTC m=+7712.555145581" observedRunningTime="2025-12-03 08:57:32.261783857 +0000 UTC m=+7713.522738293" watchObservedRunningTime="2025-12-03 08:57:32.267048899 +0000 UTC m=+7713.528003335"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.286560 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.96038982 podStartE2EDuration="8.286541186s" podCreationTimestamp="2025-12-03 08:57:24 +0000 UTC" firstStartedPulling="2025-12-03 08:57:26.967255107 +0000 UTC m=+7708.228209533" lastFinishedPulling="2025-12-03 08:57:31.293406473 +0000 UTC m=+7712.554360899" observedRunningTime="2025-12-03 08:57:32.281220532 +0000 UTC m=+7713.542174968" watchObservedRunningTime="2025-12-03 08:57:32.286541186 +0000 UTC m=+7713.547495612"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.292758 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.321793 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.807984588 podStartE2EDuration="8.321775839s" podCreationTimestamp="2025-12-03 08:57:24 +0000 UTC" firstStartedPulling="2025-12-03 08:57:26.782945773 +0000 UTC m=+7708.043900199" lastFinishedPulling="2025-12-03 08:57:31.296737024 +0000 UTC m=+7712.557691450" observedRunningTime="2025-12-03 08:57:32.321005098 +0000 UTC m=+7713.581959524" watchObservedRunningTime="2025-12-03 08:57:32.321775839 +0000 UTC m=+7713.582730265"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.325346 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.103179872 podStartE2EDuration="7.325330855s" podCreationTimestamp="2025-12-03 08:57:25 +0000 UTC" firstStartedPulling="2025-12-03 08:57:27.084895549 +0000 UTC m=+7708.345849975" lastFinishedPulling="2025-12-03 08:57:31.307046532 +0000 UTC m=+7712.568000958" observedRunningTime="2025-12-03 08:57:32.308214692 +0000 UTC m=+7713.569169118" watchObservedRunningTime="2025-12-03 08:57:32.325330855 +0000 UTC m=+7713.586285281"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.344285 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.924124008 podStartE2EDuration="7.344267598s" podCreationTimestamp="2025-12-03 08:57:25 +0000 UTC" firstStartedPulling="2025-12-03 08:57:27.874037235 +0000 UTC m=+7709.134991661" lastFinishedPulling="2025-12-03 08:57:31.294180825 +0000 UTC m=+7712.555135251" observedRunningTime="2025-12-03 08:57:32.342444368 +0000 UTC m=+7713.603398804" watchObservedRunningTime="2025-12-03 08:57:32.344267598 +0000 UTC m=+7713.605222024"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.448507 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.566000 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Dec 03 08:57:32 crc kubenswrapper[4947]: I1203 08:57:32.690699 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.282285 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.282882 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.308923 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.309429 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.332699 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.333126 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.486505 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.486947 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.600912 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.601518 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.727861 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Dec 03 08:57:35 crc kubenswrapper[4947]: I1203 08:57:35.728294 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.285342 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.321710 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.327361 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.327479 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.328646 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.332390 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.539279 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5566ccbc57-lxntc"]
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.540614 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.543174 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.548860 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5566ccbc57-lxntc"]
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.639441 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-ovsdbserver-nb\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.639606 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8pxk\" (UniqueName: \"kubernetes.io/projected/bfd3a102-4d76-4c9d-ad99-63deb530fb04-kube-api-access-d8pxk\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.639632 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-config\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.639701 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-dns-svc\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.724616 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5566ccbc57-lxntc"]
Dec 03 08:57:36 crc kubenswrapper[4947]: E1203 08:57:36.725278 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-d8pxk ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5566ccbc57-lxntc" podUID="bfd3a102-4d76-4c9d-ad99-63deb530fb04"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.741332 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-dns-svc\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.741429 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-ovsdbserver-nb\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.741502 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8pxk\" (UniqueName: \"kubernetes.io/projected/bfd3a102-4d76-4c9d-ad99-63deb530fb04-kube-api-access-d8pxk\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.741523 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-config\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.742400 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-config\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.743097 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-dns-svc\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.743815 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-ovsdbserver-nb\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.753834 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dcb87545c-zzp6p"]
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.755379 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.759876 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.771970 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dcb87545c-zzp6p"]
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.779695 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8pxk\" (UniqueName: \"kubernetes.io/projected/bfd3a102-4d76-4c9d-ad99-63deb530fb04-kube-api-access-d8pxk\") pod \"dnsmasq-dns-5566ccbc57-lxntc\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") " pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.843527 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvsv\" (UniqueName: \"kubernetes.io/projected/5e49849a-1b2b-42de-bd60-16253363c01e-kube-api-access-6dvsv\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.843624 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-dns-svc\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.843666 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-config\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.843874 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.844007 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.945149 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvsv\" (UniqueName: \"kubernetes.io/projected/5e49849a-1b2b-42de-bd60-16253363c01e-kube-api-access-6dvsv\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.945240 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-dns-svc\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.945273 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-config\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.945322 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.945364 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.946203 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.946286 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-dns-svc\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.946595 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.946631 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-config\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:36 crc kubenswrapper[4947]: I1203 08:57:36.966296 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dvsv\" (UniqueName: \"kubernetes.io/projected/5e49849a-1b2b-42de-bd60-16253363c01e-kube-api-access-6dvsv\") pod \"dnsmasq-dns-7dcb87545c-zzp6p\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.072892 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.287307 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.300250 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.354560 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-ovsdbserver-nb\") pod \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") "
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.354613 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-dns-svc\") pod \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") "
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.354735 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8pxk\" (UniqueName: \"kubernetes.io/projected/bfd3a102-4d76-4c9d-ad99-63deb530fb04-kube-api-access-d8pxk\") pod \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") "
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.354790 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-config\") pod \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\" (UID: \"bfd3a102-4d76-4c9d-ad99-63deb530fb04\") "
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.355458 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bfd3a102-4d76-4c9d-ad99-63deb530fb04" (UID: "bfd3a102-4d76-4c9d-ad99-63deb530fb04"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.355721 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bfd3a102-4d76-4c9d-ad99-63deb530fb04" (UID: "bfd3a102-4d76-4c9d-ad99-63deb530fb04"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.356117 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-config" (OuterVolumeSpecName: "config") pod "bfd3a102-4d76-4c9d-ad99-63deb530fb04" (UID: "bfd3a102-4d76-4c9d-ad99-63deb530fb04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.360397 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd3a102-4d76-4c9d-ad99-63deb530fb04-kube-api-access-d8pxk" (OuterVolumeSpecName: "kube-api-access-d8pxk") pod "bfd3a102-4d76-4c9d-ad99-63deb530fb04" (UID: "bfd3a102-4d76-4c9d-ad99-63deb530fb04"). InnerVolumeSpecName "kube-api-access-d8pxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.456727 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.456759 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.456769 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8pxk\" (UniqueName: \"kubernetes.io/projected/bfd3a102-4d76-4c9d-ad99-63deb530fb04-kube-api-access-d8pxk\") on node \"crc\" DevicePath \"\""
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.456779 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bfd3a102-4d76-4c9d-ad99-63deb530fb04-config\") on node \"crc\" DevicePath \"\""
Dec 03 08:57:37 crc kubenswrapper[4947]: I1203 08:57:37.494755 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dcb87545c-zzp6p"]
Dec 03 08:57:38 crc kubenswrapper[4947]: I1203 08:57:38.300777 4947 generic.go:334] "Generic (PLEG): container finished" podID="5e49849a-1b2b-42de-bd60-16253363c01e" containerID="338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832" exitCode=0
Dec 03 08:57:38 crc kubenswrapper[4947]: I1203 08:57:38.301047 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5566ccbc57-lxntc"
Dec 03 08:57:38 crc kubenswrapper[4947]: I1203 08:57:38.303784 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" event={"ID":"5e49849a-1b2b-42de-bd60-16253363c01e","Type":"ContainerDied","Data":"338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832"}
Dec 03 08:57:38 crc kubenswrapper[4947]: I1203 08:57:38.303858 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" event={"ID":"5e49849a-1b2b-42de-bd60-16253363c01e","Type":"ContainerStarted","Data":"ea30656cab73e77e0047b496e6392e7ee83245442ca7166985bf62fb00fbe019"}
Dec 03 08:57:38 crc kubenswrapper[4947]: I1203 08:57:38.513453 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5566ccbc57-lxntc"]
Dec 03 08:57:38 crc kubenswrapper[4947]: I1203 08:57:38.525152 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5566ccbc57-lxntc"]
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.097172 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd3a102-4d76-4c9d-ad99-63deb530fb04" path="/var/lib/kubelet/pods/bfd3a102-4d76-4c9d-ad99-63deb530fb04/volumes"
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.276890 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"]
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.278992 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.283874 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert"
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.301366 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.325476 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" event={"ID":"5e49849a-1b2b-42de-bd60-16253363c01e","Type":"ContainerStarted","Data":"ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3"}
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.325759 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p"
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.367295 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" podStartSLOduration=3.367268899 podStartE2EDuration="3.367268899s" podCreationTimestamp="2025-12-03 08:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:57:39.356290952 +0000 UTC m=+7720.617245378" watchObservedRunningTime="2025-12-03 08:57:39.367268899 +0000 UTC m=+7720.628223335"
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.388539 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a3a74721-2312-4c7a-aa51-3895521fb264-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " pod="openstack/ovn-copy-data"
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.388632 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dmfp\" (UniqueName: 
\"kubernetes.io/projected/a3a74721-2312-4c7a-aa51-3895521fb264-kube-api-access-2dmfp\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " pod="openstack/ovn-copy-data" Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.388707 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " pod="openstack/ovn-copy-data" Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.490711 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a3a74721-2312-4c7a-aa51-3895521fb264-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " pod="openstack/ovn-copy-data" Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.490801 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dmfp\" (UniqueName: \"kubernetes.io/projected/a3a74721-2312-4c7a-aa51-3895521fb264-kube-api-access-2dmfp\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " pod="openstack/ovn-copy-data" Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.491004 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " pod="openstack/ovn-copy-data" Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.493213 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.493256 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1e2d9187b2ad8635d8fa6a7e69a95e128ac2f98f6bd75925f88c02159dd09425/globalmount\"" pod="openstack/ovn-copy-data" Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.502754 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a3a74721-2312-4c7a-aa51-3895521fb264-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " pod="openstack/ovn-copy-data" Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.516349 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dmfp\" (UniqueName: \"kubernetes.io/projected/a3a74721-2312-4c7a-aa51-3895521fb264-kube-api-access-2dmfp\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " pod="openstack/ovn-copy-data" Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.539443 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\") pod \"ovn-copy-data\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " pod="openstack/ovn-copy-data" Dec 03 08:57:39 crc kubenswrapper[4947]: I1203 08:57:39.618824 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 03 08:57:40 crc kubenswrapper[4947]: I1203 08:57:40.116144 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 03 08:57:40 crc kubenswrapper[4947]: I1203 08:57:40.335622 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a3a74721-2312-4c7a-aa51-3895521fb264","Type":"ContainerStarted","Data":"ae3259ef96cd797ba24c8c7db315e0866c380c8a8a5d1e94ed7944097219261b"} Dec 03 08:57:41 crc kubenswrapper[4947]: I1203 08:57:41.350827 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a3a74721-2312-4c7a-aa51-3895521fb264","Type":"ContainerStarted","Data":"4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943"} Dec 03 08:57:41 crc kubenswrapper[4947]: I1203 08:57:41.368764 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.199958551 podStartE2EDuration="3.368745445s" podCreationTimestamp="2025-12-03 08:57:38 +0000 UTC" firstStartedPulling="2025-12-03 08:57:40.119584148 +0000 UTC m=+7721.380538574" lastFinishedPulling="2025-12-03 08:57:40.288371042 +0000 UTC m=+7721.549325468" observedRunningTime="2025-12-03 08:57:41.366436482 +0000 UTC m=+7722.627390928" watchObservedRunningTime="2025-12-03 08:57:41.368745445 +0000 UTC m=+7722.629699871" Dec 03 08:57:47 crc kubenswrapper[4947]: I1203 08:57:47.074767 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" Dec 03 08:57:47 crc kubenswrapper[4947]: I1203 08:57:47.150708 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-544b7dbbc5-76cn4"] Dec 03 08:57:47 crc kubenswrapper[4947]: I1203 08:57:47.151042 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" podUID="34af4296-adaf-4d08-a259-833f7696e897" 
containerName="dnsmasq-dns" containerID="cri-o://cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04" gracePeriod=10 Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.296312 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.406616 4947 generic.go:334] "Generic (PLEG): container finished" podID="34af4296-adaf-4d08-a259-833f7696e897" containerID="cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04" exitCode=0 Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.406678 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.406693 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" event={"ID":"34af4296-adaf-4d08-a259-833f7696e897","Type":"ContainerDied","Data":"cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04"} Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.407327 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-544b7dbbc5-76cn4" event={"ID":"34af4296-adaf-4d08-a259-833f7696e897","Type":"ContainerDied","Data":"20dcd4efbf67553ede0dd903138734befb965a5613797122283e1a3191caa805"} Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.407403 4947 scope.go:117] "RemoveContainer" containerID="cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04" Dec 03 08:57:48 crc kubenswrapper[4947]: E1203 08:57:48.425421 4947 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.196:37878->38.102.83.196:38979: write tcp 38.102.83.196:37878->38.102.83.196:38979: write: broken pipe Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.427560 4947 scope.go:117] "RemoveContainer" 
containerID="3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.438762 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-dns-svc\") pod \"34af4296-adaf-4d08-a259-833f7696e897\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.440746 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-config\") pod \"34af4296-adaf-4d08-a259-833f7696e897\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.440795 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kx7q\" (UniqueName: \"kubernetes.io/projected/34af4296-adaf-4d08-a259-833f7696e897-kube-api-access-4kx7q\") pod \"34af4296-adaf-4d08-a259-833f7696e897\" (UID: \"34af4296-adaf-4d08-a259-833f7696e897\") " Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.445300 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34af4296-adaf-4d08-a259-833f7696e897-kube-api-access-4kx7q" (OuterVolumeSpecName: "kube-api-access-4kx7q") pod "34af4296-adaf-4d08-a259-833f7696e897" (UID: "34af4296-adaf-4d08-a259-833f7696e897"). InnerVolumeSpecName "kube-api-access-4kx7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.452427 4947 scope.go:117] "RemoveContainer" containerID="cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04" Dec 03 08:57:48 crc kubenswrapper[4947]: E1203 08:57:48.452922 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04\": container with ID starting with cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04 not found: ID does not exist" containerID="cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.452962 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04"} err="failed to get container status \"cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04\": rpc error: code = NotFound desc = could not find container \"cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04\": container with ID starting with cced417981eea7a51a02e2c3074eacd6c4d0df4786f28a0c1379230a96d98b04 not found: ID does not exist" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.452986 4947 scope.go:117] "RemoveContainer" containerID="3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950" Dec 03 08:57:48 crc kubenswrapper[4947]: E1203 08:57:48.453336 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950\": container with ID starting with 3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950 not found: ID does not exist" containerID="3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.453361 
4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950"} err="failed to get container status \"3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950\": rpc error: code = NotFound desc = could not find container \"3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950\": container with ID starting with 3f945920fbf0b5572dbfdb5b03192b0b0dc8fb7b108a0eadecd3dbb0e45f7950 not found: ID does not exist" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.490607 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-config" (OuterVolumeSpecName: "config") pod "34af4296-adaf-4d08-a259-833f7696e897" (UID: "34af4296-adaf-4d08-a259-833f7696e897"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.492280 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34af4296-adaf-4d08-a259-833f7696e897" (UID: "34af4296-adaf-4d08-a259-833f7696e897"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.543297 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.543323 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34af4296-adaf-4d08-a259-833f7696e897-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.543333 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kx7q\" (UniqueName: \"kubernetes.io/projected/34af4296-adaf-4d08-a259-833f7696e897-kube-api-access-4kx7q\") on node \"crc\" DevicePath \"\"" Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.750350 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-544b7dbbc5-76cn4"] Dec 03 08:57:48 crc kubenswrapper[4947]: I1203 08:57:48.759639 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-544b7dbbc5-76cn4"] Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.092142 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34af4296-adaf-4d08-a259-833f7696e897" path="/var/lib/kubelet/pods/34af4296-adaf-4d08-a259-833f7696e897/volumes" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.238844 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 08:57:49 crc kubenswrapper[4947]: E1203 08:57:49.239174 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34af4296-adaf-4d08-a259-833f7696e897" containerName="init" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.239189 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="34af4296-adaf-4d08-a259-833f7696e897" containerName="init" Dec 03 08:57:49 crc kubenswrapper[4947]: E1203 
08:57:49.239213 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34af4296-adaf-4d08-a259-833f7696e897" containerName="dnsmasq-dns" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.239219 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="34af4296-adaf-4d08-a259-833f7696e897" containerName="dnsmasq-dns" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.239389 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="34af4296-adaf-4d08-a259-833f7696e897" containerName="dnsmasq-dns" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.240236 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.243432 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.243459 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.244959 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6hfqf" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.256731 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.356690 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/523c147f-a1b7-44cb-bd8b-3c917311b905-scripts\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.356741 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnhgp\" (UniqueName: 
\"kubernetes.io/projected/523c147f-a1b7-44cb-bd8b-3c917311b905-kube-api-access-gnhgp\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.356782 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/523c147f-a1b7-44cb-bd8b-3c917311b905-config\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.356803 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/523c147f-a1b7-44cb-bd8b-3c917311b905-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.356894 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523c147f-a1b7-44cb-bd8b-3c917311b905-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.458775 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523c147f-a1b7-44cb-bd8b-3c917311b905-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.458894 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/523c147f-a1b7-44cb-bd8b-3c917311b905-scripts\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " 
pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.458954 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnhgp\" (UniqueName: \"kubernetes.io/projected/523c147f-a1b7-44cb-bd8b-3c917311b905-kube-api-access-gnhgp\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.459050 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/523c147f-a1b7-44cb-bd8b-3c917311b905-config\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.459093 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/523c147f-a1b7-44cb-bd8b-3c917311b905-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.459824 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/523c147f-a1b7-44cb-bd8b-3c917311b905-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.460208 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/523c147f-a1b7-44cb-bd8b-3c917311b905-scripts\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.460315 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/523c147f-a1b7-44cb-bd8b-3c917311b905-config\") 
pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.465556 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/523c147f-a1b7-44cb-bd8b-3c917311b905-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.497732 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnhgp\" (UniqueName: \"kubernetes.io/projected/523c147f-a1b7-44cb-bd8b-3c917311b905-kube-api-access-gnhgp\") pod \"ovn-northd-0\" (UID: \"523c147f-a1b7-44cb-bd8b-3c917311b905\") " pod="openstack/ovn-northd-0" Dec 03 08:57:49 crc kubenswrapper[4947]: I1203 08:57:49.562415 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 08:57:50 crc kubenswrapper[4947]: I1203 08:57:50.007386 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 08:57:50 crc kubenswrapper[4947]: I1203 08:57:50.423209 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"523c147f-a1b7-44cb-bd8b-3c917311b905","Type":"ContainerStarted","Data":"4362d6be8968f0fcc5496560ef4f0d6b36da84d6f232792ed8d557c92e20ef29"} Dec 03 08:57:51 crc kubenswrapper[4947]: I1203 08:57:51.440843 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"523c147f-a1b7-44cb-bd8b-3c917311b905","Type":"ContainerStarted","Data":"8bb828ae78c3c55e16eedfbc151d5a53ccc71a301ec2a794b203c00f3ef73188"} Dec 03 08:57:51 crc kubenswrapper[4947]: I1203 08:57:51.441297 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"523c147f-a1b7-44cb-bd8b-3c917311b905","Type":"ContainerStarted","Data":"5dca926b7d31d0af81bb9a5ad4504f938e39a6b745a3b394223970f549e032e7"} Dec 03 08:57:51 crc kubenswrapper[4947]: I1203 08:57:51.442077 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 08:57:51 crc kubenswrapper[4947]: I1203 08:57:51.488984 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.8041958999999999 podStartE2EDuration="2.488955431s" podCreationTimestamp="2025-12-03 08:57:49 +0000 UTC" firstStartedPulling="2025-12-03 08:57:50.012692481 +0000 UTC m=+7731.273646907" lastFinishedPulling="2025-12-03 08:57:50.697452012 +0000 UTC m=+7731.958406438" observedRunningTime="2025-12-03 08:57:51.479112955 +0000 UTC m=+7732.740067371" watchObservedRunningTime="2025-12-03 08:57:51.488955431 +0000 UTC m=+7732.749909897" Dec 03 08:57:55 crc kubenswrapper[4947]: I1203 08:57:55.917751 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2608-account-create-update-478sx"] Dec 03 08:57:55 crc kubenswrapper[4947]: I1203 08:57:55.920814 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:55 crc kubenswrapper[4947]: I1203 08:57:55.923544 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 08:57:55 crc kubenswrapper[4947]: I1203 08:57:55.929347 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-6qs67"] Dec 03 08:57:55 crc kubenswrapper[4947]: I1203 08:57:55.930719 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-6qs67" Dec 03 08:57:55 crc kubenswrapper[4947]: I1203 08:57:55.943736 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6qs67"] Dec 03 08:57:55 crc kubenswrapper[4947]: I1203 08:57:55.952132 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2608-account-create-update-478sx"] Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.087888 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n77rw\" (UniqueName: \"kubernetes.io/projected/cb78927d-aa75-4748-810a-d7866f36be52-kube-api-access-n77rw\") pod \"keystone-db-create-6qs67\" (UID: \"cb78927d-aa75-4748-810a-d7866f36be52\") " pod="openstack/keystone-db-create-6qs67" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.087996 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6982b993-d2e1-4cba-bf74-0f3ca0366c62-operator-scripts\") pod \"keystone-2608-account-create-update-478sx\" (UID: \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\") " pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.088057 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78927d-aa75-4748-810a-d7866f36be52-operator-scripts\") pod \"keystone-db-create-6qs67\" (UID: \"cb78927d-aa75-4748-810a-d7866f36be52\") " pod="openstack/keystone-db-create-6qs67" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.088286 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pndx5\" (UniqueName: \"kubernetes.io/projected/6982b993-d2e1-4cba-bf74-0f3ca0366c62-kube-api-access-pndx5\") pod \"keystone-2608-account-create-update-478sx\" 
(UID: \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\") " pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.190580 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pndx5\" (UniqueName: \"kubernetes.io/projected/6982b993-d2e1-4cba-bf74-0f3ca0366c62-kube-api-access-pndx5\") pod \"keystone-2608-account-create-update-478sx\" (UID: \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\") " pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.190705 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n77rw\" (UniqueName: \"kubernetes.io/projected/cb78927d-aa75-4748-810a-d7866f36be52-kube-api-access-n77rw\") pod \"keystone-db-create-6qs67\" (UID: \"cb78927d-aa75-4748-810a-d7866f36be52\") " pod="openstack/keystone-db-create-6qs67" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.190825 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6982b993-d2e1-4cba-bf74-0f3ca0366c62-operator-scripts\") pod \"keystone-2608-account-create-update-478sx\" (UID: \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\") " pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.190877 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78927d-aa75-4748-810a-d7866f36be52-operator-scripts\") pod \"keystone-db-create-6qs67\" (UID: \"cb78927d-aa75-4748-810a-d7866f36be52\") " pod="openstack/keystone-db-create-6qs67" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.194345 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6982b993-d2e1-4cba-bf74-0f3ca0366c62-operator-scripts\") pod 
\"keystone-2608-account-create-update-478sx\" (UID: \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\") " pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.194576 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78927d-aa75-4748-810a-d7866f36be52-operator-scripts\") pod \"keystone-db-create-6qs67\" (UID: \"cb78927d-aa75-4748-810a-d7866f36be52\") " pod="openstack/keystone-db-create-6qs67" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.213386 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pndx5\" (UniqueName: \"kubernetes.io/projected/6982b993-d2e1-4cba-bf74-0f3ca0366c62-kube-api-access-pndx5\") pod \"keystone-2608-account-create-update-478sx\" (UID: \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\") " pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.214888 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n77rw\" (UniqueName: \"kubernetes.io/projected/cb78927d-aa75-4748-810a-d7866f36be52-kube-api-access-n77rw\") pod \"keystone-db-create-6qs67\" (UID: \"cb78927d-aa75-4748-810a-d7866f36be52\") " pod="openstack/keystone-db-create-6qs67" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.251948 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:56 crc kubenswrapper[4947]: I1203 08:57:56.261392 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-6qs67" Dec 03 08:57:57 crc kubenswrapper[4947]: I1203 08:57:56.742953 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6qs67"] Dec 03 08:57:57 crc kubenswrapper[4947]: I1203 08:57:56.806680 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2608-account-create-update-478sx"] Dec 03 08:57:57 crc kubenswrapper[4947]: W1203 08:57:56.839593 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6982b993_d2e1_4cba_bf74_0f3ca0366c62.slice/crio-3bfabcec663992d86ef1bafd8668f13a4e44f561718cf73e6f7c5286c4cad667 WatchSource:0}: Error finding container 3bfabcec663992d86ef1bafd8668f13a4e44f561718cf73e6f7c5286c4cad667: Status 404 returned error can't find the container with id 3bfabcec663992d86ef1bafd8668f13a4e44f561718cf73e6f7c5286c4cad667 Dec 03 08:57:57 crc kubenswrapper[4947]: I1203 08:57:57.495448 4947 generic.go:334] "Generic (PLEG): container finished" podID="cb78927d-aa75-4748-810a-d7866f36be52" containerID="8cdb3a42d8e864a9e5fe82396e8a4bd7af0fa5455af8950d1dcc63efc5d302bd" exitCode=0 Dec 03 08:57:57 crc kubenswrapper[4947]: I1203 08:57:57.495531 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6qs67" event={"ID":"cb78927d-aa75-4748-810a-d7866f36be52","Type":"ContainerDied","Data":"8cdb3a42d8e864a9e5fe82396e8a4bd7af0fa5455af8950d1dcc63efc5d302bd"} Dec 03 08:57:57 crc kubenswrapper[4947]: I1203 08:57:57.495798 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6qs67" event={"ID":"cb78927d-aa75-4748-810a-d7866f36be52","Type":"ContainerStarted","Data":"59bc28de3fa633ebc4737707e76e13b919792c94ec1392577c080330dede0fe4"} Dec 03 08:57:57 crc kubenswrapper[4947]: I1203 08:57:57.497522 4947 generic.go:334] "Generic (PLEG): container finished" podID="6982b993-d2e1-4cba-bf74-0f3ca0366c62" 
containerID="97b742b625c5f67d43a9de58f2b495fe557433fa21e4d4003c8c1a89c20f24da" exitCode=0 Dec 03 08:57:57 crc kubenswrapper[4947]: I1203 08:57:57.497551 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2608-account-create-update-478sx" event={"ID":"6982b993-d2e1-4cba-bf74-0f3ca0366c62","Type":"ContainerDied","Data":"97b742b625c5f67d43a9de58f2b495fe557433fa21e4d4003c8c1a89c20f24da"} Dec 03 08:57:57 crc kubenswrapper[4947]: I1203 08:57:57.497568 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2608-account-create-update-478sx" event={"ID":"6982b993-d2e1-4cba-bf74-0f3ca0366c62","Type":"ContainerStarted","Data":"3bfabcec663992d86ef1bafd8668f13a4e44f561718cf73e6f7c5286c4cad667"} Dec 03 08:57:58 crc kubenswrapper[4947]: I1203 08:57:58.901040 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:58 crc kubenswrapper[4947]: I1203 08:57:58.907251 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-6qs67" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.054953 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n77rw\" (UniqueName: \"kubernetes.io/projected/cb78927d-aa75-4748-810a-d7866f36be52-kube-api-access-n77rw\") pod \"cb78927d-aa75-4748-810a-d7866f36be52\" (UID: \"cb78927d-aa75-4748-810a-d7866f36be52\") " Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.055102 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78927d-aa75-4748-810a-d7866f36be52-operator-scripts\") pod \"cb78927d-aa75-4748-810a-d7866f36be52\" (UID: \"cb78927d-aa75-4748-810a-d7866f36be52\") " Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.055238 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pndx5\" (UniqueName: \"kubernetes.io/projected/6982b993-d2e1-4cba-bf74-0f3ca0366c62-kube-api-access-pndx5\") pod \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\" (UID: \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\") " Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.055325 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6982b993-d2e1-4cba-bf74-0f3ca0366c62-operator-scripts\") pod \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\" (UID: \"6982b993-d2e1-4cba-bf74-0f3ca0366c62\") " Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.056109 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6982b993-d2e1-4cba-bf74-0f3ca0366c62-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6982b993-d2e1-4cba-bf74-0f3ca0366c62" (UID: "6982b993-d2e1-4cba-bf74-0f3ca0366c62"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.056128 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb78927d-aa75-4748-810a-d7866f36be52-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb78927d-aa75-4748-810a-d7866f36be52" (UID: "cb78927d-aa75-4748-810a-d7866f36be52"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.061727 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb78927d-aa75-4748-810a-d7866f36be52-kube-api-access-n77rw" (OuterVolumeSpecName: "kube-api-access-n77rw") pod "cb78927d-aa75-4748-810a-d7866f36be52" (UID: "cb78927d-aa75-4748-810a-d7866f36be52"). InnerVolumeSpecName "kube-api-access-n77rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.062725 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6982b993-d2e1-4cba-bf74-0f3ca0366c62-kube-api-access-pndx5" (OuterVolumeSpecName: "kube-api-access-pndx5") pod "6982b993-d2e1-4cba-bf74-0f3ca0366c62" (UID: "6982b993-d2e1-4cba-bf74-0f3ca0366c62"). InnerVolumeSpecName "kube-api-access-pndx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.156998 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6982b993-d2e1-4cba-bf74-0f3ca0366c62-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.157044 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n77rw\" (UniqueName: \"kubernetes.io/projected/cb78927d-aa75-4748-810a-d7866f36be52-kube-api-access-n77rw\") on node \"crc\" DevicePath \"\"" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.157055 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb78927d-aa75-4748-810a-d7866f36be52-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.157065 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pndx5\" (UniqueName: \"kubernetes.io/projected/6982b993-d2e1-4cba-bf74-0f3ca0366c62-kube-api-access-pndx5\") on node \"crc\" DevicePath \"\"" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.520413 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2608-account-create-update-478sx" event={"ID":"6982b993-d2e1-4cba-bf74-0f3ca0366c62","Type":"ContainerDied","Data":"3bfabcec663992d86ef1bafd8668f13a4e44f561718cf73e6f7c5286c4cad667"} Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.520463 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bfabcec663992d86ef1bafd8668f13a4e44f561718cf73e6f7c5286c4cad667" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.520434 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-2608-account-create-update-478sx" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.524723 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6qs67" event={"ID":"cb78927d-aa75-4748-810a-d7866f36be52","Type":"ContainerDied","Data":"59bc28de3fa633ebc4737707e76e13b919792c94ec1392577c080330dede0fe4"} Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.524754 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59bc28de3fa633ebc4737707e76e13b919792c94ec1392577c080330dede0fe4" Dec 03 08:57:59 crc kubenswrapper[4947]: I1203 08:57:59.524791 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6qs67" Dec 03 08:58:00 crc kubenswrapper[4947]: I1203 08:58:00.086200 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:58:00 crc kubenswrapper[4947]: I1203 08:58:00.086570 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:58:00 crc kubenswrapper[4947]: I1203 08:58:00.086628 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 08:58:00 crc kubenswrapper[4947]: I1203 08:58:00.087331 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:58:00 crc kubenswrapper[4947]: I1203 08:58:00.087440 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" gracePeriod=600 Dec 03 08:58:00 crc kubenswrapper[4947]: E1203 08:58:00.214426 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:58:00 crc kubenswrapper[4947]: I1203 08:58:00.539229 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" exitCode=0 Dec 03 08:58:00 crc kubenswrapper[4947]: I1203 08:58:00.539276 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce"} Dec 03 08:58:00 crc kubenswrapper[4947]: I1203 08:58:00.539315 4947 scope.go:117] "RemoveContainer" containerID="775b6bc62996b75547fbefabe838c42ae4e96572661adec2381430b5b658aa8a" Dec 03 08:58:00 crc kubenswrapper[4947]: I1203 08:58:00.540110 4947 
scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 08:58:00 crc kubenswrapper[4947]: E1203 08:58:00.540452 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.525345 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-5mn24"] Dec 03 08:58:01 crc kubenswrapper[4947]: E1203 08:58:01.526062 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6982b993-d2e1-4cba-bf74-0f3ca0366c62" containerName="mariadb-account-create-update" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.526078 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6982b993-d2e1-4cba-bf74-0f3ca0366c62" containerName="mariadb-account-create-update" Dec 03 08:58:01 crc kubenswrapper[4947]: E1203 08:58:01.526118 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb78927d-aa75-4748-810a-d7866f36be52" containerName="mariadb-database-create" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.526127 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb78927d-aa75-4748-810a-d7866f36be52" containerName="mariadb-database-create" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.526333 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb78927d-aa75-4748-810a-d7866f36be52" containerName="mariadb-database-create" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.526356 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6982b993-d2e1-4cba-bf74-0f3ca0366c62" 
containerName="mariadb-account-create-update" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.527065 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.529339 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-t6xjq" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.530110 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.539911 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5mn24"] Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.540585 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.541113 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.709183 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9r26\" (UniqueName: \"kubernetes.io/projected/d528b7ff-54f5-4220-ad84-035b6dfc19ce-kube-api-access-s9r26\") pod \"keystone-db-sync-5mn24\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.709395 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-combined-ca-bundle\") pod \"keystone-db-sync-5mn24\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.709483 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-config-data\") pod \"keystone-db-sync-5mn24\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.811388 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9r26\" (UniqueName: \"kubernetes.io/projected/d528b7ff-54f5-4220-ad84-035b6dfc19ce-kube-api-access-s9r26\") pod \"keystone-db-sync-5mn24\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.811529 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-combined-ca-bundle\") pod \"keystone-db-sync-5mn24\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.811585 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-config-data\") pod \"keystone-db-sync-5mn24\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.817396 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-combined-ca-bundle\") pod \"keystone-db-sync-5mn24\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.818038 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-config-data\") 
pod \"keystone-db-sync-5mn24\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.831999 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9r26\" (UniqueName: \"kubernetes.io/projected/d528b7ff-54f5-4220-ad84-035b6dfc19ce-kube-api-access-s9r26\") pod \"keystone-db-sync-5mn24\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:01 crc kubenswrapper[4947]: I1203 08:58:01.882936 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:02 crc kubenswrapper[4947]: I1203 08:58:02.348093 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-5mn24"] Dec 03 08:58:02 crc kubenswrapper[4947]: W1203 08:58:02.356178 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd528b7ff_54f5_4220_ad84_035b6dfc19ce.slice/crio-3c51a08eac153e99aa5804ac5fe57b0ff5ad49b9efb311b8f55563abeeeb0d06 WatchSource:0}: Error finding container 3c51a08eac153e99aa5804ac5fe57b0ff5ad49b9efb311b8f55563abeeeb0d06: Status 404 returned error can't find the container with id 3c51a08eac153e99aa5804ac5fe57b0ff5ad49b9efb311b8f55563abeeeb0d06 Dec 03 08:58:02 crc kubenswrapper[4947]: I1203 08:58:02.561845 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5mn24" event={"ID":"d528b7ff-54f5-4220-ad84-035b6dfc19ce","Type":"ContainerStarted","Data":"3c51a08eac153e99aa5804ac5fe57b0ff5ad49b9efb311b8f55563abeeeb0d06"} Dec 03 08:58:04 crc kubenswrapper[4947]: I1203 08:58:04.638980 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 08:58:07 crc kubenswrapper[4947]: I1203 08:58:07.825959 4947 scope.go:117] "RemoveContainer" 
containerID="568dd128d3c5a85c8d8e040b77e4d60aeab273e0d3808024e643d0e831aae1fd" Dec 03 08:58:08 crc kubenswrapper[4947]: I1203 08:58:08.622649 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5mn24" event={"ID":"d528b7ff-54f5-4220-ad84-035b6dfc19ce","Type":"ContainerStarted","Data":"7a43acc78b9ca7b20f0c5f0fffe44f9d9b22480e88012dc6a043a6c344ca162c"} Dec 03 08:58:08 crc kubenswrapper[4947]: I1203 08:58:08.642162 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-5mn24" podStartSLOduration=2.425814325 podStartE2EDuration="7.642142738s" podCreationTimestamp="2025-12-03 08:58:01 +0000 UTC" firstStartedPulling="2025-12-03 08:58:02.359368359 +0000 UTC m=+7743.620322795" lastFinishedPulling="2025-12-03 08:58:07.575696762 +0000 UTC m=+7748.836651208" observedRunningTime="2025-12-03 08:58:08.637299967 +0000 UTC m=+7749.898254403" watchObservedRunningTime="2025-12-03 08:58:08.642142738 +0000 UTC m=+7749.903097184" Dec 03 08:58:09 crc kubenswrapper[4947]: I1203 08:58:09.631612 4947 generic.go:334] "Generic (PLEG): container finished" podID="d528b7ff-54f5-4220-ad84-035b6dfc19ce" containerID="7a43acc78b9ca7b20f0c5f0fffe44f9d9b22480e88012dc6a043a6c344ca162c" exitCode=0 Dec 03 08:58:09 crc kubenswrapper[4947]: I1203 08:58:09.631664 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5mn24" event={"ID":"d528b7ff-54f5-4220-ad84-035b6dfc19ce","Type":"ContainerDied","Data":"7a43acc78b9ca7b20f0c5f0fffe44f9d9b22480e88012dc6a043a6c344ca162c"} Dec 03 08:58:10 crc kubenswrapper[4947]: I1203 08:58:10.972725 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.097452 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-combined-ca-bundle\") pod \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.097538 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-config-data\") pod \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.097691 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9r26\" (UniqueName: \"kubernetes.io/projected/d528b7ff-54f5-4220-ad84-035b6dfc19ce-kube-api-access-s9r26\") pod \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\" (UID: \"d528b7ff-54f5-4220-ad84-035b6dfc19ce\") " Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.110505 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d528b7ff-54f5-4220-ad84-035b6dfc19ce-kube-api-access-s9r26" (OuterVolumeSpecName: "kube-api-access-s9r26") pod "d528b7ff-54f5-4220-ad84-035b6dfc19ce" (UID: "d528b7ff-54f5-4220-ad84-035b6dfc19ce"). InnerVolumeSpecName "kube-api-access-s9r26". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.129285 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d528b7ff-54f5-4220-ad84-035b6dfc19ce" (UID: "d528b7ff-54f5-4220-ad84-035b6dfc19ce"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.151737 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-config-data" (OuterVolumeSpecName: "config-data") pod "d528b7ff-54f5-4220-ad84-035b6dfc19ce" (UID: "d528b7ff-54f5-4220-ad84-035b6dfc19ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.199968 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9r26\" (UniqueName: \"kubernetes.io/projected/d528b7ff-54f5-4220-ad84-035b6dfc19ce-kube-api-access-s9r26\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.200005 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.200017 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d528b7ff-54f5-4220-ad84-035b6dfc19ce-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.648950 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-5mn24" event={"ID":"d528b7ff-54f5-4220-ad84-035b6dfc19ce","Type":"ContainerDied","Data":"3c51a08eac153e99aa5804ac5fe57b0ff5ad49b9efb311b8f55563abeeeb0d06"} Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.648993 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c51a08eac153e99aa5804ac5fe57b0ff5ad49b9efb311b8f55563abeeeb0d06" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.648991 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-5mn24" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.856890 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fs9jw"] Dec 03 08:58:11 crc kubenswrapper[4947]: E1203 08:58:11.857372 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d528b7ff-54f5-4220-ad84-035b6dfc19ce" containerName="keystone-db-sync" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.857430 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d528b7ff-54f5-4220-ad84-035b6dfc19ce" containerName="keystone-db-sync" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.857687 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d528b7ff-54f5-4220-ad84-035b6dfc19ce" containerName="keystone-db-sync" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.858273 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.867738 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.868131 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-t6xjq" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.868322 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.888709 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.889084 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.895930 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c49d858cc-lvjg7"] Dec 03 08:58:11 crc 
kubenswrapper[4947]: I1203 08:58:11.897830 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.902457 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fs9jw"] Dec 03 08:58:11 crc kubenswrapper[4947]: I1203 08:58:11.934245 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c49d858cc-lvjg7"] Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015384 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-dns-svc\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015447 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015512 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-fernet-keys\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015539 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-combined-ca-bundle\") pod \"keystone-bootstrap-fs9jw\" 
(UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015562 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wxfv\" (UniqueName: \"kubernetes.io/projected/856c04ab-49b2-45cf-ba63-d7d6eb947a52-kube-api-access-7wxfv\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015587 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-credential-keys\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015604 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-config-data\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015624 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015637 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-scripts\") pod \"keystone-bootstrap-fs9jw\" 
(UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015651 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjllr\" (UniqueName: \"kubernetes.io/projected/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-kube-api-access-vjllr\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.015697 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-config\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.117230 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-fernet-keys\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.117290 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-combined-ca-bundle\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.117329 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wxfv\" (UniqueName: \"kubernetes.io/projected/856c04ab-49b2-45cf-ba63-d7d6eb947a52-kube-api-access-7wxfv\") pod \"keystone-bootstrap-fs9jw\" (UID: 
\"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.117369 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-credential-keys\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.117393 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-config-data\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.118059 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.118160 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-scripts\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.118261 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjllr\" (UniqueName: \"kubernetes.io/projected/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-kube-api-access-vjllr\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 
08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.118421 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-config\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.118570 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-dns-svc\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.118735 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.119352 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-config\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.119419 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-dns-svc\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.119811 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.120910 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.123338 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-combined-ca-bundle\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.124472 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-fernet-keys\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.130166 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-config-data\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.131623 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-credential-keys\") pod 
\"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.132470 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-scripts\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.136889 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wxfv\" (UniqueName: \"kubernetes.io/projected/856c04ab-49b2-45cf-ba63-d7d6eb947a52-kube-api-access-7wxfv\") pod \"keystone-bootstrap-fs9jw\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.153943 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjllr\" (UniqueName: \"kubernetes.io/projected/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-kube-api-access-vjllr\") pod \"dnsmasq-dns-5c49d858cc-lvjg7\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.176441 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.239222 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.649970 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fs9jw"] Dec 03 08:58:12 crc kubenswrapper[4947]: W1203 08:58:12.665926 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod856c04ab_49b2_45cf_ba63_d7d6eb947a52.slice/crio-f4a005069843d543412625958a3b3950cf38e59c10a8fe5132a158dfe3abe6f3 WatchSource:0}: Error finding container f4a005069843d543412625958a3b3950cf38e59c10a8fe5132a158dfe3abe6f3: Status 404 returned error can't find the container with id f4a005069843d543412625958a3b3950cf38e59c10a8fe5132a158dfe3abe6f3 Dec 03 08:58:12 crc kubenswrapper[4947]: I1203 08:58:12.737360 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c49d858cc-lvjg7"] Dec 03 08:58:12 crc kubenswrapper[4947]: W1203 08:58:12.746308 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b3db51a_8016_47a1_b4e9_ba4e547b8bf3.slice/crio-222db27ffdbfab5296b883ed8c8e75da6fd9cb82615543b48d45c107b59e24a7 WatchSource:0}: Error finding container 222db27ffdbfab5296b883ed8c8e75da6fd9cb82615543b48d45c107b59e24a7: Status 404 returned error can't find the container with id 222db27ffdbfab5296b883ed8c8e75da6fd9cb82615543b48d45c107b59e24a7 Dec 03 08:58:13 crc kubenswrapper[4947]: I1203 08:58:13.668172 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fs9jw" event={"ID":"856c04ab-49b2-45cf-ba63-d7d6eb947a52","Type":"ContainerStarted","Data":"e748cd90bfa674d6a7540867fa744d3b98638934c547607bcc3c871e1ab835af"} Dec 03 08:58:13 crc kubenswrapper[4947]: I1203 08:58:13.668626 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fs9jw" 
event={"ID":"856c04ab-49b2-45cf-ba63-d7d6eb947a52","Type":"ContainerStarted","Data":"f4a005069843d543412625958a3b3950cf38e59c10a8fe5132a158dfe3abe6f3"} Dec 03 08:58:13 crc kubenswrapper[4947]: I1203 08:58:13.672718 4947 generic.go:334] "Generic (PLEG): container finished" podID="4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" containerID="ff478d6bef4fc2dd1f2c8a28f60e73fca1c09a9f497bb761693089dc0f972535" exitCode=0 Dec 03 08:58:13 crc kubenswrapper[4947]: I1203 08:58:13.672770 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" event={"ID":"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3","Type":"ContainerDied","Data":"ff478d6bef4fc2dd1f2c8a28f60e73fca1c09a9f497bb761693089dc0f972535"} Dec 03 08:58:13 crc kubenswrapper[4947]: I1203 08:58:13.672799 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" event={"ID":"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3","Type":"ContainerStarted","Data":"222db27ffdbfab5296b883ed8c8e75da6fd9cb82615543b48d45c107b59e24a7"} Dec 03 08:58:13 crc kubenswrapper[4947]: I1203 08:58:13.726911 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fs9jw" podStartSLOduration=2.726886172 podStartE2EDuration="2.726886172s" podCreationTimestamp="2025-12-03 08:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:58:13.69612218 +0000 UTC m=+7754.957076616" watchObservedRunningTime="2025-12-03 08:58:13.726886172 +0000 UTC m=+7754.987840608" Dec 03 08:58:14 crc kubenswrapper[4947]: I1203 08:58:14.683206 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" event={"ID":"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3","Type":"ContainerStarted","Data":"11d95e386845b420651b487d2666f9517f9496f412b8269d0d676df18836f801"} Dec 03 08:58:14 crc kubenswrapper[4947]: I1203 08:58:14.684157 4947 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:14 crc kubenswrapper[4947]: I1203 08:58:14.717609 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" podStartSLOduration=3.717588229 podStartE2EDuration="3.717588229s" podCreationTimestamp="2025-12-03 08:58:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:58:14.702258044 +0000 UTC m=+7755.963212510" watchObservedRunningTime="2025-12-03 08:58:14.717588229 +0000 UTC m=+7755.978542655" Dec 03 08:58:15 crc kubenswrapper[4947]: I1203 08:58:15.083576 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 08:58:15 crc kubenswrapper[4947]: E1203 08:58:15.083857 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:58:16 crc kubenswrapper[4947]: I1203 08:58:16.696528 4947 generic.go:334] "Generic (PLEG): container finished" podID="856c04ab-49b2-45cf-ba63-d7d6eb947a52" containerID="e748cd90bfa674d6a7540867fa744d3b98638934c547607bcc3c871e1ab835af" exitCode=0 Dec 03 08:58:16 crc kubenswrapper[4947]: I1203 08:58:16.696582 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fs9jw" event={"ID":"856c04ab-49b2-45cf-ba63-d7d6eb947a52","Type":"ContainerDied","Data":"e748cd90bfa674d6a7540867fa744d3b98638934c547607bcc3c871e1ab835af"} Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.160710 4947 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.235687 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-combined-ca-bundle\") pod \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.235846 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wxfv\" (UniqueName: \"kubernetes.io/projected/856c04ab-49b2-45cf-ba63-d7d6eb947a52-kube-api-access-7wxfv\") pod \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.235971 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-credential-keys\") pod \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.236013 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-config-data\") pod \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.236051 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-fernet-keys\") pod \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.236161 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-scripts\") pod \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\" (UID: \"856c04ab-49b2-45cf-ba63-d7d6eb947a52\") " Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.242127 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-scripts" (OuterVolumeSpecName: "scripts") pod "856c04ab-49b2-45cf-ba63-d7d6eb947a52" (UID: "856c04ab-49b2-45cf-ba63-d7d6eb947a52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.242826 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "856c04ab-49b2-45cf-ba63-d7d6eb947a52" (UID: "856c04ab-49b2-45cf-ba63-d7d6eb947a52"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.247088 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "856c04ab-49b2-45cf-ba63-d7d6eb947a52" (UID: "856c04ab-49b2-45cf-ba63-d7d6eb947a52"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.247842 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856c04ab-49b2-45cf-ba63-d7d6eb947a52-kube-api-access-7wxfv" (OuterVolumeSpecName: "kube-api-access-7wxfv") pod "856c04ab-49b2-45cf-ba63-d7d6eb947a52" (UID: "856c04ab-49b2-45cf-ba63-d7d6eb947a52"). InnerVolumeSpecName "kube-api-access-7wxfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.262824 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "856c04ab-49b2-45cf-ba63-d7d6eb947a52" (UID: "856c04ab-49b2-45cf-ba63-d7d6eb947a52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.277377 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-config-data" (OuterVolumeSpecName: "config-data") pod "856c04ab-49b2-45cf-ba63-d7d6eb947a52" (UID: "856c04ab-49b2-45cf-ba63-d7d6eb947a52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.338645 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.339000 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wxfv\" (UniqueName: \"kubernetes.io/projected/856c04ab-49b2-45cf-ba63-d7d6eb947a52-kube-api-access-7wxfv\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.339055 4947 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.339069 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.339080 4947 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.339093 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/856c04ab-49b2-45cf-ba63-d7d6eb947a52-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.718883 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fs9jw" event={"ID":"856c04ab-49b2-45cf-ba63-d7d6eb947a52","Type":"ContainerDied","Data":"f4a005069843d543412625958a3b3950cf38e59c10a8fe5132a158dfe3abe6f3"} Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.718919 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4a005069843d543412625958a3b3950cf38e59c10a8fe5132a158dfe3abe6f3" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.719318 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-fs9jw" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.826160 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fs9jw"] Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.834657 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fs9jw"] Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.903388 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-swh7v"] Dec 03 08:58:18 crc kubenswrapper[4947]: E1203 08:58:18.904940 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856c04ab-49b2-45cf-ba63-d7d6eb947a52" containerName="keystone-bootstrap" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.904986 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="856c04ab-49b2-45cf-ba63-d7d6eb947a52" containerName="keystone-bootstrap" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.913411 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="856c04ab-49b2-45cf-ba63-d7d6eb947a52" containerName="keystone-bootstrap" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.916929 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-swh7v"] Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.917048 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.926952 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.926990 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.927206 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.927269 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-t6xjq" Dec 03 08:58:18 crc kubenswrapper[4947]: I1203 08:58:18.935759 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.053874 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-fernet-keys\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.054124 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djkzs\" (UniqueName: \"kubernetes.io/projected/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-kube-api-access-djkzs\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.054185 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-config-data\") pod \"keystone-bootstrap-swh7v\" (UID: 
\"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.054254 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-credential-keys\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.054324 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-scripts\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.054384 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-combined-ca-bundle\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.095459 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="856c04ab-49b2-45cf-ba63-d7d6eb947a52" path="/var/lib/kubelet/pods/856c04ab-49b2-45cf-ba63-d7d6eb947a52/volumes" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.155567 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-combined-ca-bundle\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.155629 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-fernet-keys\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.155680 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djkzs\" (UniqueName: \"kubernetes.io/projected/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-kube-api-access-djkzs\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.155696 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-config-data\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.155715 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-credential-keys\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.155770 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-scripts\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.161950 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-credential-keys\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.162331 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-config-data\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.162656 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-combined-ca-bundle\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.162802 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-scripts\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.162928 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-fernet-keys\") pod \"keystone-bootstrap-swh7v\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.178124 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djkzs\" (UniqueName: \"kubernetes.io/projected/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-kube-api-access-djkzs\") pod \"keystone-bootstrap-swh7v\" (UID: 
\"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.257474 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:19 crc kubenswrapper[4947]: I1203 08:58:19.748520 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-swh7v"] Dec 03 08:58:20 crc kubenswrapper[4947]: I1203 08:58:20.744641 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-swh7v" event={"ID":"42bb363a-5b68-4d1d-8ca6-dbaa2592da72","Type":"ContainerStarted","Data":"fe60c4bc59b5523f41907aebb468432dd68a5b3ca714711f3fe9b20206b7095f"} Dec 03 08:58:20 crc kubenswrapper[4947]: I1203 08:58:20.744945 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-swh7v" event={"ID":"42bb363a-5b68-4d1d-8ca6-dbaa2592da72","Type":"ContainerStarted","Data":"3a4f0897a584ae01b866456c6221fa678edb8277b9dc5cc4a5355f2e08247e71"} Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.241759 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.268750 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-swh7v" podStartSLOduration=4.268732215 podStartE2EDuration="4.268732215s" podCreationTimestamp="2025-12-03 08:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:58:20.759865923 +0000 UTC m=+7762.020820359" watchObservedRunningTime="2025-12-03 08:58:22.268732215 +0000 UTC m=+7763.529686641" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.323314 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dcb87545c-zzp6p"] Dec 03 08:58:22 crc kubenswrapper[4947]: 
I1203 08:58:22.323772 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" podUID="5e49849a-1b2b-42de-bd60-16253363c01e" containerName="dnsmasq-dns" containerID="cri-o://ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3" gracePeriod=10 Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.752053 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.781587 4947 generic.go:334] "Generic (PLEG): container finished" podID="5e49849a-1b2b-42de-bd60-16253363c01e" containerID="ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3" exitCode=0 Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.781678 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" event={"ID":"5e49849a-1b2b-42de-bd60-16253363c01e","Type":"ContainerDied","Data":"ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3"} Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.781710 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" event={"ID":"5e49849a-1b2b-42de-bd60-16253363c01e","Type":"ContainerDied","Data":"ea30656cab73e77e0047b496e6392e7ee83245442ca7166985bf62fb00fbe019"} Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.781741 4947 scope.go:117] "RemoveContainer" containerID="ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.781939 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7dcb87545c-zzp6p" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.824002 4947 scope.go:117] "RemoveContainer" containerID="338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.830166 4947 generic.go:334] "Generic (PLEG): container finished" podID="42bb363a-5b68-4d1d-8ca6-dbaa2592da72" containerID="fe60c4bc59b5523f41907aebb468432dd68a5b3ca714711f3fe9b20206b7095f" exitCode=0 Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.830413 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-swh7v" event={"ID":"42bb363a-5b68-4d1d-8ca6-dbaa2592da72","Type":"ContainerDied","Data":"fe60c4bc59b5523f41907aebb468432dd68a5b3ca714711f3fe9b20206b7095f"} Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.843998 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-dns-svc\") pod \"5e49849a-1b2b-42de-bd60-16253363c01e\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.850673 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-sb\") pod \"5e49849a-1b2b-42de-bd60-16253363c01e\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.850720 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dvsv\" (UniqueName: \"kubernetes.io/projected/5e49849a-1b2b-42de-bd60-16253363c01e-kube-api-access-6dvsv\") pod \"5e49849a-1b2b-42de-bd60-16253363c01e\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.850790 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-config\") pod \"5e49849a-1b2b-42de-bd60-16253363c01e\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.850936 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-nb\") pod \"5e49849a-1b2b-42de-bd60-16253363c01e\" (UID: \"5e49849a-1b2b-42de-bd60-16253363c01e\") " Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.899631 4947 scope.go:117] "RemoveContainer" containerID="ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3" Dec 03 08:58:22 crc kubenswrapper[4947]: E1203 08:58:22.902868 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3\": container with ID starting with ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3 not found: ID does not exist" containerID="ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.902964 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3"} err="failed to get container status \"ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3\": rpc error: code = NotFound desc = could not find container \"ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3\": container with ID starting with ba0e60c52084d8894006de81c82444b31c942dfd3cbc9b7a800f8a67187556a3 not found: ID does not exist" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.902987 4947 scope.go:117] "RemoveContainer" 
containerID="338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832" Dec 03 08:58:22 crc kubenswrapper[4947]: E1203 08:58:22.905965 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832\": container with ID starting with 338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832 not found: ID does not exist" containerID="338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.906025 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832"} err="failed to get container status \"338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832\": rpc error: code = NotFound desc = could not find container \"338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832\": container with ID starting with 338c069c38effe57991e7aa8dcdc21e4502a417170bd007ca26cdfd5d29c2832 not found: ID does not exist" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.908902 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e49849a-1b2b-42de-bd60-16253363c01e-kube-api-access-6dvsv" (OuterVolumeSpecName: "kube-api-access-6dvsv") pod "5e49849a-1b2b-42de-bd60-16253363c01e" (UID: "5e49849a-1b2b-42de-bd60-16253363c01e"). InnerVolumeSpecName "kube-api-access-6dvsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.953210 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dvsv\" (UniqueName: \"kubernetes.io/projected/5e49849a-1b2b-42de-bd60-16253363c01e-kube-api-access-6dvsv\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.955346 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e49849a-1b2b-42de-bd60-16253363c01e" (UID: "5e49849a-1b2b-42de-bd60-16253363c01e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.960048 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e49849a-1b2b-42de-bd60-16253363c01e" (UID: "5e49849a-1b2b-42de-bd60-16253363c01e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.967254 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e49849a-1b2b-42de-bd60-16253363c01e" (UID: "5e49849a-1b2b-42de-bd60-16253363c01e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4947]: I1203 08:58:22.980150 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-config" (OuterVolumeSpecName: "config") pod "5e49849a-1b2b-42de-bd60-16253363c01e" (UID: "5e49849a-1b2b-42de-bd60-16253363c01e"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:58:23 crc kubenswrapper[4947]: I1203 08:58:23.054354 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:23 crc kubenswrapper[4947]: I1203 08:58:23.054602 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:23 crc kubenswrapper[4947]: I1203 08:58:23.054661 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:23 crc kubenswrapper[4947]: I1203 08:58:23.054709 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e49849a-1b2b-42de-bd60-16253363c01e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:23 crc kubenswrapper[4947]: I1203 08:58:23.126327 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dcb87545c-zzp6p"] Dec 03 08:58:23 crc kubenswrapper[4947]: I1203 08:58:23.137313 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dcb87545c-zzp6p"] Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.242019 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.379179 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djkzs\" (UniqueName: \"kubernetes.io/projected/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-kube-api-access-djkzs\") pod \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.379307 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-credential-keys\") pod \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.379358 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-config-data\") pod \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.379405 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-combined-ca-bundle\") pod \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.379473 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-fernet-keys\") pod \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.379560 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-scripts\") pod \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\" (UID: \"42bb363a-5b68-4d1d-8ca6-dbaa2592da72\") " Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.384763 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-scripts" (OuterVolumeSpecName: "scripts") pod "42bb363a-5b68-4d1d-8ca6-dbaa2592da72" (UID: "42bb363a-5b68-4d1d-8ca6-dbaa2592da72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.385343 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-kube-api-access-djkzs" (OuterVolumeSpecName: "kube-api-access-djkzs") pod "42bb363a-5b68-4d1d-8ca6-dbaa2592da72" (UID: "42bb363a-5b68-4d1d-8ca6-dbaa2592da72"). InnerVolumeSpecName "kube-api-access-djkzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.388005 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "42bb363a-5b68-4d1d-8ca6-dbaa2592da72" (UID: "42bb363a-5b68-4d1d-8ca6-dbaa2592da72"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.390111 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "42bb363a-5b68-4d1d-8ca6-dbaa2592da72" (UID: "42bb363a-5b68-4d1d-8ca6-dbaa2592da72"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.404241 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42bb363a-5b68-4d1d-8ca6-dbaa2592da72" (UID: "42bb363a-5b68-4d1d-8ca6-dbaa2592da72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.412700 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-config-data" (OuterVolumeSpecName: "config-data") pod "42bb363a-5b68-4d1d-8ca6-dbaa2592da72" (UID: "42bb363a-5b68-4d1d-8ca6-dbaa2592da72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.480968 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.481004 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djkzs\" (UniqueName: \"kubernetes.io/projected/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-kube-api-access-djkzs\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.481018 4947 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.481027 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 
08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.481037 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.481044 4947 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42bb363a-5b68-4d1d-8ca6-dbaa2592da72-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.854761 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-swh7v" event={"ID":"42bb363a-5b68-4d1d-8ca6-dbaa2592da72","Type":"ContainerDied","Data":"3a4f0897a584ae01b866456c6221fa678edb8277b9dc5cc4a5355f2e08247e71"} Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.855087 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a4f0897a584ae01b866456c6221fa678edb8277b9dc5cc4a5355f2e08247e71" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.854818 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-swh7v" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.992467 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7f49ddfb6d-86f4p"] Dec 03 08:58:24 crc kubenswrapper[4947]: E1203 08:58:24.993017 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e49849a-1b2b-42de-bd60-16253363c01e" containerName="init" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.993050 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e49849a-1b2b-42de-bd60-16253363c01e" containerName="init" Dec 03 08:58:24 crc kubenswrapper[4947]: E1203 08:58:24.993089 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e49849a-1b2b-42de-bd60-16253363c01e" containerName="dnsmasq-dns" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.993101 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e49849a-1b2b-42de-bd60-16253363c01e" containerName="dnsmasq-dns" Dec 03 08:58:24 crc kubenswrapper[4947]: E1203 08:58:24.993142 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42bb363a-5b68-4d1d-8ca6-dbaa2592da72" containerName="keystone-bootstrap" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.993156 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="42bb363a-5b68-4d1d-8ca6-dbaa2592da72" containerName="keystone-bootstrap" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.993429 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e49849a-1b2b-42de-bd60-16253363c01e" containerName="dnsmasq-dns" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.993475 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="42bb363a-5b68-4d1d-8ca6-dbaa2592da72" containerName="keystone-bootstrap" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.994466 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:24 crc kubenswrapper[4947]: I1203 08:58:24.997249 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.002125 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f49ddfb6d-86f4p"] Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.011051 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-t6xjq" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.011296 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.011543 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.093475 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp2zx\" (UniqueName: \"kubernetes.io/projected/4261d958-be0d-4748-801a-6a08686d5e46-kube-api-access-zp2zx\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.093726 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-scripts\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.093864 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-config-data\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: 
\"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.093951 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-combined-ca-bundle\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.094068 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-fernet-keys\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.094202 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-credential-keys\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.096936 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e49849a-1b2b-42de-bd60-16253363c01e" path="/var/lib/kubelet/pods/5e49849a-1b2b-42de-bd60-16253363c01e/volumes" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.195608 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-fernet-keys\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.195724 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-credential-keys\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.195849 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp2zx\" (UniqueName: \"kubernetes.io/projected/4261d958-be0d-4748-801a-6a08686d5e46-kube-api-access-zp2zx\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.195873 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-scripts\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.195904 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-config-data\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.195924 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-combined-ca-bundle\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.205179 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-credential-keys\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.206191 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-combined-ca-bundle\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.208877 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-scripts\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.208951 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-config-data\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.209470 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4261d958-be0d-4748-801a-6a08686d5e46-fernet-keys\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: \"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.222061 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp2zx\" (UniqueName: \"kubernetes.io/projected/4261d958-be0d-4748-801a-6a08686d5e46-kube-api-access-zp2zx\") pod \"keystone-7f49ddfb6d-86f4p\" (UID: 
\"4261d958-be0d-4748-801a-6a08686d5e46\") " pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.321636 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.789077 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7f49ddfb6d-86f4p"] Dec 03 08:58:25 crc kubenswrapper[4947]: I1203 08:58:25.868067 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f49ddfb6d-86f4p" event={"ID":"4261d958-be0d-4748-801a-6a08686d5e46","Type":"ContainerStarted","Data":"c100015502f07904669ba15f1c1ad4920f38adcb31f43b9f0bc80ae82e610ecb"} Dec 03 08:58:26 crc kubenswrapper[4947]: I1203 08:58:26.875455 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7f49ddfb6d-86f4p" event={"ID":"4261d958-be0d-4748-801a-6a08686d5e46","Type":"ContainerStarted","Data":"ab3af65663d2507348fb72ce14ed047ff3b49a71a81b6cea914d1cb03b020dd6"} Dec 03 08:58:26 crc kubenswrapper[4947]: I1203 08:58:26.875818 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:26 crc kubenswrapper[4947]: I1203 08:58:26.903556 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7f49ddfb6d-86f4p" podStartSLOduration=2.90353474 podStartE2EDuration="2.90353474s" podCreationTimestamp="2025-12-03 08:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:58:26.897171248 +0000 UTC m=+7768.158125694" watchObservedRunningTime="2025-12-03 08:58:26.90353474 +0000 UTC m=+7768.164489166" Dec 03 08:58:28 crc kubenswrapper[4947]: I1203 08:58:28.083185 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 08:58:28 crc 
kubenswrapper[4947]: E1203 08:58:28.083757 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:58:43 crc kubenswrapper[4947]: I1203 08:58:43.082884 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 08:58:43 crc kubenswrapper[4947]: E1203 08:58:43.084313 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:58:56 crc kubenswrapper[4947]: I1203 08:58:56.963726 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7f49ddfb6d-86f4p" Dec 03 08:58:58 crc kubenswrapper[4947]: I1203 08:58:58.083611 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 08:58:58 crc kubenswrapper[4947]: E1203 08:58:58.084143 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 
03 08:59:00 crc kubenswrapper[4947]: I1203 08:59:00.842705 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 08:59:00 crc kubenswrapper[4947]: I1203 08:59:00.845307 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 08:59:00 crc kubenswrapper[4947]: I1203 08:59:00.848540 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 08:59:00 crc kubenswrapper[4947]: I1203 08:59:00.848699 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 08:59:00 crc kubenswrapper[4947]: I1203 08:59:00.848907 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-2wnvd" Dec 03 08:59:00 crc kubenswrapper[4947]: I1203 08:59:00.856257 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 08:59:00 crc kubenswrapper[4947]: I1203 08:59:00.913930 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4331d824-805d-455d-af0a-585746ab0364-openstack-config-secret\") pod \"openstackclient\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " pod="openstack/openstackclient" Dec 03 08:59:00 crc kubenswrapper[4947]: I1203 08:59:00.914265 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4331d824-805d-455d-af0a-585746ab0364-openstack-config\") pod \"openstackclient\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " pod="openstack/openstackclient" Dec 03 08:59:00 crc kubenswrapper[4947]: I1203 08:59:00.914372 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfj7t\" (UniqueName: 
\"kubernetes.io/projected/4331d824-805d-455d-af0a-585746ab0364-kube-api-access-dfj7t\") pod \"openstackclient\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " pod="openstack/openstackclient" Dec 03 08:59:01 crc kubenswrapper[4947]: I1203 08:59:01.016680 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4331d824-805d-455d-af0a-585746ab0364-openstack-config-secret\") pod \"openstackclient\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " pod="openstack/openstackclient" Dec 03 08:59:01 crc kubenswrapper[4947]: I1203 08:59:01.016740 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4331d824-805d-455d-af0a-585746ab0364-openstack-config\") pod \"openstackclient\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " pod="openstack/openstackclient" Dec 03 08:59:01 crc kubenswrapper[4947]: I1203 08:59:01.016864 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfj7t\" (UniqueName: \"kubernetes.io/projected/4331d824-805d-455d-af0a-585746ab0364-kube-api-access-dfj7t\") pod \"openstackclient\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " pod="openstack/openstackclient" Dec 03 08:59:01 crc kubenswrapper[4947]: I1203 08:59:01.017821 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4331d824-805d-455d-af0a-585746ab0364-openstack-config\") pod \"openstackclient\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " pod="openstack/openstackclient" Dec 03 08:59:01 crc kubenswrapper[4947]: I1203 08:59:01.024889 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4331d824-805d-455d-af0a-585746ab0364-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"4331d824-805d-455d-af0a-585746ab0364\") " pod="openstack/openstackclient" Dec 03 08:59:01 crc kubenswrapper[4947]: I1203 08:59:01.035150 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfj7t\" (UniqueName: \"kubernetes.io/projected/4331d824-805d-455d-af0a-585746ab0364-kube-api-access-dfj7t\") pod \"openstackclient\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " pod="openstack/openstackclient" Dec 03 08:59:01 crc kubenswrapper[4947]: I1203 08:59:01.178525 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 08:59:01 crc kubenswrapper[4947]: W1203 08:59:01.812463 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4331d824_805d_455d_af0a_585746ab0364.slice/crio-177dd0d564805cfa9137e50cf178a93d5887fe28d383968c40a2e1d2e258d80a WatchSource:0}: Error finding container 177dd0d564805cfa9137e50cf178a93d5887fe28d383968c40a2e1d2e258d80a: Status 404 returned error can't find the container with id 177dd0d564805cfa9137e50cf178a93d5887fe28d383968c40a2e1d2e258d80a Dec 03 08:59:01 crc kubenswrapper[4947]: I1203 08:59:01.821269 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 08:59:02 crc kubenswrapper[4947]: I1203 08:59:02.194714 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4331d824-805d-455d-af0a-585746ab0364","Type":"ContainerStarted","Data":"177dd0d564805cfa9137e50cf178a93d5887fe28d383968c40a2e1d2e258d80a"} Dec 03 08:59:12 crc kubenswrapper[4947]: I1203 08:59:12.083248 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 08:59:12 crc kubenswrapper[4947]: E1203 08:59:12.083984 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:59:13 crc kubenswrapper[4947]: I1203 08:59:13.293944 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4331d824-805d-455d-af0a-585746ab0364","Type":"ContainerStarted","Data":"19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf"} Dec 03 08:59:13 crc kubenswrapper[4947]: I1203 08:59:13.318740 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.254637882 podStartE2EDuration="13.318720358s" podCreationTimestamp="2025-12-03 08:59:00 +0000 UTC" firstStartedPulling="2025-12-03 08:59:01.814727873 +0000 UTC m=+7803.075682299" lastFinishedPulling="2025-12-03 08:59:12.878810349 +0000 UTC m=+7814.139764775" observedRunningTime="2025-12-03 08:59:13.31105757 +0000 UTC m=+7814.572012006" watchObservedRunningTime="2025-12-03 08:59:13.318720358 +0000 UTC m=+7814.579674784" Dec 03 08:59:25 crc kubenswrapper[4947]: I1203 08:59:25.083318 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 08:59:25 crc kubenswrapper[4947]: E1203 08:59:25.084468 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:59:39 crc kubenswrapper[4947]: I1203 08:59:39.088440 4947 scope.go:117] "RemoveContainer" 
containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 08:59:39 crc kubenswrapper[4947]: E1203 08:59:39.089872 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:59:51 crc kubenswrapper[4947]: I1203 08:59:51.083123 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 08:59:51 crc kubenswrapper[4947]: E1203 08:59:51.083883 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 08:59:58 crc kubenswrapper[4947]: E1203 08:59:58.912063 4947 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.196:34898->38.102.83.196:38979: write tcp 38.102.83.196:34898->38.102.83.196:38979: write: broken pipe Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.175614 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5"] Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.177365 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.180446 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.180902 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.188875 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5"] Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.263446 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d95529d8-a859-4730-a786-39c231ba2be3-secret-volume\") pod \"collect-profiles-29412540-7czh5\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.263743 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj9wp\" (UniqueName: \"kubernetes.io/projected/d95529d8-a859-4730-a786-39c231ba2be3-kube-api-access-jj9wp\") pod \"collect-profiles-29412540-7czh5\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.263870 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d95529d8-a859-4730-a786-39c231ba2be3-config-volume\") pod \"collect-profiles-29412540-7czh5\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.364863 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d95529d8-a859-4730-a786-39c231ba2be3-secret-volume\") pod \"collect-profiles-29412540-7czh5\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.365026 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj9wp\" (UniqueName: \"kubernetes.io/projected/d95529d8-a859-4730-a786-39c231ba2be3-kube-api-access-jj9wp\") pod \"collect-profiles-29412540-7czh5\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.365076 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d95529d8-a859-4730-a786-39c231ba2be3-config-volume\") pod \"collect-profiles-29412540-7czh5\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.365999 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d95529d8-a859-4730-a786-39c231ba2be3-config-volume\") pod \"collect-profiles-29412540-7czh5\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.370896 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d95529d8-a859-4730-a786-39c231ba2be3-secret-volume\") pod \"collect-profiles-29412540-7czh5\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.382808 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj9wp\" (UniqueName: \"kubernetes.io/projected/d95529d8-a859-4730-a786-39c231ba2be3-kube-api-access-jj9wp\") pod \"collect-profiles-29412540-7czh5\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.505189 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:00 crc kubenswrapper[4947]: I1203 09:00:00.959082 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5"] Dec 03 09:00:00 crc kubenswrapper[4947]: W1203 09:00:00.968685 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd95529d8_a859_4730_a786_39c231ba2be3.slice/crio-b84bdf1cc48fc1ec569419b5c381801f7bc2d8e9d12cc65640fdd7d38b59eab2 WatchSource:0}: Error finding container b84bdf1cc48fc1ec569419b5c381801f7bc2d8e9d12cc65640fdd7d38b59eab2: Status 404 returned error can't find the container with id b84bdf1cc48fc1ec569419b5c381801f7bc2d8e9d12cc65640fdd7d38b59eab2 Dec 03 09:00:01 crc kubenswrapper[4947]: I1203 09:00:01.775229 4947 generic.go:334] "Generic (PLEG): container finished" podID="d95529d8-a859-4730-a786-39c231ba2be3" containerID="c12306b6242e792f15a718fed2336e9aafc9ee168add5af83e3c7e397b85b8d6" exitCode=0 Dec 03 09:00:01 crc kubenswrapper[4947]: I1203 09:00:01.775451 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" event={"ID":"d95529d8-a859-4730-a786-39c231ba2be3","Type":"ContainerDied","Data":"c12306b6242e792f15a718fed2336e9aafc9ee168add5af83e3c7e397b85b8d6"} Dec 03 09:00:01 crc kubenswrapper[4947]: I1203 09:00:01.775579 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" event={"ID":"d95529d8-a859-4730-a786-39c231ba2be3","Type":"ContainerStarted","Data":"b84bdf1cc48fc1ec569419b5c381801f7bc2d8e9d12cc65640fdd7d38b59eab2"} Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.112283 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.243786 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d95529d8-a859-4730-a786-39c231ba2be3-config-volume\") pod \"d95529d8-a859-4730-a786-39c231ba2be3\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.243869 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d95529d8-a859-4730-a786-39c231ba2be3-secret-volume\") pod \"d95529d8-a859-4730-a786-39c231ba2be3\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.243916 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj9wp\" (UniqueName: \"kubernetes.io/projected/d95529d8-a859-4730-a786-39c231ba2be3-kube-api-access-jj9wp\") pod \"d95529d8-a859-4730-a786-39c231ba2be3\" (UID: \"d95529d8-a859-4730-a786-39c231ba2be3\") " Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.244513 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d95529d8-a859-4730-a786-39c231ba2be3-config-volume" (OuterVolumeSpecName: "config-volume") pod "d95529d8-a859-4730-a786-39c231ba2be3" (UID: "d95529d8-a859-4730-a786-39c231ba2be3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.249997 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d95529d8-a859-4730-a786-39c231ba2be3-kube-api-access-jj9wp" (OuterVolumeSpecName: "kube-api-access-jj9wp") pod "d95529d8-a859-4730-a786-39c231ba2be3" (UID: "d95529d8-a859-4730-a786-39c231ba2be3"). InnerVolumeSpecName "kube-api-access-jj9wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.251673 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d95529d8-a859-4730-a786-39c231ba2be3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d95529d8-a859-4730-a786-39c231ba2be3" (UID: "d95529d8-a859-4730-a786-39c231ba2be3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.346380 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d95529d8-a859-4730-a786-39c231ba2be3-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.346416 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d95529d8-a859-4730-a786-39c231ba2be3-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.346426 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj9wp\" (UniqueName: \"kubernetes.io/projected/d95529d8-a859-4730-a786-39c231ba2be3-kube-api-access-jj9wp\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.797292 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" event={"ID":"d95529d8-a859-4730-a786-39c231ba2be3","Type":"ContainerDied","Data":"b84bdf1cc48fc1ec569419b5c381801f7bc2d8e9d12cc65640fdd7d38b59eab2"} Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.797356 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b84bdf1cc48fc1ec569419b5c381801f7bc2d8e9d12cc65640fdd7d38b59eab2" Dec 03 09:00:03 crc kubenswrapper[4947]: I1203 09:00:03.797434 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5" Dec 03 09:00:04 crc kubenswrapper[4947]: I1203 09:00:04.083855 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:00:04 crc kubenswrapper[4947]: E1203 09:00:04.084290 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:00:04 crc kubenswrapper[4947]: I1203 09:00:04.201235 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9"] Dec 03 09:00:04 crc kubenswrapper[4947]: I1203 09:00:04.209262 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-qhws9"] Dec 03 09:00:05 crc kubenswrapper[4947]: I1203 09:00:05.099791 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="914fca72-8b32-450d-bcff-d9d3c6e72cf1" path="/var/lib/kubelet/pods/914fca72-8b32-450d-bcff-d9d3c6e72cf1/volumes" Dec 03 09:00:12 crc kubenswrapper[4947]: I1203 09:00:12.729249 4947 scope.go:117] "RemoveContainer" containerID="3adb80a992017975c3495f8b0899f1a09c1de0d2be37de6255432ccdd9e9b3e5" Dec 03 09:00:18 crc kubenswrapper[4947]: I1203 09:00:18.083393 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:00:18 crc kubenswrapper[4947]: E1203 09:00:18.084343 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:00:29 crc kubenswrapper[4947]: I1203 09:00:29.087597 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:00:29 crc kubenswrapper[4947]: E1203 09:00:29.088209 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.302401 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b2c3-account-create-update-x6vpw"] Dec 03 09:00:32 crc kubenswrapper[4947]: E1203 09:00:32.303454 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d95529d8-a859-4730-a786-39c231ba2be3" containerName="collect-profiles" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.303481 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d95529d8-a859-4730-a786-39c231ba2be3" containerName="collect-profiles" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.303832 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d95529d8-a859-4730-a786-39c231ba2be3" containerName="collect-profiles" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.304717 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.308507 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.318015 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-26vzc"] Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.349750 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b2c3-account-create-update-x6vpw"] Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.349911 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.361096 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-26vzc"] Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.400788 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bca192c-e85f-4503-a747-2f0c118272f1-operator-scripts\") pod \"barbican-db-create-26vzc\" (UID: \"8bca192c-e85f-4503-a747-2f0c118272f1\") " pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.400845 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkjwj\" (UniqueName: \"kubernetes.io/projected/8bca192c-e85f-4503-a747-2f0c118272f1-kube-api-access-rkjwj\") pod \"barbican-db-create-26vzc\" (UID: \"8bca192c-e85f-4503-a747-2f0c118272f1\") " pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.400941 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c7df294d-c9d9-48cf-8be3-5cecafe3001e-operator-scripts\") pod \"barbican-b2c3-account-create-update-x6vpw\" (UID: \"c7df294d-c9d9-48cf-8be3-5cecafe3001e\") " pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.401322 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7kd4\" (UniqueName: \"kubernetes.io/projected/c7df294d-c9d9-48cf-8be3-5cecafe3001e-kube-api-access-w7kd4\") pod \"barbican-b2c3-account-create-update-x6vpw\" (UID: \"c7df294d-c9d9-48cf-8be3-5cecafe3001e\") " pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.472821 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p4qz6"] Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.475808 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.486080 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4qz6"] Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.503565 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7kd4\" (UniqueName: \"kubernetes.io/projected/c7df294d-c9d9-48cf-8be3-5cecafe3001e-kube-api-access-w7kd4\") pod \"barbican-b2c3-account-create-update-x6vpw\" (UID: \"c7df294d-c9d9-48cf-8be3-5cecafe3001e\") " pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.503650 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bca192c-e85f-4503-a747-2f0c118272f1-operator-scripts\") pod \"barbican-db-create-26vzc\" (UID: \"8bca192c-e85f-4503-a747-2f0c118272f1\") " 
pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.503682 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkjwj\" (UniqueName: \"kubernetes.io/projected/8bca192c-e85f-4503-a747-2f0c118272f1-kube-api-access-rkjwj\") pod \"barbican-db-create-26vzc\" (UID: \"8bca192c-e85f-4503-a747-2f0c118272f1\") " pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.503773 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7df294d-c9d9-48cf-8be3-5cecafe3001e-operator-scripts\") pod \"barbican-b2c3-account-create-update-x6vpw\" (UID: \"c7df294d-c9d9-48cf-8be3-5cecafe3001e\") " pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.504934 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7df294d-c9d9-48cf-8be3-5cecafe3001e-operator-scripts\") pod \"barbican-b2c3-account-create-update-x6vpw\" (UID: \"c7df294d-c9d9-48cf-8be3-5cecafe3001e\") " pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.505942 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bca192c-e85f-4503-a747-2f0c118272f1-operator-scripts\") pod \"barbican-db-create-26vzc\" (UID: \"8bca192c-e85f-4503-a747-2f0c118272f1\") " pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.524445 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7kd4\" (UniqueName: \"kubernetes.io/projected/c7df294d-c9d9-48cf-8be3-5cecafe3001e-kube-api-access-w7kd4\") pod \"barbican-b2c3-account-create-update-x6vpw\" (UID: 
\"c7df294d-c9d9-48cf-8be3-5cecafe3001e\") " pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.535459 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkjwj\" (UniqueName: \"kubernetes.io/projected/8bca192c-e85f-4503-a747-2f0c118272f1-kube-api-access-rkjwj\") pod \"barbican-db-create-26vzc\" (UID: \"8bca192c-e85f-4503-a747-2f0c118272f1\") " pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.605318 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2smwl\" (UniqueName: \"kubernetes.io/projected/362082d3-410e-43a7-b28c-e887d262a76b-kube-api-access-2smwl\") pod \"redhat-marketplace-p4qz6\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.605448 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-catalog-content\") pod \"redhat-marketplace-p4qz6\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.605513 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-utilities\") pod \"redhat-marketplace-p4qz6\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.634928 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.672329 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.708693 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-catalog-content\") pod \"redhat-marketplace-p4qz6\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.708756 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-utilities\") pod \"redhat-marketplace-p4qz6\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.708812 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2smwl\" (UniqueName: \"kubernetes.io/projected/362082d3-410e-43a7-b28c-e887d262a76b-kube-api-access-2smwl\") pod \"redhat-marketplace-p4qz6\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.709610 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-catalog-content\") pod \"redhat-marketplace-p4qz6\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.709924 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-utilities\") pod \"redhat-marketplace-p4qz6\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.727969 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2smwl\" (UniqueName: \"kubernetes.io/projected/362082d3-410e-43a7-b28c-e887d262a76b-kube-api-access-2smwl\") pod \"redhat-marketplace-p4qz6\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:32 crc kubenswrapper[4947]: I1203 09:00:32.797311 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:33 crc kubenswrapper[4947]: I1203 09:00:33.133305 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b2c3-account-create-update-x6vpw"] Dec 03 09:00:33 crc kubenswrapper[4947]: I1203 09:00:33.185910 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-26vzc"] Dec 03 09:00:33 crc kubenswrapper[4947]: W1203 09:00:33.187676 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bca192c_e85f_4503_a747_2f0c118272f1.slice/crio-ffedf976e886c03547d57e22851080a63d074efe8e459e98f9bccb1bf9f6d5ad WatchSource:0}: Error finding container ffedf976e886c03547d57e22851080a63d074efe8e459e98f9bccb1bf9f6d5ad: Status 404 returned error can't find the container with id ffedf976e886c03547d57e22851080a63d074efe8e459e98f9bccb1bf9f6d5ad Dec 03 09:00:33 crc kubenswrapper[4947]: I1203 09:00:33.289556 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4qz6"] Dec 03 09:00:33 crc kubenswrapper[4947]: W1203 09:00:33.292774 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod362082d3_410e_43a7_b28c_e887d262a76b.slice/crio-1d6c72a161ad2087d32325ae27fb2dd25bab871e7b34b40dd3ac6b3168f4698c WatchSource:0}: Error finding container 1d6c72a161ad2087d32325ae27fb2dd25bab871e7b34b40dd3ac6b3168f4698c: Status 404 returned error can't find the container with id 1d6c72a161ad2087d32325ae27fb2dd25bab871e7b34b40dd3ac6b3168f4698c Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.101099 4947 generic.go:334] "Generic (PLEG): container finished" podID="c7df294d-c9d9-48cf-8be3-5cecafe3001e" containerID="100b23c66bbbe88856e28fbc130c17c71d54adcb36c332a7a27521c2da23a932" exitCode=0 Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.101161 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b2c3-account-create-update-x6vpw" event={"ID":"c7df294d-c9d9-48cf-8be3-5cecafe3001e","Type":"ContainerDied","Data":"100b23c66bbbe88856e28fbc130c17c71d54adcb36c332a7a27521c2da23a932"} Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.101663 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b2c3-account-create-update-x6vpw" event={"ID":"c7df294d-c9d9-48cf-8be3-5cecafe3001e","Type":"ContainerStarted","Data":"62add8ed77848cfd9a3b82ad6884633e880a76f6d709326138924c044149872c"} Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.105455 4947 generic.go:334] "Generic (PLEG): container finished" podID="8bca192c-e85f-4503-a747-2f0c118272f1" containerID="d9cef58309b351e3c7551d971b6c4de87f629ebbab948fb89d9f663e6e0e672c" exitCode=0 Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.105571 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-26vzc" event={"ID":"8bca192c-e85f-4503-a747-2f0c118272f1","Type":"ContainerDied","Data":"d9cef58309b351e3c7551d971b6c4de87f629ebbab948fb89d9f663e6e0e672c"} Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.105611 4947 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/barbican-db-create-26vzc" event={"ID":"8bca192c-e85f-4503-a747-2f0c118272f1","Type":"ContainerStarted","Data":"ffedf976e886c03547d57e22851080a63d074efe8e459e98f9bccb1bf9f6d5ad"} Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.108203 4947 generic.go:334] "Generic (PLEG): container finished" podID="362082d3-410e-43a7-b28c-e887d262a76b" containerID="1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80" exitCode=0 Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.108266 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4qz6" event={"ID":"362082d3-410e-43a7-b28c-e887d262a76b","Type":"ContainerDied","Data":"1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80"} Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.108299 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4qz6" event={"ID":"362082d3-410e-43a7-b28c-e887d262a76b","Type":"ContainerStarted","Data":"1d6c72a161ad2087d32325ae27fb2dd25bab871e7b34b40dd3ac6b3168f4698c"} Dec 03 09:00:34 crc kubenswrapper[4947]: I1203 09:00:34.112982 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.098472 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.105770 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.127096 4947 generic.go:334] "Generic (PLEG): container finished" podID="362082d3-410e-43a7-b28c-e887d262a76b" containerID="b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0" exitCode=0 Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.127194 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4qz6" event={"ID":"362082d3-410e-43a7-b28c-e887d262a76b","Type":"ContainerDied","Data":"b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0"} Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.129006 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b2c3-account-create-update-x6vpw" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.129099 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b2c3-account-create-update-x6vpw" event={"ID":"c7df294d-c9d9-48cf-8be3-5cecafe3001e","Type":"ContainerDied","Data":"62add8ed77848cfd9a3b82ad6884633e880a76f6d709326138924c044149872c"} Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.129147 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62add8ed77848cfd9a3b82ad6884633e880a76f6d709326138924c044149872c" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.134988 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-26vzc" event={"ID":"8bca192c-e85f-4503-a747-2f0c118272f1","Type":"ContainerDied","Data":"ffedf976e886c03547d57e22851080a63d074efe8e459e98f9bccb1bf9f6d5ad"} Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.135051 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffedf976e886c03547d57e22851080a63d074efe8e459e98f9bccb1bf9f6d5ad" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.135089 4947 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-26vzc" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.167391 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7df294d-c9d9-48cf-8be3-5cecafe3001e-operator-scripts\") pod \"c7df294d-c9d9-48cf-8be3-5cecafe3001e\" (UID: \"c7df294d-c9d9-48cf-8be3-5cecafe3001e\") " Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.167731 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7kd4\" (UniqueName: \"kubernetes.io/projected/c7df294d-c9d9-48cf-8be3-5cecafe3001e-kube-api-access-w7kd4\") pod \"c7df294d-c9d9-48cf-8be3-5cecafe3001e\" (UID: \"c7df294d-c9d9-48cf-8be3-5cecafe3001e\") " Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.167875 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkjwj\" (UniqueName: \"kubernetes.io/projected/8bca192c-e85f-4503-a747-2f0c118272f1-kube-api-access-rkjwj\") pod \"8bca192c-e85f-4503-a747-2f0c118272f1\" (UID: \"8bca192c-e85f-4503-a747-2f0c118272f1\") " Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.167947 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bca192c-e85f-4503-a747-2f0c118272f1-operator-scripts\") pod \"8bca192c-e85f-4503-a747-2f0c118272f1\" (UID: \"8bca192c-e85f-4503-a747-2f0c118272f1\") " Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.168438 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7df294d-c9d9-48cf-8be3-5cecafe3001e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c7df294d-c9d9-48cf-8be3-5cecafe3001e" (UID: "c7df294d-c9d9-48cf-8be3-5cecafe3001e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.168582 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bca192c-e85f-4503-a747-2f0c118272f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8bca192c-e85f-4503-a747-2f0c118272f1" (UID: "8bca192c-e85f-4503-a747-2f0c118272f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.169060 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bca192c-e85f-4503-a747-2f0c118272f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.169084 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7df294d-c9d9-48cf-8be3-5cecafe3001e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.174210 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7df294d-c9d9-48cf-8be3-5cecafe3001e-kube-api-access-w7kd4" (OuterVolumeSpecName: "kube-api-access-w7kd4") pod "c7df294d-c9d9-48cf-8be3-5cecafe3001e" (UID: "c7df294d-c9d9-48cf-8be3-5cecafe3001e"). InnerVolumeSpecName "kube-api-access-w7kd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.176808 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bca192c-e85f-4503-a747-2f0c118272f1-kube-api-access-rkjwj" (OuterVolumeSpecName: "kube-api-access-rkjwj") pod "8bca192c-e85f-4503-a747-2f0c118272f1" (UID: "8bca192c-e85f-4503-a747-2f0c118272f1"). InnerVolumeSpecName "kube-api-access-rkjwj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.270941 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7kd4\" (UniqueName: \"kubernetes.io/projected/c7df294d-c9d9-48cf-8be3-5cecafe3001e-kube-api-access-w7kd4\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:36 crc kubenswrapper[4947]: I1203 09:00:36.270970 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkjwj\" (UniqueName: \"kubernetes.io/projected/8bca192c-e85f-4503-a747-2f0c118272f1-kube-api-access-rkjwj\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.145456 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4qz6" event={"ID":"362082d3-410e-43a7-b28c-e887d262a76b","Type":"ContainerStarted","Data":"37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2"} Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.169235 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p4qz6" podStartSLOduration=2.796412033 podStartE2EDuration="5.169211203s" podCreationTimestamp="2025-12-03 09:00:32 +0000 UTC" firstStartedPulling="2025-12-03 09:00:34.112766581 +0000 UTC m=+7895.373720997" lastFinishedPulling="2025-12-03 09:00:36.485565731 +0000 UTC m=+7897.746520167" observedRunningTime="2025-12-03 09:00:37.165255966 +0000 UTC m=+7898.426210402" watchObservedRunningTime="2025-12-03 09:00:37.169211203 +0000 UTC m=+7898.430165639" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.519803 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-cpsv7"] Dec 03 09:00:37 crc kubenswrapper[4947]: E1203 09:00:37.520227 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bca192c-e85f-4503-a747-2f0c118272f1" containerName="mariadb-database-create" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 
09:00:37.520252 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bca192c-e85f-4503-a747-2f0c118272f1" containerName="mariadb-database-create" Dec 03 09:00:37 crc kubenswrapper[4947]: E1203 09:00:37.520273 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7df294d-c9d9-48cf-8be3-5cecafe3001e" containerName="mariadb-account-create-update" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.520282 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7df294d-c9d9-48cf-8be3-5cecafe3001e" containerName="mariadb-account-create-update" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.520506 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7df294d-c9d9-48cf-8be3-5cecafe3001e" containerName="mariadb-account-create-update" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.520534 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bca192c-e85f-4503-a747-2f0c118272f1" containerName="mariadb-database-create" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.521205 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.523524 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.523749 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-776dx" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.533620 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cpsv7"] Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.593370 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-db-sync-config-data\") pod \"barbican-db-sync-cpsv7\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.593626 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-combined-ca-bundle\") pod \"barbican-db-sync-cpsv7\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.593718 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5glmc\" (UniqueName: \"kubernetes.io/projected/24481172-ba60-4445-a58f-aa1f21e7368c-kube-api-access-5glmc\") pod \"barbican-db-sync-cpsv7\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.695330 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-combined-ca-bundle\") pod \"barbican-db-sync-cpsv7\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.695384 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5glmc\" (UniqueName: \"kubernetes.io/projected/24481172-ba60-4445-a58f-aa1f21e7368c-kube-api-access-5glmc\") pod \"barbican-db-sync-cpsv7\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.695554 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-db-sync-config-data\") pod \"barbican-db-sync-cpsv7\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.705607 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-combined-ca-bundle\") pod \"barbican-db-sync-cpsv7\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.706912 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-db-sync-config-data\") pod \"barbican-db-sync-cpsv7\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.714152 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5glmc\" (UniqueName: \"kubernetes.io/projected/24481172-ba60-4445-a58f-aa1f21e7368c-kube-api-access-5glmc\") pod 
\"barbican-db-sync-cpsv7\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:37 crc kubenswrapper[4947]: I1203 09:00:37.837572 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:38 crc kubenswrapper[4947]: I1203 09:00:38.281310 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cpsv7"] Dec 03 09:00:38 crc kubenswrapper[4947]: W1203 09:00:38.292722 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24481172_ba60_4445_a58f_aa1f21e7368c.slice/crio-251cd6748d7e1a3503c21f778ce8a212ca42e48b47dedc63f07aacad00dc8701 WatchSource:0}: Error finding container 251cd6748d7e1a3503c21f778ce8a212ca42e48b47dedc63f07aacad00dc8701: Status 404 returned error can't find the container with id 251cd6748d7e1a3503c21f778ce8a212ca42e48b47dedc63f07aacad00dc8701 Dec 03 09:00:39 crc kubenswrapper[4947]: I1203 09:00:39.161082 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cpsv7" event={"ID":"24481172-ba60-4445-a58f-aa1f21e7368c","Type":"ContainerStarted","Data":"251cd6748d7e1a3503c21f778ce8a212ca42e48b47dedc63f07aacad00dc8701"} Dec 03 09:00:40 crc kubenswrapper[4947]: I1203 09:00:40.931473 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tf5pv"] Dec 03 09:00:40 crc kubenswrapper[4947]: I1203 09:00:40.934201 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:40 crc kubenswrapper[4947]: I1203 09:00:40.948042 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tf5pv"] Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.054870 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-catalog-content\") pod \"community-operators-tf5pv\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.054922 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-utilities\") pod \"community-operators-tf5pv\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.054950 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfm9c\" (UniqueName: \"kubernetes.io/projected/79ed4b40-6bd1-479d-b4da-7c4f86192f32-kube-api-access-tfm9c\") pod \"community-operators-tf5pv\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.156648 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-catalog-content\") pod \"community-operators-tf5pv\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.156706 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-utilities\") pod \"community-operators-tf5pv\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.156726 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfm9c\" (UniqueName: \"kubernetes.io/projected/79ed4b40-6bd1-479d-b4da-7c4f86192f32-kube-api-access-tfm9c\") pod \"community-operators-tf5pv\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.157119 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-catalog-content\") pod \"community-operators-tf5pv\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.158203 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-utilities\") pod \"community-operators-tf5pv\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.192957 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfm9c\" (UniqueName: \"kubernetes.io/projected/79ed4b40-6bd1-479d-b4da-7c4f86192f32-kube-api-access-tfm9c\") pod \"community-operators-tf5pv\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:41 crc kubenswrapper[4947]: I1203 09:00:41.274893 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:42 crc kubenswrapper[4947]: I1203 09:00:42.798593 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:42 crc kubenswrapper[4947]: I1203 09:00:42.798914 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:42 crc kubenswrapper[4947]: I1203 09:00:42.840035 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:43 crc kubenswrapper[4947]: I1203 09:00:43.084842 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:00:43 crc kubenswrapper[4947]: E1203 09:00:43.085704 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:00:43 crc kubenswrapper[4947]: I1203 09:00:43.243960 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:44 crc kubenswrapper[4947]: I1203 09:00:44.211341 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cpsv7" event={"ID":"24481172-ba60-4445-a58f-aa1f21e7368c","Type":"ContainerStarted","Data":"7640fceceb75a5b96d15cfd4788e1c48a215300db98621c893df5c92ca81ec90"} Dec 03 09:00:44 crc kubenswrapper[4947]: I1203 09:00:44.236359 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-cpsv7" 
podStartSLOduration=1.6101216059999999 podStartE2EDuration="7.236335337s" podCreationTimestamp="2025-12-03 09:00:37 +0000 UTC" firstStartedPulling="2025-12-03 09:00:38.299812383 +0000 UTC m=+7899.560766809" lastFinishedPulling="2025-12-03 09:00:43.926026114 +0000 UTC m=+7905.186980540" observedRunningTime="2025-12-03 09:00:44.225023921 +0000 UTC m=+7905.485978387" watchObservedRunningTime="2025-12-03 09:00:44.236335337 +0000 UTC m=+7905.497289783" Dec 03 09:00:44 crc kubenswrapper[4947]: I1203 09:00:44.310604 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4qz6"] Dec 03 09:00:44 crc kubenswrapper[4947]: I1203 09:00:44.320631 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tf5pv"] Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.220150 4947 generic.go:334] "Generic (PLEG): container finished" podID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerID="49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd" exitCode=0 Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.220261 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5pv" event={"ID":"79ed4b40-6bd1-479d-b4da-7c4f86192f32","Type":"ContainerDied","Data":"49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd"} Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.220296 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5pv" event={"ID":"79ed4b40-6bd1-479d-b4da-7c4f86192f32","Type":"ContainerStarted","Data":"1d1b7e97dbf0f7fbafb7ef09b4b84e1fd8b9852f9cd8e7d0286b767e4b4d218d"} Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.220544 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p4qz6" podUID="362082d3-410e-43a7-b28c-e887d262a76b" containerName="registry-server" 
containerID="cri-o://37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2" gracePeriod=2 Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.726384 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.738385 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-catalog-content\") pod \"362082d3-410e-43a7-b28c-e887d262a76b\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.738479 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2smwl\" (UniqueName: \"kubernetes.io/projected/362082d3-410e-43a7-b28c-e887d262a76b-kube-api-access-2smwl\") pod \"362082d3-410e-43a7-b28c-e887d262a76b\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.744974 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/362082d3-410e-43a7-b28c-e887d262a76b-kube-api-access-2smwl" (OuterVolumeSpecName: "kube-api-access-2smwl") pod "362082d3-410e-43a7-b28c-e887d262a76b" (UID: "362082d3-410e-43a7-b28c-e887d262a76b"). InnerVolumeSpecName "kube-api-access-2smwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.769017 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "362082d3-410e-43a7-b28c-e887d262a76b" (UID: "362082d3-410e-43a7-b28c-e887d262a76b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.840334 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-utilities\") pod \"362082d3-410e-43a7-b28c-e887d262a76b\" (UID: \"362082d3-410e-43a7-b28c-e887d262a76b\") " Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.840990 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2smwl\" (UniqueName: \"kubernetes.io/projected/362082d3-410e-43a7-b28c-e887d262a76b-kube-api-access-2smwl\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.841013 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.841638 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-utilities" (OuterVolumeSpecName: "utilities") pod "362082d3-410e-43a7-b28c-e887d262a76b" (UID: "362082d3-410e-43a7-b28c-e887d262a76b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:00:45 crc kubenswrapper[4947]: I1203 09:00:45.942481 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/362082d3-410e-43a7-b28c-e887d262a76b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.233406 4947 generic.go:334] "Generic (PLEG): container finished" podID="24481172-ba60-4445-a58f-aa1f21e7368c" containerID="7640fceceb75a5b96d15cfd4788e1c48a215300db98621c893df5c92ca81ec90" exitCode=0 Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.233483 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cpsv7" event={"ID":"24481172-ba60-4445-a58f-aa1f21e7368c","Type":"ContainerDied","Data":"7640fceceb75a5b96d15cfd4788e1c48a215300db98621c893df5c92ca81ec90"} Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.236722 4947 generic.go:334] "Generic (PLEG): container finished" podID="362082d3-410e-43a7-b28c-e887d262a76b" containerID="37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2" exitCode=0 Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.236784 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4qz6" event={"ID":"362082d3-410e-43a7-b28c-e887d262a76b","Type":"ContainerDied","Data":"37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2"} Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.236815 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p4qz6" event={"ID":"362082d3-410e-43a7-b28c-e887d262a76b","Type":"ContainerDied","Data":"1d6c72a161ad2087d32325ae27fb2dd25bab871e7b34b40dd3ac6b3168f4698c"} Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.236834 4947 scope.go:117] "RemoveContainer" containerID="37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 
09:00:46.236976 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p4qz6" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.243293 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5pv" event={"ID":"79ed4b40-6bd1-479d-b4da-7c4f86192f32","Type":"ContainerStarted","Data":"e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f"} Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.268300 4947 scope.go:117] "RemoveContainer" containerID="b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.301724 4947 scope.go:117] "RemoveContainer" containerID="1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.335378 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4qz6"] Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.342279 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p4qz6"] Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.402984 4947 scope.go:117] "RemoveContainer" containerID="37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2" Dec 03 09:00:46 crc kubenswrapper[4947]: E1203 09:00:46.403549 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2\": container with ID starting with 37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2 not found: ID does not exist" containerID="37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.403609 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2"} err="failed to get container status \"37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2\": rpc error: code = NotFound desc = could not find container \"37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2\": container with ID starting with 37ac84861db074e8a14ff25386e093ea94ab446a41e8233da8fd4059aaa7bee2 not found: ID does not exist" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.403636 4947 scope.go:117] "RemoveContainer" containerID="b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0" Dec 03 09:00:46 crc kubenswrapper[4947]: E1203 09:00:46.404151 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0\": container with ID starting with b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0 not found: ID does not exist" containerID="b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.404257 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0"} err="failed to get container status \"b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0\": rpc error: code = NotFound desc = could not find container \"b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0\": container with ID starting with b289c59bec4cba25a13fa584a08994ccf9ee0973f6bd3f8d6ca0c15bb619d0e0 not found: ID does not exist" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.404349 4947 scope.go:117] "RemoveContainer" containerID="1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80" Dec 03 09:00:46 crc kubenswrapper[4947]: E1203 09:00:46.404786 4947 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80\": container with ID starting with 1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80 not found: ID does not exist" containerID="1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80" Dec 03 09:00:46 crc kubenswrapper[4947]: I1203 09:00:46.404862 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80"} err="failed to get container status \"1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80\": rpc error: code = NotFound desc = could not find container \"1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80\": container with ID starting with 1543bf849d333c451fb27a5566a064ef7776fbbf2a1bfe661cac1fda176e8c80 not found: ID does not exist" Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.101262 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="362082d3-410e-43a7-b28c-e887d262a76b" path="/var/lib/kubelet/pods/362082d3-410e-43a7-b28c-e887d262a76b/volumes" Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.252294 4947 generic.go:334] "Generic (PLEG): container finished" podID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerID="e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f" exitCode=0 Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.252377 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5pv" event={"ID":"79ed4b40-6bd1-479d-b4da-7c4f86192f32","Type":"ContainerDied","Data":"e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f"} Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.590808 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.680020 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5glmc\" (UniqueName: \"kubernetes.io/projected/24481172-ba60-4445-a58f-aa1f21e7368c-kube-api-access-5glmc\") pod \"24481172-ba60-4445-a58f-aa1f21e7368c\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.680078 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-combined-ca-bundle\") pod \"24481172-ba60-4445-a58f-aa1f21e7368c\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.680106 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-db-sync-config-data\") pod \"24481172-ba60-4445-a58f-aa1f21e7368c\" (UID: \"24481172-ba60-4445-a58f-aa1f21e7368c\") " Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.686177 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24481172-ba60-4445-a58f-aa1f21e7368c-kube-api-access-5glmc" (OuterVolumeSpecName: "kube-api-access-5glmc") pod "24481172-ba60-4445-a58f-aa1f21e7368c" (UID: "24481172-ba60-4445-a58f-aa1f21e7368c"). InnerVolumeSpecName "kube-api-access-5glmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.686656 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "24481172-ba60-4445-a58f-aa1f21e7368c" (UID: "24481172-ba60-4445-a58f-aa1f21e7368c"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.706710 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24481172-ba60-4445-a58f-aa1f21e7368c" (UID: "24481172-ba60-4445-a58f-aa1f21e7368c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.781611 4947 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.781646 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5glmc\" (UniqueName: \"kubernetes.io/projected/24481172-ba60-4445-a58f-aa1f21e7368c-kube-api-access-5glmc\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:47 crc kubenswrapper[4947]: I1203 09:00:47.781656 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24481172-ba60-4445-a58f-aa1f21e7368c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.265514 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5pv" event={"ID":"79ed4b40-6bd1-479d-b4da-7c4f86192f32","Type":"ContainerStarted","Data":"3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c"} Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.267417 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cpsv7" 
event={"ID":"24481172-ba60-4445-a58f-aa1f21e7368c","Type":"ContainerDied","Data":"251cd6748d7e1a3503c21f778ce8a212ca42e48b47dedc63f07aacad00dc8701"} Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.267465 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="251cd6748d7e1a3503c21f778ce8a212ca42e48b47dedc63f07aacad00dc8701" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.267481 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cpsv7" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.303069 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tf5pv" podStartSLOduration=5.661535006 podStartE2EDuration="8.303045645s" podCreationTimestamp="2025-12-03 09:00:40 +0000 UTC" firstStartedPulling="2025-12-03 09:00:45.22169758 +0000 UTC m=+7906.482652006" lastFinishedPulling="2025-12-03 09:00:47.863208169 +0000 UTC m=+7909.124162645" observedRunningTime="2025-12-03 09:00:48.291132353 +0000 UTC m=+7909.552086779" watchObservedRunningTime="2025-12-03 09:00:48.303045645 +0000 UTC m=+7909.564000071" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.513991 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6789c6d659-8b7jf"] Dec 03 09:00:48 crc kubenswrapper[4947]: E1203 09:00:48.514346 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362082d3-410e-43a7-b28c-e887d262a76b" containerName="extract-content" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.514357 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="362082d3-410e-43a7-b28c-e887d262a76b" containerName="extract-content" Dec 03 09:00:48 crc kubenswrapper[4947]: E1203 09:00:48.514369 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362082d3-410e-43a7-b28c-e887d262a76b" containerName="registry-server" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 
09:00:48.514375 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="362082d3-410e-43a7-b28c-e887d262a76b" containerName="registry-server" Dec 03 09:00:48 crc kubenswrapper[4947]: E1203 09:00:48.514400 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="362082d3-410e-43a7-b28c-e887d262a76b" containerName="extract-utilities" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.514407 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="362082d3-410e-43a7-b28c-e887d262a76b" containerName="extract-utilities" Dec 03 09:00:48 crc kubenswrapper[4947]: E1203 09:00:48.514418 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24481172-ba60-4445-a58f-aa1f21e7368c" containerName="barbican-db-sync" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.514423 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="24481172-ba60-4445-a58f-aa1f21e7368c" containerName="barbican-db-sync" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.514580 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="24481172-ba60-4445-a58f-aa1f21e7368c" containerName="barbican-db-sync" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.514594 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="362082d3-410e-43a7-b28c-e887d262a76b" containerName="registry-server" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.515499 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: W1203 09:00:48.519480 4947 reflector.go:561] object-"openstack"/"barbican-worker-config-data": failed to list *v1.Secret: secrets "barbican-worker-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Dec 03 09:00:48 crc kubenswrapper[4947]: E1203 09:00:48.519538 4947 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"barbican-worker-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"barbican-worker-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.519636 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.523015 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-776dx" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.549988 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6789c6d659-8b7jf"] Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.595701 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-74b59955b-vlhfv"] Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.601098 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.601134 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f055623c-9e1a-4793-a9d0-fc56e71d8df5-combined-ca-bundle\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.601197 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f055623c-9e1a-4793-a9d0-fc56e71d8df5-config-data\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.601239 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f055623c-9e1a-4793-a9d0-fc56e71d8df5-logs\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.601259 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f055623c-9e1a-4793-a9d0-fc56e71d8df5-config-data-custom\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.601320 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdbhb\" (UniqueName: 
\"kubernetes.io/projected/f055623c-9e1a-4793-a9d0-fc56e71d8df5-kube-api-access-qdbhb\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.604695 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.613466 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-74b59955b-vlhfv"] Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.664606 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54ddd5f977-d4wrt"] Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.665948 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.698643 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ddd5f977-d4wrt"] Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.702898 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7m9\" (UniqueName: \"kubernetes.io/projected/d40e27d8-38a2-496b-b073-dc91d4c878ef-kube-api-access-xc7m9\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.702942 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-nb\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.702981 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13615835-dfc2-4dd0-8dc3-518683077f9f-config-data-custom\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703001 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-dns-svc\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703038 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdbhb\" (UniqueName: \"kubernetes.io/projected/f055623c-9e1a-4793-a9d0-fc56e71d8df5-kube-api-access-qdbhb\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703063 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13615835-dfc2-4dd0-8dc3-518683077f9f-logs\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703080 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vfv9\" (UniqueName: \"kubernetes.io/projected/13615835-dfc2-4dd0-8dc3-518683077f9f-kube-api-access-6vfv9\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " 
pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703122 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f055623c-9e1a-4793-a9d0-fc56e71d8df5-combined-ca-bundle\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703145 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-sb\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703523 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f055623c-9e1a-4793-a9d0-fc56e71d8df5-config-data\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703643 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13615835-dfc2-4dd0-8dc3-518683077f9f-combined-ca-bundle\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703698 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-config\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" 
(UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703722 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f055623c-9e1a-4793-a9d0-fc56e71d8df5-logs\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703770 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f055623c-9e1a-4793-a9d0-fc56e71d8df5-config-data-custom\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.703797 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13615835-dfc2-4dd0-8dc3-518683077f9f-config-data\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.704230 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f055623c-9e1a-4793-a9d0-fc56e71d8df5-logs\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.711680 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f055623c-9e1a-4793-a9d0-fc56e71d8df5-config-data\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") 
" pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.722093 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdbhb\" (UniqueName: \"kubernetes.io/projected/f055623c-9e1a-4793-a9d0-fc56e71d8df5-kube-api-access-qdbhb\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.724610 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f055623c-9e1a-4793-a9d0-fc56e71d8df5-combined-ca-bundle\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.760678 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76dd78b6c8-phtkk"] Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.762704 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.765169 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.787370 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76dd78b6c8-phtkk"] Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.805543 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13615835-dfc2-4dd0-8dc3-518683077f9f-combined-ca-bundle\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.805601 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-config\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.805637 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e86fc66-942a-4e83-8969-6d18c93ba3e3-config-data\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.805667 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13615835-dfc2-4dd0-8dc3-518683077f9f-config-data\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " 
pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.805699 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7m9\" (UniqueName: \"kubernetes.io/projected/d40e27d8-38a2-496b-b073-dc91d4c878ef-kube-api-access-xc7m9\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.805719 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-nb\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.805852 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13615835-dfc2-4dd0-8dc3-518683077f9f-config-data-custom\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.805907 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-dns-svc\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.805998 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e86fc66-942a-4e83-8969-6d18c93ba3e3-config-data-custom\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: 
\"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.806062 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13615835-dfc2-4dd0-8dc3-518683077f9f-logs\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.806083 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vfv9\" (UniqueName: \"kubernetes.io/projected/13615835-dfc2-4dd0-8dc3-518683077f9f-kube-api-access-6vfv9\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.806137 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nzmk\" (UniqueName: \"kubernetes.io/projected/3e86fc66-942a-4e83-8969-6d18c93ba3e3-kube-api-access-4nzmk\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.806226 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-sb\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.806263 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e86fc66-942a-4e83-8969-6d18c93ba3e3-logs\") pod 
\"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.806347 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e86fc66-942a-4e83-8969-6d18c93ba3e3-combined-ca-bundle\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.806749 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-nb\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.806759 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-config\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.808624 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-sb\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.809471 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-dns-svc\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " 
pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.809851 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13615835-dfc2-4dd0-8dc3-518683077f9f-logs\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.819036 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13615835-dfc2-4dd0-8dc3-518683077f9f-config-data-custom\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.828012 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13615835-dfc2-4dd0-8dc3-518683077f9f-config-data\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.836830 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13615835-dfc2-4dd0-8dc3-518683077f9f-combined-ca-bundle\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: \"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.837049 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vfv9\" (UniqueName: \"kubernetes.io/projected/13615835-dfc2-4dd0-8dc3-518683077f9f-kube-api-access-6vfv9\") pod \"barbican-keystone-listener-74b59955b-vlhfv\" (UID: 
\"13615835-dfc2-4dd0-8dc3-518683077f9f\") " pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.837178 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7m9\" (UniqueName: \"kubernetes.io/projected/d40e27d8-38a2-496b-b073-dc91d4c878ef-kube-api-access-xc7m9\") pod \"dnsmasq-dns-54ddd5f977-d4wrt\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.908170 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e86fc66-942a-4e83-8969-6d18c93ba3e3-config-data-custom\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.909549 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nzmk\" (UniqueName: \"kubernetes.io/projected/3e86fc66-942a-4e83-8969-6d18c93ba3e3-kube-api-access-4nzmk\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.910065 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e86fc66-942a-4e83-8969-6d18c93ba3e3-logs\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.910219 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e86fc66-942a-4e83-8969-6d18c93ba3e3-combined-ca-bundle\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: 
\"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.910332 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e86fc66-942a-4e83-8969-6d18c93ba3e3-config-data\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.911036 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e86fc66-942a-4e83-8969-6d18c93ba3e3-logs\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.915115 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e86fc66-942a-4e83-8969-6d18c93ba3e3-combined-ca-bundle\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.916149 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e86fc66-942a-4e83-8969-6d18c93ba3e3-config-data-custom\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.917338 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e86fc66-942a-4e83-8969-6d18c93ba3e3-config-data\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 
09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.927857 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.927998 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nzmk\" (UniqueName: \"kubernetes.io/projected/3e86fc66-942a-4e83-8969-6d18c93ba3e3-kube-api-access-4nzmk\") pod \"barbican-api-76dd78b6c8-phtkk\" (UID: \"3e86fc66-942a-4e83-8969-6d18c93ba3e3\") " pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:48 crc kubenswrapper[4947]: I1203 09:00:48.984584 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:49 crc kubenswrapper[4947]: I1203 09:00:49.117805 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:49 crc kubenswrapper[4947]: I1203 09:00:49.463705 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-74b59955b-vlhfv"] Dec 03 09:00:49 crc kubenswrapper[4947]: I1203 09:00:49.530324 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 09:00:49 crc kubenswrapper[4947]: I1203 09:00:49.542676 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f055623c-9e1a-4793-a9d0-fc56e71d8df5-config-data-custom\") pod \"barbican-worker-6789c6d659-8b7jf\" (UID: \"f055623c-9e1a-4793-a9d0-fc56e71d8df5\") " pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:49 crc kubenswrapper[4947]: W1203 09:00:49.545063 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd40e27d8_38a2_496b_b073_dc91d4c878ef.slice/crio-c3a2a4fdddb540c772c7b865cc615b00c23e8ab17971a7150804182c2700f45d WatchSource:0}: Error finding container c3a2a4fdddb540c772c7b865cc615b00c23e8ab17971a7150804182c2700f45d: Status 404 returned error can't find the container with id c3a2a4fdddb540c772c7b865cc615b00c23e8ab17971a7150804182c2700f45d Dec 03 09:00:49 crc kubenswrapper[4947]: I1203 09:00:49.547234 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54ddd5f977-d4wrt"] Dec 03 09:00:49 crc kubenswrapper[4947]: I1203 09:00:49.618303 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76dd78b6c8-phtkk"] Dec 03 09:00:49 crc kubenswrapper[4947]: W1203 09:00:49.639590 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e86fc66_942a_4e83_8969_6d18c93ba3e3.slice/crio-80d34aee78b9f9f32a4c61baf3b85caed40a0a2b5689a6e08df8b6de14b17798 WatchSource:0}: Error finding container 80d34aee78b9f9f32a4c61baf3b85caed40a0a2b5689a6e08df8b6de14b17798: Status 404 returned error can't find the container with id 80d34aee78b9f9f32a4c61baf3b85caed40a0a2b5689a6e08df8b6de14b17798 Dec 03 09:00:49 crc kubenswrapper[4947]: I1203 09:00:49.745672 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6789c6d659-8b7jf" Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.178115 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6789c6d659-8b7jf"] Dec 03 09:00:50 crc kubenswrapper[4947]: W1203 09:00:50.183230 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf055623c_9e1a_4793_a9d0_fc56e71d8df5.slice/crio-5041f1abd6b268dcec0a0e3907f51322dbf42f258ec961e59720b219dfd1e899 WatchSource:0}: Error finding container 5041f1abd6b268dcec0a0e3907f51322dbf42f258ec961e59720b219dfd1e899: Status 404 returned error can't find the container with id 5041f1abd6b268dcec0a0e3907f51322dbf42f258ec961e59720b219dfd1e899 Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.291282 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76dd78b6c8-phtkk" event={"ID":"3e86fc66-942a-4e83-8969-6d18c93ba3e3","Type":"ContainerStarted","Data":"fb0fc1f6a3476050f296a3a7026a322ce39f23c20501bde04f27522c29c56c9e"} Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.291342 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76dd78b6c8-phtkk" event={"ID":"3e86fc66-942a-4e83-8969-6d18c93ba3e3","Type":"ContainerStarted","Data":"3270288c595aabec89f3f8b6010cd9d3f27d742cce54a4fff7889388d893686c"} Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.291358 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76dd78b6c8-phtkk" event={"ID":"3e86fc66-942a-4e83-8969-6d18c93ba3e3","Type":"ContainerStarted","Data":"80d34aee78b9f9f32a4c61baf3b85caed40a0a2b5689a6e08df8b6de14b17798"} Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.291982 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.292029 4947 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.300548 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" event={"ID":"13615835-dfc2-4dd0-8dc3-518683077f9f","Type":"ContainerStarted","Data":"bc4f519b437f943f8b2b4ab645c90ec3efec4f973374b1143fcaef378254e92e"} Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.301665 4947 generic.go:334] "Generic (PLEG): container finished" podID="d40e27d8-38a2-496b-b073-dc91d4c878ef" containerID="836c4f926a460cbcbb6ab24afc0b48068ca7ddf6ea897d7ab6e4c0fdee9d4783" exitCode=0 Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.301769 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" event={"ID":"d40e27d8-38a2-496b-b073-dc91d4c878ef","Type":"ContainerDied","Data":"836c4f926a460cbcbb6ab24afc0b48068ca7ddf6ea897d7ab6e4c0fdee9d4783"} Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.301801 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" event={"ID":"d40e27d8-38a2-496b-b073-dc91d4c878ef","Type":"ContainerStarted","Data":"c3a2a4fdddb540c772c7b865cc615b00c23e8ab17971a7150804182c2700f45d"} Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.302701 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6789c6d659-8b7jf" event={"ID":"f055623c-9e1a-4793-a9d0-fc56e71d8df5","Type":"ContainerStarted","Data":"5041f1abd6b268dcec0a0e3907f51322dbf42f258ec961e59720b219dfd1e899"} Dec 03 09:00:50 crc kubenswrapper[4947]: I1203 09:00:50.314711 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76dd78b6c8-phtkk" podStartSLOduration=2.314695437 podStartE2EDuration="2.314695437s" podCreationTimestamp="2025-12-03 09:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:00:50.311204912 +0000 UTC m=+7911.572159338" watchObservedRunningTime="2025-12-03 09:00:50.314695437 +0000 UTC m=+7911.575649863" Dec 03 09:00:51 crc kubenswrapper[4947]: I1203 09:00:51.275641 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:51 crc kubenswrapper[4947]: I1203 09:00:51.276215 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:51 crc kubenswrapper[4947]: I1203 09:00:51.313539 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" event={"ID":"13615835-dfc2-4dd0-8dc3-518683077f9f","Type":"ContainerStarted","Data":"03f61c6cddb125708b8c372f6dced6253468ae2f2b50563a08e55f62725ccdfe"} Dec 03 09:00:51 crc kubenswrapper[4947]: I1203 09:00:51.315445 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" event={"ID":"d40e27d8-38a2-496b-b073-dc91d4c878ef","Type":"ContainerStarted","Data":"c0756c85a337b15f2baa5c754ec31c13edc666a143ddd2ccadd1df4a75b63843"} Dec 03 09:00:51 crc kubenswrapper[4947]: I1203 09:00:51.316428 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:51 crc kubenswrapper[4947]: I1203 09:00:51.319154 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6789c6d659-8b7jf" event={"ID":"f055623c-9e1a-4793-a9d0-fc56e71d8df5","Type":"ContainerStarted","Data":"f5522d87b587f6891ad1b937f38541ef130e22615d747764eca4b67e50f83ce7"} Dec 03 09:00:51 crc kubenswrapper[4947]: I1203 09:00:51.341948 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" podStartSLOduration=3.341929482 podStartE2EDuration="3.341929482s" 
podCreationTimestamp="2025-12-03 09:00:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:00:51.336674099 +0000 UTC m=+7912.597628525" watchObservedRunningTime="2025-12-03 09:00:51.341929482 +0000 UTC m=+7912.602883908" Dec 03 09:00:51 crc kubenswrapper[4947]: I1203 09:00:51.352344 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:00:52 crc kubenswrapper[4947]: I1203 09:00:52.333581 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" event={"ID":"13615835-dfc2-4dd0-8dc3-518683077f9f","Type":"ContainerStarted","Data":"3f6cb339c6a6607ad976d63731cee7a2bd0ff6e8e539dff16980ac5e1e1afe52"} Dec 03 09:00:52 crc kubenswrapper[4947]: I1203 09:00:52.336206 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6789c6d659-8b7jf" event={"ID":"f055623c-9e1a-4793-a9d0-fc56e71d8df5","Type":"ContainerStarted","Data":"b391c690403fc53734b208df937d086263c635d0b997791267a17c89fb30456f"} Dec 03 09:00:52 crc kubenswrapper[4947]: I1203 09:00:52.358418 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-74b59955b-vlhfv" podStartSLOduration=2.869321869 podStartE2EDuration="4.358396196s" podCreationTimestamp="2025-12-03 09:00:48 +0000 UTC" firstStartedPulling="2025-12-03 09:00:49.469868626 +0000 UTC m=+7910.730823052" lastFinishedPulling="2025-12-03 09:00:50.958942963 +0000 UTC m=+7912.219897379" observedRunningTime="2025-12-03 09:00:52.354636034 +0000 UTC m=+7913.615590460" watchObservedRunningTime="2025-12-03 09:00:52.358396196 +0000 UTC m=+7913.619350632" Dec 03 09:00:52 crc kubenswrapper[4947]: I1203 09:00:52.381870 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6789c6d659-8b7jf" 
podStartSLOduration=3.557339409 podStartE2EDuration="4.38184841s" podCreationTimestamp="2025-12-03 09:00:48 +0000 UTC" firstStartedPulling="2025-12-03 09:00:50.185978056 +0000 UTC m=+7911.446932482" lastFinishedPulling="2025-12-03 09:00:51.010487057 +0000 UTC m=+7912.271441483" observedRunningTime="2025-12-03 09:00:52.37848349 +0000 UTC m=+7913.639437926" watchObservedRunningTime="2025-12-03 09:00:52.38184841 +0000 UTC m=+7913.642802836" Dec 03 09:00:55 crc kubenswrapper[4947]: I1203 09:00:55.599565 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:57 crc kubenswrapper[4947]: I1203 09:00:57.059982 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76dd78b6c8-phtkk" Dec 03 09:00:57 crc kubenswrapper[4947]: I1203 09:00:57.085269 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:00:57 crc kubenswrapper[4947]: E1203 09:00:57.085525 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:00:58 crc kubenswrapper[4947]: I1203 09:00:58.986703 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.036909 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c49d858cc-lvjg7"] Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.037180 4947 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" podUID="4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" containerName="dnsmasq-dns" containerID="cri-o://11d95e386845b420651b487d2666f9517f9496f412b8269d0d676df18836f801" gracePeriod=10 Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.415815 4947 generic.go:334] "Generic (PLEG): container finished" podID="4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" containerID="11d95e386845b420651b487d2666f9517f9496f412b8269d0d676df18836f801" exitCode=0 Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.415921 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" event={"ID":"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3","Type":"ContainerDied","Data":"11d95e386845b420651b487d2666f9517f9496f412b8269d0d676df18836f801"} Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.587062 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.722080 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-sb\") pod \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.722145 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-nb\") pod \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.722171 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjllr\" (UniqueName: \"kubernetes.io/projected/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-kube-api-access-vjllr\") pod 
\"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.722261 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-dns-svc\") pod \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.722301 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-config\") pod \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\" (UID: \"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3\") " Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.740868 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-kube-api-access-vjllr" (OuterVolumeSpecName: "kube-api-access-vjllr") pod "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" (UID: "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3"). InnerVolumeSpecName "kube-api-access-vjllr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.763781 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" (UID: "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.781033 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-config" (OuterVolumeSpecName: "config") pod "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" (UID: "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.781242 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" (UID: "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.792555 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" (UID: "4b3db51a-8016-47a1-b4e9-ba4e547b8bf3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.825895 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.825932 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.825945 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjllr\" (UniqueName: \"kubernetes.io/projected/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-kube-api-access-vjllr\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.825959 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:59 crc kubenswrapper[4947]: I1203 09:00:59.825999 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.147167 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412541-j7np7"] Dec 03 09:01:00 crc kubenswrapper[4947]: E1203 09:01:00.147619 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" containerName="dnsmasq-dns" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.147635 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" containerName="dnsmasq-dns" Dec 03 09:01:00 crc kubenswrapper[4947]: E1203 09:01:00.147649 4947 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" containerName="init" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.147657 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" containerName="init" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.147845 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" containerName="dnsmasq-dns" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.148522 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.155188 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412541-j7np7"] Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.233541 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-fernet-keys\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.233615 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zc6z\" (UniqueName: \"kubernetes.io/projected/7c09f9ec-b2ae-4620-9815-20949c1c08ba-kube-api-access-4zc6z\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.233683 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-config-data\") pod \"keystone-cron-29412541-j7np7\" (UID: 
\"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.233739 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-combined-ca-bundle\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.335932 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-fernet-keys\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.336007 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zc6z\" (UniqueName: \"kubernetes.io/projected/7c09f9ec-b2ae-4620-9815-20949c1c08ba-kube-api-access-4zc6z\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.336092 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-config-data\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.336148 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-combined-ca-bundle\") pod \"keystone-cron-29412541-j7np7\" (UID: 
\"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.340393 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-config-data\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.342291 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-fernet-keys\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.342698 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-combined-ca-bundle\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.353217 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zc6z\" (UniqueName: \"kubernetes.io/projected/7c09f9ec-b2ae-4620-9815-20949c1c08ba-kube-api-access-4zc6z\") pod \"keystone-cron-29412541-j7np7\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.429423 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" event={"ID":"4b3db51a-8016-47a1-b4e9-ba4e547b8bf3","Type":"ContainerDied","Data":"222db27ffdbfab5296b883ed8c8e75da6fd9cb82615543b48d45c107b59e24a7"} Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.429475 4947 
scope.go:117] "RemoveContainer" containerID="11d95e386845b420651b487d2666f9517f9496f412b8269d0d676df18836f801" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.429669 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c49d858cc-lvjg7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.457279 4947 scope.go:117] "RemoveContainer" containerID="ff478d6bef4fc2dd1f2c8a28f60e73fca1c09a9f497bb761693089dc0f972535" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.484888 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c49d858cc-lvjg7"] Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.495348 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c49d858cc-lvjg7"] Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.522717 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:00 crc kubenswrapper[4947]: I1203 09:01:00.974666 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412541-j7np7"] Dec 03 09:01:01 crc kubenswrapper[4947]: I1203 09:01:01.101594 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3db51a-8016-47a1-b4e9-ba4e547b8bf3" path="/var/lib/kubelet/pods/4b3db51a-8016-47a1-b4e9-ba4e547b8bf3/volumes" Dec 03 09:01:01 crc kubenswrapper[4947]: I1203 09:01:01.323634 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:01:01 crc kubenswrapper[4947]: I1203 09:01:01.370740 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tf5pv"] Dec 03 09:01:01 crc kubenswrapper[4947]: I1203 09:01:01.438843 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-j7np7" 
event={"ID":"7c09f9ec-b2ae-4620-9815-20949c1c08ba","Type":"ContainerStarted","Data":"e3686de612795bd2fe106463f17681dadd439943c4fd0c63138f0a455ef2c465"} Dec 03 09:01:01 crc kubenswrapper[4947]: I1203 09:01:01.438884 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-j7np7" event={"ID":"7c09f9ec-b2ae-4620-9815-20949c1c08ba","Type":"ContainerStarted","Data":"30ca69dd01575f9f6ab4cce417bed71ccd669efa710af0112656a6ba5a08de6a"} Dec 03 09:01:01 crc kubenswrapper[4947]: I1203 09:01:01.441469 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tf5pv" podUID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerName="registry-server" containerID="cri-o://3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c" gracePeriod=2 Dec 03 09:01:01 crc kubenswrapper[4947]: I1203 09:01:01.465091 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412541-j7np7" podStartSLOduration=1.465070397 podStartE2EDuration="1.465070397s" podCreationTimestamp="2025-12-03 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:01:01.457373329 +0000 UTC m=+7922.718327755" watchObservedRunningTime="2025-12-03 09:01:01.465070397 +0000 UTC m=+7922.726024823" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.264186 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.378231 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-utilities\") pod \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.378290 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfm9c\" (UniqueName: \"kubernetes.io/projected/79ed4b40-6bd1-479d-b4da-7c4f86192f32-kube-api-access-tfm9c\") pod \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.378456 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-catalog-content\") pod \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\" (UID: \"79ed4b40-6bd1-479d-b4da-7c4f86192f32\") " Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.380058 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-utilities" (OuterVolumeSpecName: "utilities") pod "79ed4b40-6bd1-479d-b4da-7c4f86192f32" (UID: "79ed4b40-6bd1-479d-b4da-7c4f86192f32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.392115 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79ed4b40-6bd1-479d-b4da-7c4f86192f32-kube-api-access-tfm9c" (OuterVolumeSpecName: "kube-api-access-tfm9c") pod "79ed4b40-6bd1-479d-b4da-7c4f86192f32" (UID: "79ed4b40-6bd1-479d-b4da-7c4f86192f32"). InnerVolumeSpecName "kube-api-access-tfm9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.429269 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79ed4b40-6bd1-479d-b4da-7c4f86192f32" (UID: "79ed4b40-6bd1-479d-b4da-7c4f86192f32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.453902 4947 generic.go:334] "Generic (PLEG): container finished" podID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerID="3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c" exitCode=0 Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.454082 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tf5pv" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.454137 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5pv" event={"ID":"79ed4b40-6bd1-479d-b4da-7c4f86192f32","Type":"ContainerDied","Data":"3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c"} Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.454175 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tf5pv" event={"ID":"79ed4b40-6bd1-479d-b4da-7c4f86192f32","Type":"ContainerDied","Data":"1d1b7e97dbf0f7fbafb7ef09b4b84e1fd8b9852f9cd8e7d0286b767e4b4d218d"} Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.454196 4947 scope.go:117] "RemoveContainer" containerID="3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.476347 4947 scope.go:117] "RemoveContainer" containerID="e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f" Dec 03 09:01:02 crc kubenswrapper[4947]: 
I1203 09:01:02.480359 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.480390 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79ed4b40-6bd1-479d-b4da-7c4f86192f32-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.480402 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfm9c\" (UniqueName: \"kubernetes.io/projected/79ed4b40-6bd1-479d-b4da-7c4f86192f32-kube-api-access-tfm9c\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.505363 4947 scope.go:117] "RemoveContainer" containerID="49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.511759 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tf5pv"] Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.522952 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tf5pv"] Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.540822 4947 scope.go:117] "RemoveContainer" containerID="3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c" Dec 03 09:01:02 crc kubenswrapper[4947]: E1203 09:01:02.541942 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c\": container with ID starting with 3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c not found: ID does not exist" containerID="3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 
09:01:02.541995 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c"} err="failed to get container status \"3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c\": rpc error: code = NotFound desc = could not find container \"3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c\": container with ID starting with 3bd16d3c04188800c7eda890b8efa480dfa191b2228e9c0e62fb63eb8d570e5c not found: ID does not exist" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.542024 4947 scope.go:117] "RemoveContainer" containerID="e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f" Dec 03 09:01:02 crc kubenswrapper[4947]: E1203 09:01:02.542429 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f\": container with ID starting with e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f not found: ID does not exist" containerID="e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.542465 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f"} err="failed to get container status \"e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f\": rpc error: code = NotFound desc = could not find container \"e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f\": container with ID starting with e29fa64c44d599baa264454d81c4cc6385f9f2337f53119da994a1d08cbdf18f not found: ID does not exist" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.542505 4947 scope.go:117] "RemoveContainer" containerID="49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd" Dec 03 09:01:02 crc 
kubenswrapper[4947]: E1203 09:01:02.542893 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd\": container with ID starting with 49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd not found: ID does not exist" containerID="49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd" Dec 03 09:01:02 crc kubenswrapper[4947]: I1203 09:01:02.542924 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd"} err="failed to get container status \"49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd\": rpc error: code = NotFound desc = could not find container \"49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd\": container with ID starting with 49ac7fa66149a0c78f093fb3079af917e740b8970042d06c0e48c850ce579efd not found: ID does not exist" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.118095 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" path="/var/lib/kubelet/pods/79ed4b40-6bd1-479d-b4da-7c4f86192f32/volumes" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.467027 4947 generic.go:334] "Generic (PLEG): container finished" podID="7c09f9ec-b2ae-4620-9815-20949c1c08ba" containerID="e3686de612795bd2fe106463f17681dadd439943c4fd0c63138f0a455ef2c465" exitCode=0 Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.467099 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-j7np7" event={"ID":"7c09f9ec-b2ae-4620-9815-20949c1c08ba","Type":"ContainerDied","Data":"e3686de612795bd2fe106463f17681dadd439943c4fd0c63138f0a455ef2c465"} Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.812120 4947 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-db-create-zjrft"] Dec 03 09:01:03 crc kubenswrapper[4947]: E1203 09:01:03.813000 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerName="extract-utilities" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.813027 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerName="extract-utilities" Dec 03 09:01:03 crc kubenswrapper[4947]: E1203 09:01:03.813051 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerName="extract-content" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.813060 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerName="extract-content" Dec 03 09:01:03 crc kubenswrapper[4947]: E1203 09:01:03.813094 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerName="registry-server" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.813104 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerName="registry-server" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.813315 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="79ed4b40-6bd1-479d-b4da-7c4f86192f32" containerName="registry-server" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.814222 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.827941 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zjrft"] Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.905684 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dbf1504-f700-40e9-9700-2db114b8c987-operator-scripts\") pod \"neutron-db-create-zjrft\" (UID: \"7dbf1504-f700-40e9-9700-2db114b8c987\") " pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.905754 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qggn\" (UniqueName: \"kubernetes.io/projected/7dbf1504-f700-40e9-9700-2db114b8c987-kube-api-access-4qggn\") pod \"neutron-db-create-zjrft\" (UID: \"7dbf1504-f700-40e9-9700-2db114b8c987\") " pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.912396 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1207-account-create-update-lfxck"] Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.913515 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.915720 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 09:01:03 crc kubenswrapper[4947]: I1203 09:01:03.928733 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1207-account-create-update-lfxck"] Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.006843 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db4f3da-a494-465f-a92a-7cf9bae6a13d-operator-scripts\") pod \"neutron-1207-account-create-update-lfxck\" (UID: \"4db4f3da-a494-465f-a92a-7cf9bae6a13d\") " pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.007128 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j7sh\" (UniqueName: \"kubernetes.io/projected/4db4f3da-a494-465f-a92a-7cf9bae6a13d-kube-api-access-8j7sh\") pod \"neutron-1207-account-create-update-lfxck\" (UID: \"4db4f3da-a494-465f-a92a-7cf9bae6a13d\") " pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.007302 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dbf1504-f700-40e9-9700-2db114b8c987-operator-scripts\") pod \"neutron-db-create-zjrft\" (UID: \"7dbf1504-f700-40e9-9700-2db114b8c987\") " pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.007346 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qggn\" (UniqueName: \"kubernetes.io/projected/7dbf1504-f700-40e9-9700-2db114b8c987-kube-api-access-4qggn\") pod \"neutron-db-create-zjrft\" (UID: 
\"7dbf1504-f700-40e9-9700-2db114b8c987\") " pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.008319 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dbf1504-f700-40e9-9700-2db114b8c987-operator-scripts\") pod \"neutron-db-create-zjrft\" (UID: \"7dbf1504-f700-40e9-9700-2db114b8c987\") " pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.035435 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qggn\" (UniqueName: \"kubernetes.io/projected/7dbf1504-f700-40e9-9700-2db114b8c987-kube-api-access-4qggn\") pod \"neutron-db-create-zjrft\" (UID: \"7dbf1504-f700-40e9-9700-2db114b8c987\") " pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.108400 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db4f3da-a494-465f-a92a-7cf9bae6a13d-operator-scripts\") pod \"neutron-1207-account-create-update-lfxck\" (UID: \"4db4f3da-a494-465f-a92a-7cf9bae6a13d\") " pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.108537 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j7sh\" (UniqueName: \"kubernetes.io/projected/4db4f3da-a494-465f-a92a-7cf9bae6a13d-kube-api-access-8j7sh\") pod \"neutron-1207-account-create-update-lfxck\" (UID: \"4db4f3da-a494-465f-a92a-7cf9bae6a13d\") " pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.109254 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db4f3da-a494-465f-a92a-7cf9bae6a13d-operator-scripts\") pod \"neutron-1207-account-create-update-lfxck\" (UID: 
\"4db4f3da-a494-465f-a92a-7cf9bae6a13d\") " pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.136070 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j7sh\" (UniqueName: \"kubernetes.io/projected/4db4f3da-a494-465f-a92a-7cf9bae6a13d-kube-api-access-8j7sh\") pod \"neutron-1207-account-create-update-lfxck\" (UID: \"4db4f3da-a494-465f-a92a-7cf9bae6a13d\") " pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.136920 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.228219 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.586554 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zjrft"] Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.689268 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1207-account-create-update-lfxck"] Dec 03 09:01:04 crc kubenswrapper[4947]: W1203 09:01:04.695562 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4db4f3da_a494_465f_a92a_7cf9bae6a13d.slice/crio-5f0b2c40c2946674367e9c6f44f278812e2e39ab3cc3b7be485995f57cb612a4 WatchSource:0}: Error finding container 5f0b2c40c2946674367e9c6f44f278812e2e39ab3cc3b7be485995f57cb612a4: Status 404 returned error can't find the container with id 5f0b2c40c2946674367e9c6f44f278812e2e39ab3cc3b7be485995f57cb612a4 Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.727948 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.828964 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-fernet-keys\") pod \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.829015 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-config-data\") pod \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.829199 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zc6z\" (UniqueName: \"kubernetes.io/projected/7c09f9ec-b2ae-4620-9815-20949c1c08ba-kube-api-access-4zc6z\") pod \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.829236 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-combined-ca-bundle\") pod \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\" (UID: \"7c09f9ec-b2ae-4620-9815-20949c1c08ba\") " Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.834690 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c09f9ec-b2ae-4620-9815-20949c1c08ba-kube-api-access-4zc6z" (OuterVolumeSpecName: "kube-api-access-4zc6z") pod "7c09f9ec-b2ae-4620-9815-20949c1c08ba" (UID: "7c09f9ec-b2ae-4620-9815-20949c1c08ba"). InnerVolumeSpecName "kube-api-access-4zc6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.834926 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7c09f9ec-b2ae-4620-9815-20949c1c08ba" (UID: "7c09f9ec-b2ae-4620-9815-20949c1c08ba"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.853450 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c09f9ec-b2ae-4620-9815-20949c1c08ba" (UID: "7c09f9ec-b2ae-4620-9815-20949c1c08ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.877674 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-config-data" (OuterVolumeSpecName: "config-data") pod "7c09f9ec-b2ae-4620-9815-20949c1c08ba" (UID: "7c09f9ec-b2ae-4620-9815-20949c1c08ba"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.931224 4947 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.931255 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.931265 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zc6z\" (UniqueName: \"kubernetes.io/projected/7c09f9ec-b2ae-4620-9815-20949c1c08ba-kube-api-access-4zc6z\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:04 crc kubenswrapper[4947]: I1203 09:01:04.931276 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c09f9ec-b2ae-4620-9815-20949c1c08ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:05 crc kubenswrapper[4947]: I1203 09:01:05.488520 4947 generic.go:334] "Generic (PLEG): container finished" podID="4db4f3da-a494-465f-a92a-7cf9bae6a13d" containerID="0dcbe87c3a5e133304b06cc7c7a9ce6b08fa1be553daffedaf8ad6ee3179b9bf" exitCode=0 Dec 03 09:01:05 crc kubenswrapper[4947]: I1203 09:01:05.488735 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1207-account-create-update-lfxck" event={"ID":"4db4f3da-a494-465f-a92a-7cf9bae6a13d","Type":"ContainerDied","Data":"0dcbe87c3a5e133304b06cc7c7a9ce6b08fa1be553daffedaf8ad6ee3179b9bf"} Dec 03 09:01:05 crc kubenswrapper[4947]: I1203 09:01:05.489171 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1207-account-create-update-lfxck" 
event={"ID":"4db4f3da-a494-465f-a92a-7cf9bae6a13d","Type":"ContainerStarted","Data":"5f0b2c40c2946674367e9c6f44f278812e2e39ab3cc3b7be485995f57cb612a4"} Dec 03 09:01:05 crc kubenswrapper[4947]: I1203 09:01:05.492419 4947 generic.go:334] "Generic (PLEG): container finished" podID="7dbf1504-f700-40e9-9700-2db114b8c987" containerID="ce16febdf607363810ef12cd31b0836935f3fa39b8988f042b29db6886d5e84d" exitCode=0 Dec 03 09:01:05 crc kubenswrapper[4947]: I1203 09:01:05.492456 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zjrft" event={"ID":"7dbf1504-f700-40e9-9700-2db114b8c987","Type":"ContainerDied","Data":"ce16febdf607363810ef12cd31b0836935f3fa39b8988f042b29db6886d5e84d"} Dec 03 09:01:05 crc kubenswrapper[4947]: I1203 09:01:05.492680 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zjrft" event={"ID":"7dbf1504-f700-40e9-9700-2db114b8c987","Type":"ContainerStarted","Data":"3753d170d04cd0fd49e975aaf9b939c0f0156c9579a48f8813e00b500b57c370"} Dec 03 09:01:05 crc kubenswrapper[4947]: I1203 09:01:05.494476 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-j7np7" event={"ID":"7c09f9ec-b2ae-4620-9815-20949c1c08ba","Type":"ContainerDied","Data":"30ca69dd01575f9f6ab4cce417bed71ccd669efa710af0112656a6ba5a08de6a"} Dec 03 09:01:05 crc kubenswrapper[4947]: I1203 09:01:05.494603 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30ca69dd01575f9f6ab4cce417bed71ccd669efa710af0112656a6ba5a08de6a" Dec 03 09:01:05 crc kubenswrapper[4947]: I1203 09:01:05.494553 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412541-j7np7" Dec 03 09:01:06 crc kubenswrapper[4947]: I1203 09:01:06.996935 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.002014 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.089439 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dbf1504-f700-40e9-9700-2db114b8c987-operator-scripts\") pod \"7dbf1504-f700-40e9-9700-2db114b8c987\" (UID: \"7dbf1504-f700-40e9-9700-2db114b8c987\") " Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.089724 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qggn\" (UniqueName: \"kubernetes.io/projected/7dbf1504-f700-40e9-9700-2db114b8c987-kube-api-access-4qggn\") pod \"7dbf1504-f700-40e9-9700-2db114b8c987\" (UID: \"7dbf1504-f700-40e9-9700-2db114b8c987\") " Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.091342 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dbf1504-f700-40e9-9700-2db114b8c987-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7dbf1504-f700-40e9-9700-2db114b8c987" (UID: "7dbf1504-f700-40e9-9700-2db114b8c987"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.106786 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dbf1504-f700-40e9-9700-2db114b8c987-kube-api-access-4qggn" (OuterVolumeSpecName: "kube-api-access-4qggn") pod "7dbf1504-f700-40e9-9700-2db114b8c987" (UID: "7dbf1504-f700-40e9-9700-2db114b8c987"). InnerVolumeSpecName "kube-api-access-4qggn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.192406 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db4f3da-a494-465f-a92a-7cf9bae6a13d-operator-scripts\") pod \"4db4f3da-a494-465f-a92a-7cf9bae6a13d\" (UID: \"4db4f3da-a494-465f-a92a-7cf9bae6a13d\") " Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.192949 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4db4f3da-a494-465f-a92a-7cf9bae6a13d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4db4f3da-a494-465f-a92a-7cf9bae6a13d" (UID: "4db4f3da-a494-465f-a92a-7cf9bae6a13d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.193099 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j7sh\" (UniqueName: \"kubernetes.io/projected/4db4f3da-a494-465f-a92a-7cf9bae6a13d-kube-api-access-8j7sh\") pod \"4db4f3da-a494-465f-a92a-7cf9bae6a13d\" (UID: \"4db4f3da-a494-465f-a92a-7cf9bae6a13d\") " Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.193971 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qggn\" (UniqueName: \"kubernetes.io/projected/7dbf1504-f700-40e9-9700-2db114b8c987-kube-api-access-4qggn\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.194103 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4db4f3da-a494-465f-a92a-7cf9bae6a13d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.194178 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7dbf1504-f700-40e9-9700-2db114b8c987-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.198799 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4db4f3da-a494-465f-a92a-7cf9bae6a13d-kube-api-access-8j7sh" (OuterVolumeSpecName: "kube-api-access-8j7sh") pod "4db4f3da-a494-465f-a92a-7cf9bae6a13d" (UID: "4db4f3da-a494-465f-a92a-7cf9bae6a13d"). InnerVolumeSpecName "kube-api-access-8j7sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.296660 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j7sh\" (UniqueName: \"kubernetes.io/projected/4db4f3da-a494-465f-a92a-7cf9bae6a13d-kube-api-access-8j7sh\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.517876 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1207-account-create-update-lfxck" event={"ID":"4db4f3da-a494-465f-a92a-7cf9bae6a13d","Type":"ContainerDied","Data":"5f0b2c40c2946674367e9c6f44f278812e2e39ab3cc3b7be485995f57cb612a4"} Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.517893 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1207-account-create-update-lfxck" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.517989 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f0b2c40c2946674367e9c6f44f278812e2e39ab3cc3b7be485995f57cb612a4" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.519773 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zjrft" event={"ID":"7dbf1504-f700-40e9-9700-2db114b8c987","Type":"ContainerDied","Data":"3753d170d04cd0fd49e975aaf9b939c0f0156c9579a48f8813e00b500b57c370"} Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.519809 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3753d170d04cd0fd49e975aaf9b939c0f0156c9579a48f8813e00b500b57c370" Dec 03 09:01:07 crc kubenswrapper[4947]: I1203 09:01:07.519850 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zjrft" Dec 03 09:01:08 crc kubenswrapper[4947]: I1203 09:01:08.082892 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:01:08 crc kubenswrapper[4947]: E1203 09:01:08.083140 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.250142 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-565rb"] Dec 03 09:01:09 crc kubenswrapper[4947]: E1203 09:01:09.250736 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4db4f3da-a494-465f-a92a-7cf9bae6a13d" containerName="mariadb-account-create-update" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.250749 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db4f3da-a494-465f-a92a-7cf9bae6a13d" containerName="mariadb-account-create-update" Dec 03 09:01:09 crc kubenswrapper[4947]: E1203 09:01:09.250774 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dbf1504-f700-40e9-9700-2db114b8c987" containerName="mariadb-database-create" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.250781 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dbf1504-f700-40e9-9700-2db114b8c987" containerName="mariadb-database-create" Dec 03 09:01:09 crc kubenswrapper[4947]: E1203 09:01:09.250794 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c09f9ec-b2ae-4620-9815-20949c1c08ba" containerName="keystone-cron" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.250802 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c09f9ec-b2ae-4620-9815-20949c1c08ba" containerName="keystone-cron" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.250948 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4db4f3da-a494-465f-a92a-7cf9bae6a13d" containerName="mariadb-account-create-update" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.250969 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dbf1504-f700-40e9-9700-2db114b8c987" containerName="mariadb-database-create" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.250989 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c09f9ec-b2ae-4620-9815-20949c1c08ba" containerName="keystone-cron" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.251569 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.254770 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8fmwb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.254898 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.254899 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.263731 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-565rb"] Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.331957 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-combined-ca-bundle\") pod \"neutron-db-sync-565rb\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.332019 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-config\") pod \"neutron-db-sync-565rb\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.332306 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7wcb\" (UniqueName: \"kubernetes.io/projected/354678be-9852-4bb6-ab81-137e937fde3b-kube-api-access-s7wcb\") pod \"neutron-db-sync-565rb\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.433947 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s7wcb\" (UniqueName: \"kubernetes.io/projected/354678be-9852-4bb6-ab81-137e937fde3b-kube-api-access-s7wcb\") pod \"neutron-db-sync-565rb\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.434055 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-combined-ca-bundle\") pod \"neutron-db-sync-565rb\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.434119 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-config\") pod \"neutron-db-sync-565rb\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.440417 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-config\") pod \"neutron-db-sync-565rb\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.454921 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-combined-ca-bundle\") pod \"neutron-db-sync-565rb\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.458144 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7wcb\" (UniqueName: 
\"kubernetes.io/projected/354678be-9852-4bb6-ab81-137e937fde3b-kube-api-access-s7wcb\") pod \"neutron-db-sync-565rb\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:09 crc kubenswrapper[4947]: I1203 09:01:09.572015 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:10 crc kubenswrapper[4947]: I1203 09:01:10.057448 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-565rb"] Dec 03 09:01:10 crc kubenswrapper[4947]: W1203 09:01:10.061756 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod354678be_9852_4bb6_ab81_137e937fde3b.slice/crio-a9fc49ed9e822fd35e50324fdc548a4eb81eb5ecd159573c165f6204954c8416 WatchSource:0}: Error finding container a9fc49ed9e822fd35e50324fdc548a4eb81eb5ecd159573c165f6204954c8416: Status 404 returned error can't find the container with id a9fc49ed9e822fd35e50324fdc548a4eb81eb5ecd159573c165f6204954c8416 Dec 03 09:01:10 crc kubenswrapper[4947]: I1203 09:01:10.545321 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-565rb" event={"ID":"354678be-9852-4bb6-ab81-137e937fde3b","Type":"ContainerStarted","Data":"c093257dc35bef67ec8c14c5ace1091f16c97aafc4c599cbb76f7cccb2e36539"} Dec 03 09:01:10 crc kubenswrapper[4947]: I1203 09:01:10.545634 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-565rb" event={"ID":"354678be-9852-4bb6-ab81-137e937fde3b","Type":"ContainerStarted","Data":"a9fc49ed9e822fd35e50324fdc548a4eb81eb5ecd159573c165f6204954c8416"} Dec 03 09:01:10 crc kubenswrapper[4947]: I1203 09:01:10.561188 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-565rb" podStartSLOduration=1.561171643 podStartE2EDuration="1.561171643s" podCreationTimestamp="2025-12-03 09:01:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:01:10.5577561 +0000 UTC m=+7931.818710526" watchObservedRunningTime="2025-12-03 09:01:10.561171643 +0000 UTC m=+7931.822126059" Dec 03 09:01:14 crc kubenswrapper[4947]: I1203 09:01:14.585936 4947 generic.go:334] "Generic (PLEG): container finished" podID="354678be-9852-4bb6-ab81-137e937fde3b" containerID="c093257dc35bef67ec8c14c5ace1091f16c97aafc4c599cbb76f7cccb2e36539" exitCode=0 Dec 03 09:01:14 crc kubenswrapper[4947]: I1203 09:01:14.585981 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-565rb" event={"ID":"354678be-9852-4bb6-ab81-137e937fde3b","Type":"ContainerDied","Data":"c093257dc35bef67ec8c14c5ace1091f16c97aafc4c599cbb76f7cccb2e36539"} Dec 03 09:01:15 crc kubenswrapper[4947]: I1203 09:01:15.982367 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.055158 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7wcb\" (UniqueName: \"kubernetes.io/projected/354678be-9852-4bb6-ab81-137e937fde3b-kube-api-access-s7wcb\") pod \"354678be-9852-4bb6-ab81-137e937fde3b\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.055301 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-config\") pod \"354678be-9852-4bb6-ab81-137e937fde3b\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.055336 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-combined-ca-bundle\") pod 
\"354678be-9852-4bb6-ab81-137e937fde3b\" (UID: \"354678be-9852-4bb6-ab81-137e937fde3b\") " Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.066803 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/354678be-9852-4bb6-ab81-137e937fde3b-kube-api-access-s7wcb" (OuterVolumeSpecName: "kube-api-access-s7wcb") pod "354678be-9852-4bb6-ab81-137e937fde3b" (UID: "354678be-9852-4bb6-ab81-137e937fde3b"). InnerVolumeSpecName "kube-api-access-s7wcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.079699 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "354678be-9852-4bb6-ab81-137e937fde3b" (UID: "354678be-9852-4bb6-ab81-137e937fde3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.080939 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-config" (OuterVolumeSpecName: "config") pod "354678be-9852-4bb6-ab81-137e937fde3b" (UID: "354678be-9852-4bb6-ab81-137e937fde3b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.159562 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.159617 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/354678be-9852-4bb6-ab81-137e937fde3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.159630 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7wcb\" (UniqueName: \"kubernetes.io/projected/354678be-9852-4bb6-ab81-137e937fde3b-kube-api-access-s7wcb\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.603791 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-565rb" event={"ID":"354678be-9852-4bb6-ab81-137e937fde3b","Type":"ContainerDied","Data":"a9fc49ed9e822fd35e50324fdc548a4eb81eb5ecd159573c165f6204954c8416"} Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.603821 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-565rb" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.603840 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9fc49ed9e822fd35e50324fdc548a4eb81eb5ecd159573c165f6204954c8416" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.865441 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh"] Dec 03 09:01:16 crc kubenswrapper[4947]: E1203 09:01:16.866014 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354678be-9852-4bb6-ab81-137e937fde3b" containerName="neutron-db-sync" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.866030 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="354678be-9852-4bb6-ab81-137e937fde3b" containerName="neutron-db-sync" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.866203 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="354678be-9852-4bb6-ab81-137e937fde3b" containerName="neutron-db-sync" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.867113 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.894138 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh"] Dec 03 09:01:16 crc kubenswrapper[4947]: I1203 09:01:16.999936 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b4c4b69d9-lgsnm"] Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.002392 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.008248 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-8fmwb" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.008451 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.008619 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.031080 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.031271 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-config\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.031333 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s77cx\" (UniqueName: \"kubernetes.io/projected/5bbb7417-1020-4257-bd46-ed5cb0f54649-kube-api-access-s77cx\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.031357 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-dns-svc\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.031464 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.044270 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b4c4b69d9-lgsnm"] Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.133531 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s77cx\" (UniqueName: \"kubernetes.io/projected/5bbb7417-1020-4257-bd46-ed5cb0f54649-kube-api-access-s77cx\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.133582 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-dns-svc\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.133632 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 
09:01:17.133805 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-config\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.133832 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfqgn\" (UniqueName: \"kubernetes.io/projected/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-kube-api-access-vfqgn\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.133915 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-combined-ca-bundle\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.133939 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.133965 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-httpd-config\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.133994 
4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-config\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.134442 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.135068 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.136156 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-config\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.136919 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-dns-svc\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.155881 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s77cx\" (UniqueName: 
\"kubernetes.io/projected/5bbb7417-1020-4257-bd46-ed5cb0f54649-kube-api-access-s77cx\") pod \"dnsmasq-dns-6f6c4c8ffc-lqxrh\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.187196 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.235698 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-httpd-config\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.235855 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-config\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.235887 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfqgn\" (UniqueName: \"kubernetes.io/projected/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-kube-api-access-vfqgn\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.235952 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-combined-ca-bundle\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.239576 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-httpd-config\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.240139 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-combined-ca-bundle\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.241409 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-config\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.261507 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfqgn\" (UniqueName: \"kubernetes.io/projected/94a693aa-cb18-4936-8a2a-2a7fcc5feec4-kube-api-access-vfqgn\") pod \"neutron-7b4c4b69d9-lgsnm\" (UID: \"94a693aa-cb18-4936-8a2a-2a7fcc5feec4\") " pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.337438 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.679936 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh"] Dec 03 09:01:17 crc kubenswrapper[4947]: I1203 09:01:17.939150 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b4c4b69d9-lgsnm"] Dec 03 09:01:17 crc kubenswrapper[4947]: W1203 09:01:17.941230 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94a693aa_cb18_4936_8a2a_2a7fcc5feec4.slice/crio-0fd28c656e50c9a85e722105c14f80884f13120a87e696079b4d450c0475c60d WatchSource:0}: Error finding container 0fd28c656e50c9a85e722105c14f80884f13120a87e696079b4d450c0475c60d: Status 404 returned error can't find the container with id 0fd28c656e50c9a85e722105c14f80884f13120a87e696079b4d450c0475c60d Dec 03 09:01:18 crc kubenswrapper[4947]: I1203 09:01:18.623483 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b4c4b69d9-lgsnm" event={"ID":"94a693aa-cb18-4936-8a2a-2a7fcc5feec4","Type":"ContainerStarted","Data":"ca231b3beae4a9b045be53686395e7d06388becb5a78fcd2db085041e401e369"} Dec 03 09:01:18 crc kubenswrapper[4947]: I1203 09:01:18.628226 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b4c4b69d9-lgsnm" event={"ID":"94a693aa-cb18-4936-8a2a-2a7fcc5feec4","Type":"ContainerStarted","Data":"4ff6de44a19ca03240bf761016dd21ca3f5bc52982355722a16fc39181ca6bdb"} Dec 03 09:01:18 crc kubenswrapper[4947]: I1203 09:01:18.628289 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b4c4b69d9-lgsnm" event={"ID":"94a693aa-cb18-4936-8a2a-2a7fcc5feec4","Type":"ContainerStarted","Data":"0fd28c656e50c9a85e722105c14f80884f13120a87e696079b4d450c0475c60d"} Dec 03 09:01:18 crc kubenswrapper[4947]: I1203 09:01:18.628322 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:18 crc kubenswrapper[4947]: I1203 09:01:18.636620 4947 generic.go:334] "Generic (PLEG): container finished" podID="5bbb7417-1020-4257-bd46-ed5cb0f54649" containerID="970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78" exitCode=0 Dec 03 09:01:18 crc kubenswrapper[4947]: I1203 09:01:18.636692 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" event={"ID":"5bbb7417-1020-4257-bd46-ed5cb0f54649","Type":"ContainerDied","Data":"970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78"} Dec 03 09:01:18 crc kubenswrapper[4947]: I1203 09:01:18.636722 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" event={"ID":"5bbb7417-1020-4257-bd46-ed5cb0f54649","Type":"ContainerStarted","Data":"e031ef818b506b1d29fb56d978d5ccd34ad17806e43553f31da3f4df82ea6be0"} Dec 03 09:01:18 crc kubenswrapper[4947]: I1203 09:01:18.663004 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7b4c4b69d9-lgsnm" podStartSLOduration=2.662984573 podStartE2EDuration="2.662984573s" podCreationTimestamp="2025-12-03 09:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:01:18.655343717 +0000 UTC m=+7939.916298143" watchObservedRunningTime="2025-12-03 09:01:18.662984573 +0000 UTC m=+7939.923938999" Dec 03 09:01:19 crc kubenswrapper[4947]: I1203 09:01:19.647892 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" event={"ID":"5bbb7417-1020-4257-bd46-ed5cb0f54649","Type":"ContainerStarted","Data":"28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a"} Dec 03 09:01:19 crc kubenswrapper[4947]: I1203 09:01:19.648673 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:19 
crc kubenswrapper[4947]: I1203 09:01:19.673453 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" podStartSLOduration=3.6734350449999997 podStartE2EDuration="3.673435045s" podCreationTimestamp="2025-12-03 09:01:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:01:19.668050759 +0000 UTC m=+7940.929005195" watchObservedRunningTime="2025-12-03 09:01:19.673435045 +0000 UTC m=+7940.934389461" Dec 03 09:01:21 crc kubenswrapper[4947]: I1203 09:01:21.083829 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:01:21 crc kubenswrapper[4947]: E1203 09:01:21.084728 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.189817 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.288924 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54ddd5f977-d4wrt"] Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.289195 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" podUID="d40e27d8-38a2-496b-b073-dc91d4c878ef" containerName="dnsmasq-dns" containerID="cri-o://c0756c85a337b15f2baa5c754ec31c13edc666a143ddd2ccadd1df4a75b63843" gracePeriod=10 Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 
09:01:27.712052 4947 generic.go:334] "Generic (PLEG): container finished" podID="d40e27d8-38a2-496b-b073-dc91d4c878ef" containerID="c0756c85a337b15f2baa5c754ec31c13edc666a143ddd2ccadd1df4a75b63843" exitCode=0 Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.712138 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" event={"ID":"d40e27d8-38a2-496b-b073-dc91d4c878ef","Type":"ContainerDied","Data":"c0756c85a337b15f2baa5c754ec31c13edc666a143ddd2ccadd1df4a75b63843"} Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.800763 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.832325 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-sb\") pod \"d40e27d8-38a2-496b-b073-dc91d4c878ef\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.832384 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-nb\") pod \"d40e27d8-38a2-496b-b073-dc91d4c878ef\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.832526 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7m9\" (UniqueName: \"kubernetes.io/projected/d40e27d8-38a2-496b-b073-dc91d4c878ef-kube-api-access-xc7m9\") pod \"d40e27d8-38a2-496b-b073-dc91d4c878ef\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.832558 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-dns-svc\") pod \"d40e27d8-38a2-496b-b073-dc91d4c878ef\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.832582 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-config\") pod \"d40e27d8-38a2-496b-b073-dc91d4c878ef\" (UID: \"d40e27d8-38a2-496b-b073-dc91d4c878ef\") " Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.880089 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40e27d8-38a2-496b-b073-dc91d4c878ef-kube-api-access-xc7m9" (OuterVolumeSpecName: "kube-api-access-xc7m9") pod "d40e27d8-38a2-496b-b073-dc91d4c878ef" (UID: "d40e27d8-38a2-496b-b073-dc91d4c878ef"). InnerVolumeSpecName "kube-api-access-xc7m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.880541 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-config" (OuterVolumeSpecName: "config") pod "d40e27d8-38a2-496b-b073-dc91d4c878ef" (UID: "d40e27d8-38a2-496b-b073-dc91d4c878ef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.900919 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d40e27d8-38a2-496b-b073-dc91d4c878ef" (UID: "d40e27d8-38a2-496b-b073-dc91d4c878ef"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.923043 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d40e27d8-38a2-496b-b073-dc91d4c878ef" (UID: "d40e27d8-38a2-496b-b073-dc91d4c878ef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.924274 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d40e27d8-38a2-496b-b073-dc91d4c878ef" (UID: "d40e27d8-38a2-496b-b073-dc91d4c878ef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.934097 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.934162 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.934174 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7m9\" (UniqueName: \"kubernetes.io/projected/d40e27d8-38a2-496b-b073-dc91d4c878ef-kube-api-access-xc7m9\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.934188 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-dns-svc\") on node \"crc\" 
DevicePath \"\"" Dec 03 09:01:27 crc kubenswrapper[4947]: I1203 09:01:27.934198 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d40e27d8-38a2-496b-b073-dc91d4c878ef-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:28 crc kubenswrapper[4947]: I1203 09:01:28.723280 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" event={"ID":"d40e27d8-38a2-496b-b073-dc91d4c878ef","Type":"ContainerDied","Data":"c3a2a4fdddb540c772c7b865cc615b00c23e8ab17971a7150804182c2700f45d"} Dec 03 09:01:28 crc kubenswrapper[4947]: I1203 09:01:28.723352 4947 scope.go:117] "RemoveContainer" containerID="c0756c85a337b15f2baa5c754ec31c13edc666a143ddd2ccadd1df4a75b63843" Dec 03 09:01:28 crc kubenswrapper[4947]: I1203 09:01:28.723378 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54ddd5f977-d4wrt" Dec 03 09:01:28 crc kubenswrapper[4947]: I1203 09:01:28.747591 4947 scope.go:117] "RemoveContainer" containerID="836c4f926a460cbcbb6ab24afc0b48068ca7ddf6ea897d7ab6e4c0fdee9d4783" Dec 03 09:01:28 crc kubenswrapper[4947]: I1203 09:01:28.772231 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54ddd5f977-d4wrt"] Dec 03 09:01:28 crc kubenswrapper[4947]: I1203 09:01:28.780595 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54ddd5f977-d4wrt"] Dec 03 09:01:29 crc kubenswrapper[4947]: I1203 09:01:29.094973 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40e27d8-38a2-496b-b073-dc91d4c878ef" path="/var/lib/kubelet/pods/d40e27d8-38a2-496b-b073-dc91d4c878ef/volumes" Dec 03 09:01:32 crc kubenswrapper[4947]: I1203 09:01:32.083484 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:01:32 crc kubenswrapper[4947]: E1203 09:01:32.084084 4947 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:01:43 crc kubenswrapper[4947]: I1203 09:01:43.087553 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:01:43 crc kubenswrapper[4947]: E1203 09:01:43.090096 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:01:47 crc kubenswrapper[4947]: I1203 09:01:47.349334 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7b4c4b69d9-lgsnm" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.083280 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:01:56 crc kubenswrapper[4947]: E1203 09:01:56.084306 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.894637 4947 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jn9hw"] Dec 03 09:01:56 crc kubenswrapper[4947]: E1203 09:01:56.895082 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40e27d8-38a2-496b-b073-dc91d4c878ef" containerName="dnsmasq-dns" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.895106 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40e27d8-38a2-496b-b073-dc91d4c878ef" containerName="dnsmasq-dns" Dec 03 09:01:56 crc kubenswrapper[4947]: E1203 09:01:56.895147 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40e27d8-38a2-496b-b073-dc91d4c878ef" containerName="init" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.895155 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40e27d8-38a2-496b-b073-dc91d4c878ef" containerName="init" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.895363 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40e27d8-38a2-496b-b073-dc91d4c878ef" containerName="dnsmasq-dns" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.896167 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.897857 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.897900 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.897871 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.898035 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.898921 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-8zccq" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.909961 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jn9hw"] Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.979162 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-combined-ca-bundle\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.979278 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-scripts\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.979297 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-swiftconf\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.979327 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-ring-data-devices\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.979342 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-dispersionconf\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.979374 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e870e8ba-007a-48cd-bfbb-fa8051fabf33-etc-swift\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:56 crc kubenswrapper[4947]: I1203 09:01:56.979389 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdkjh\" (UniqueName: \"kubernetes.io/projected/e870e8ba-007a-48cd-bfbb-fa8051fabf33-kube-api-access-qdkjh\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.021219 4947 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cf689ffdc-g58ps"] Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.022872 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.050608 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf689ffdc-g58ps"] Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084162 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-combined-ca-bundle\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084264 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084348 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89znx\" (UniqueName: \"kubernetes.io/projected/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-kube-api-access-89znx\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084386 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-config\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: 
\"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084421 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-scripts\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084446 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-swiftconf\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084471 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084520 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-ring-data-devices\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084535 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-dispersionconf\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " 
pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084565 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e870e8ba-007a-48cd-bfbb-fa8051fabf33-etc-swift\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084665 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdkjh\" (UniqueName: \"kubernetes.io/projected/e870e8ba-007a-48cd-bfbb-fa8051fabf33-kube-api-access-qdkjh\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.084711 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-dns-svc\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.085339 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-scripts\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.097852 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e870e8ba-007a-48cd-bfbb-fa8051fabf33-etc-swift\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 
09:01:57.098476 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-ring-data-devices\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.098952 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-swiftconf\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.099189 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-dispersionconf\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.099692 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-combined-ca-bundle\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.125249 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdkjh\" (UniqueName: \"kubernetes.io/projected/e870e8ba-007a-48cd-bfbb-fa8051fabf33-kube-api-access-qdkjh\") pod \"swift-ring-rebalance-jn9hw\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.186817 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-89znx\" (UniqueName: \"kubernetes.io/projected/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-kube-api-access-89znx\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.186880 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-config\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.186922 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.186972 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-dns-svc\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.187043 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.187642 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-config\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.188422 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.188785 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.188961 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-dns-svc\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.204526 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89znx\" (UniqueName: \"kubernetes.io/projected/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-kube-api-access-89znx\") pod \"dnsmasq-dns-5cf689ffdc-g58ps\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.217108 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.339233 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.692266 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jn9hw"] Dec 03 09:01:57 crc kubenswrapper[4947]: I1203 09:01:57.838453 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf689ffdc-g58ps"] Dec 03 09:01:57 crc kubenswrapper[4947]: W1203 09:01:57.841811 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bc67b75_6adc_43e7_9df0_4a6ff5a70f8f.slice/crio-0a767f6f53d8f7730a29f9ad7ab3ad4d270284a7f3587233c521991619a7729f WatchSource:0}: Error finding container 0a767f6f53d8f7730a29f9ad7ab3ad4d270284a7f3587233c521991619a7729f: Status 404 returned error can't find the container with id 0a767f6f53d8f7730a29f9ad7ab3ad4d270284a7f3587233c521991619a7729f Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.022644 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" event={"ID":"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f","Type":"ContainerStarted","Data":"0a767f6f53d8f7730a29f9ad7ab3ad4d270284a7f3587233c521991619a7729f"} Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.024272 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jn9hw" event={"ID":"e870e8ba-007a-48cd-bfbb-fa8051fabf33","Type":"ContainerStarted","Data":"93ee08beef1cf675770db3fe3c9faed4014711d11d7d4efa8c514bd0a75e4410"} Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.744859 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-64c8bd4d48-79lrk"] Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.748860 4947 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.763887 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.790629 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-64c8bd4d48-79lrk"] Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.823223 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvgsh\" (UniqueName: \"kubernetes.io/projected/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-kube-api-access-lvgsh\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.823288 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-combined-ca-bundle\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.823387 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-etc-swift\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.823415 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-config-data\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: 
\"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.823451 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-run-httpd\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.823470 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-log-httpd\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.925241 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-run-httpd\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.925286 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-log-httpd\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.925393 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvgsh\" (UniqueName: \"kubernetes.io/projected/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-kube-api-access-lvgsh\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") 
" pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.925432 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-combined-ca-bundle\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.925543 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-etc-swift\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.925568 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-config-data\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.925902 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-log-httpd\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.926314 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-run-httpd\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 
09:01:58.931856 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-etc-swift\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.935695 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-combined-ca-bundle\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.940470 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-config-data\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:58 crc kubenswrapper[4947]: I1203 09:01:58.943214 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvgsh\" (UniqueName: \"kubernetes.io/projected/4b5182fc-c85f-4e7a-960c-a2aa59cc653b-kube-api-access-lvgsh\") pod \"swift-proxy-64c8bd4d48-79lrk\" (UID: \"4b5182fc-c85f-4e7a-960c-a2aa59cc653b\") " pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:59 crc kubenswrapper[4947]: I1203 09:01:59.036794 4947 generic.go:334] "Generic (PLEG): container finished" podID="1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" containerID="8f9db970955d772b66446fd7ddbe2eb1939570cc8e9ee3429c90d39992c36bdf" exitCode=0 Dec 03 09:01:59 crc kubenswrapper[4947]: I1203 09:01:59.036837 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" 
event={"ID":"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f","Type":"ContainerDied","Data":"8f9db970955d772b66446fd7ddbe2eb1939570cc8e9ee3429c90d39992c36bdf"} Dec 03 09:01:59 crc kubenswrapper[4947]: I1203 09:01:59.096974 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:01:59 crc kubenswrapper[4947]: I1203 09:01:59.801545 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-64c8bd4d48-79lrk"] Dec 03 09:02:00 crc kubenswrapper[4947]: I1203 09:02:00.047822 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" event={"ID":"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f","Type":"ContainerStarted","Data":"3b414abcdfb3daa4db937f7c84768642c7bdac8b03bf29133ee1ab425b48846f"} Dec 03 09:02:00 crc kubenswrapper[4947]: I1203 09:02:00.048018 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:02:00 crc kubenswrapper[4947]: I1203 09:02:00.078591 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" podStartSLOduration=4.078568371 podStartE2EDuration="4.078568371s" podCreationTimestamp="2025-12-03 09:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:02:00.074436089 +0000 UTC m=+7981.335390515" watchObservedRunningTime="2025-12-03 09:02:00.078568371 +0000 UTC m=+7981.339522797" Dec 03 09:02:00 crc kubenswrapper[4947]: W1203 09:02:00.965775 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b5182fc_c85f_4e7a_960c_a2aa59cc653b.slice/crio-97376dbc1e64bb355393800b1097550341e327e282fe4548586fe237d8227555 WatchSource:0}: Error finding container 97376dbc1e64bb355393800b1097550341e327e282fe4548586fe237d8227555: Status 404 returned 
error can't find the container with id 97376dbc1e64bb355393800b1097550341e327e282fe4548586fe237d8227555 Dec 03 09:02:01 crc kubenswrapper[4947]: I1203 09:02:01.064429 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-64c8bd4d48-79lrk" event={"ID":"4b5182fc-c85f-4e7a-960c-a2aa59cc653b","Type":"ContainerStarted","Data":"97376dbc1e64bb355393800b1097550341e327e282fe4548586fe237d8227555"} Dec 03 09:02:03 crc kubenswrapper[4947]: I1203 09:02:03.117385 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:02:03 crc kubenswrapper[4947]: I1203 09:02:03.118110 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jn9hw" event={"ID":"e870e8ba-007a-48cd-bfbb-fa8051fabf33","Type":"ContainerStarted","Data":"8045ac14622430ec0d9ef8c30211cd98a0dd36a4c2e8e0290194ca9d3ffc1130"} Dec 03 09:02:03 crc kubenswrapper[4947]: I1203 09:02:03.118143 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-64c8bd4d48-79lrk" event={"ID":"4b5182fc-c85f-4e7a-960c-a2aa59cc653b","Type":"ContainerStarted","Data":"2e68a3d90d028be924d82a80d13e5dec1aa5fd0fec4669e2439fc1a968a75d1f"} Dec 03 09:02:03 crc kubenswrapper[4947]: I1203 09:02:03.118175 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:02:03 crc kubenswrapper[4947]: I1203 09:02:03.118198 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-64c8bd4d48-79lrk" event={"ID":"4b5182fc-c85f-4e7a-960c-a2aa59cc653b","Type":"ContainerStarted","Data":"9ce553f3d1b47734f1ffc064504e0b1a7ae6e2e1d4425a5c3f6b94ea6d32f1aa"} Dec 03 09:02:03 crc kubenswrapper[4947]: I1203 09:02:03.121709 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jn9hw" podStartSLOduration=2.903843546 podStartE2EDuration="7.121678201s" podCreationTimestamp="2025-12-03 
09:01:56 +0000 UTC" firstStartedPulling="2025-12-03 09:01:57.700119427 +0000 UTC m=+7978.961073853" lastFinishedPulling="2025-12-03 09:02:01.917954082 +0000 UTC m=+7983.178908508" observedRunningTime="2025-12-03 09:02:03.11795009 +0000 UTC m=+7984.378904556" watchObservedRunningTime="2025-12-03 09:02:03.121678201 +0000 UTC m=+7984.382632707" Dec 03 09:02:03 crc kubenswrapper[4947]: I1203 09:02:03.159344 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-64c8bd4d48-79lrk" podStartSLOduration=4.209918069 podStartE2EDuration="5.159322849s" podCreationTimestamp="2025-12-03 09:01:58 +0000 UTC" firstStartedPulling="2025-12-03 09:02:00.970593958 +0000 UTC m=+7982.231548384" lastFinishedPulling="2025-12-03 09:02:01.919998698 +0000 UTC m=+7983.180953164" observedRunningTime="2025-12-03 09:02:03.144061186 +0000 UTC m=+7984.405015622" watchObservedRunningTime="2025-12-03 09:02:03.159322849 +0000 UTC m=+7984.420277265" Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.137529 4947 generic.go:334] "Generic (PLEG): container finished" podID="e870e8ba-007a-48cd-bfbb-fa8051fabf33" containerID="8045ac14622430ec0d9ef8c30211cd98a0dd36a4c2e8e0290194ca9d3ffc1130" exitCode=0 Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.137562 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jn9hw" event={"ID":"e870e8ba-007a-48cd-bfbb-fa8051fabf33","Type":"ContainerDied","Data":"8045ac14622430ec0d9ef8c30211cd98a0dd36a4c2e8e0290194ca9d3ffc1130"} Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.341729 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.407473 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh"] Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.407734 4947 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" podUID="5bbb7417-1020-4257-bd46-ed5cb0f54649" containerName="dnsmasq-dns" containerID="cri-o://28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a" gracePeriod=10 Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.840964 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.992018 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-nb\") pod \"5bbb7417-1020-4257-bd46-ed5cb0f54649\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.992070 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-dns-svc\") pod \"5bbb7417-1020-4257-bd46-ed5cb0f54649\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.992127 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s77cx\" (UniqueName: \"kubernetes.io/projected/5bbb7417-1020-4257-bd46-ed5cb0f54649-kube-api-access-s77cx\") pod \"5bbb7417-1020-4257-bd46-ed5cb0f54649\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.992171 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-sb\") pod \"5bbb7417-1020-4257-bd46-ed5cb0f54649\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.992262 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-config\") pod \"5bbb7417-1020-4257-bd46-ed5cb0f54649\" (UID: \"5bbb7417-1020-4257-bd46-ed5cb0f54649\") " Dec 03 09:02:07 crc kubenswrapper[4947]: I1203 09:02:07.997872 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bbb7417-1020-4257-bd46-ed5cb0f54649-kube-api-access-s77cx" (OuterVolumeSpecName: "kube-api-access-s77cx") pod "5bbb7417-1020-4257-bd46-ed5cb0f54649" (UID: "5bbb7417-1020-4257-bd46-ed5cb0f54649"). InnerVolumeSpecName "kube-api-access-s77cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.042330 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5bbb7417-1020-4257-bd46-ed5cb0f54649" (UID: "5bbb7417-1020-4257-bd46-ed5cb0f54649"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.043024 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5bbb7417-1020-4257-bd46-ed5cb0f54649" (UID: "5bbb7417-1020-4257-bd46-ed5cb0f54649"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.050815 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-config" (OuterVolumeSpecName: "config") pod "5bbb7417-1020-4257-bd46-ed5cb0f54649" (UID: "5bbb7417-1020-4257-bd46-ed5cb0f54649"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.055147 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5bbb7417-1020-4257-bd46-ed5cb0f54649" (UID: "5bbb7417-1020-4257-bd46-ed5cb0f54649"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.094990 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s77cx\" (UniqueName: \"kubernetes.io/projected/5bbb7417-1020-4257-bd46-ed5cb0f54649-kube-api-access-s77cx\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.095030 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.095043 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.095054 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.095067 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5bbb7417-1020-4257-bd46-ed5cb0f54649-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.149370 4947 generic.go:334] "Generic (PLEG): container finished" podID="5bbb7417-1020-4257-bd46-ed5cb0f54649" 
containerID="28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a" exitCode=0 Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.149483 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" event={"ID":"5bbb7417-1020-4257-bd46-ed5cb0f54649","Type":"ContainerDied","Data":"28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a"} Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.149529 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.149572 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh" event={"ID":"5bbb7417-1020-4257-bd46-ed5cb0f54649","Type":"ContainerDied","Data":"e031ef818b506b1d29fb56d978d5ccd34ad17806e43553f31da3f4df82ea6be0"} Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.149594 4947 scope.go:117] "RemoveContainer" containerID="28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.171577 4947 scope.go:117] "RemoveContainer" containerID="970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.195592 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh"] Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.205262 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f6c4c8ffc-lqxrh"] Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.214329 4947 scope.go:117] "RemoveContainer" containerID="28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a" Dec 03 09:02:08 crc kubenswrapper[4947]: E1203 09:02:08.214759 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a\": container with ID starting with 28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a not found: ID does not exist" containerID="28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.214794 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a"} err="failed to get container status \"28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a\": rpc error: code = NotFound desc = could not find container \"28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a\": container with ID starting with 28ac18e0d4da524583279712bf804496facfb80a7d944565e70c2dfdb3ac634a not found: ID does not exist" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.214819 4947 scope.go:117] "RemoveContainer" containerID="970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78" Dec 03 09:02:08 crc kubenswrapper[4947]: E1203 09:02:08.215050 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78\": container with ID starting with 970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78 not found: ID does not exist" containerID="970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.215074 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78"} err="failed to get container status \"970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78\": rpc error: code = NotFound desc = could not find container \"970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78\": container with ID 
starting with 970b8996c80fd5eb2b61181e1f9ae07e69dbb8b511f6bb8bc3566295d6dc2d78 not found: ID does not exist" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.400575 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.501859 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-dispersionconf\") pod \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.501913 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-ring-data-devices\") pod \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.501939 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e870e8ba-007a-48cd-bfbb-fa8051fabf33-etc-swift\") pod \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.501992 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-scripts\") pod \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.502065 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdkjh\" (UniqueName: \"kubernetes.io/projected/e870e8ba-007a-48cd-bfbb-fa8051fabf33-kube-api-access-qdkjh\") pod 
\"e870e8ba-007a-48cd-bfbb-fa8051fabf33\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.502112 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-combined-ca-bundle\") pod \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.502237 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-swiftconf\") pod \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\" (UID: \"e870e8ba-007a-48cd-bfbb-fa8051fabf33\") " Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.502698 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e870e8ba-007a-48cd-bfbb-fa8051fabf33" (UID: "e870e8ba-007a-48cd-bfbb-fa8051fabf33"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.503284 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e870e8ba-007a-48cd-bfbb-fa8051fabf33-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e870e8ba-007a-48cd-bfbb-fa8051fabf33" (UID: "e870e8ba-007a-48cd-bfbb-fa8051fabf33"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.506432 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e870e8ba-007a-48cd-bfbb-fa8051fabf33-kube-api-access-qdkjh" (OuterVolumeSpecName: "kube-api-access-qdkjh") pod "e870e8ba-007a-48cd-bfbb-fa8051fabf33" (UID: "e870e8ba-007a-48cd-bfbb-fa8051fabf33"). InnerVolumeSpecName "kube-api-access-qdkjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.508066 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e870e8ba-007a-48cd-bfbb-fa8051fabf33" (UID: "e870e8ba-007a-48cd-bfbb-fa8051fabf33"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.524293 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-scripts" (OuterVolumeSpecName: "scripts") pod "e870e8ba-007a-48cd-bfbb-fa8051fabf33" (UID: "e870e8ba-007a-48cd-bfbb-fa8051fabf33"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.524897 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e870e8ba-007a-48cd-bfbb-fa8051fabf33" (UID: "e870e8ba-007a-48cd-bfbb-fa8051fabf33"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.525958 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e870e8ba-007a-48cd-bfbb-fa8051fabf33" (UID: "e870e8ba-007a-48cd-bfbb-fa8051fabf33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.604419 4947 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.604460 4947 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.604472 4947 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.604486 4947 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e870e8ba-007a-48cd-bfbb-fa8051fabf33-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.604510 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e870e8ba-007a-48cd-bfbb-fa8051fabf33-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.604523 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdkjh\" (UniqueName: 
\"kubernetes.io/projected/e870e8ba-007a-48cd-bfbb-fa8051fabf33-kube-api-access-qdkjh\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:08 crc kubenswrapper[4947]: I1203 09:02:08.604536 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e870e8ba-007a-48cd-bfbb-fa8051fabf33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:09 crc kubenswrapper[4947]: I1203 09:02:09.102801 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bbb7417-1020-4257-bd46-ed5cb0f54649" path="/var/lib/kubelet/pods/5bbb7417-1020-4257-bd46-ed5cb0f54649/volumes" Dec 03 09:02:09 crc kubenswrapper[4947]: I1203 09:02:09.104174 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:02:09 crc kubenswrapper[4947]: I1203 09:02:09.104234 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-64c8bd4d48-79lrk" Dec 03 09:02:09 crc kubenswrapper[4947]: I1203 09:02:09.188922 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jn9hw" event={"ID":"e870e8ba-007a-48cd-bfbb-fa8051fabf33","Type":"ContainerDied","Data":"93ee08beef1cf675770db3fe3c9faed4014711d11d7d4efa8c514bd0a75e4410"} Dec 03 09:02:09 crc kubenswrapper[4947]: I1203 09:02:09.188983 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93ee08beef1cf675770db3fe3c9faed4014711d11d7d4efa8c514bd0a75e4410" Dec 03 09:02:09 crc kubenswrapper[4947]: I1203 09:02:09.189089 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jn9hw" Dec 03 09:02:11 crc kubenswrapper[4947]: I1203 09:02:11.083065 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:02:11 crc kubenswrapper[4947]: E1203 09:02:11.083897 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.813302 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-kjzxl"] Dec 03 09:02:14 crc kubenswrapper[4947]: E1203 09:02:14.814198 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bbb7417-1020-4257-bd46-ed5cb0f54649" containerName="dnsmasq-dns" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.814211 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bbb7417-1020-4257-bd46-ed5cb0f54649" containerName="dnsmasq-dns" Dec 03 09:02:14 crc kubenswrapper[4947]: E1203 09:02:14.814226 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e870e8ba-007a-48cd-bfbb-fa8051fabf33" containerName="swift-ring-rebalance" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.814232 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e870e8ba-007a-48cd-bfbb-fa8051fabf33" containerName="swift-ring-rebalance" Dec 03 09:02:14 crc kubenswrapper[4947]: E1203 09:02:14.814257 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bbb7417-1020-4257-bd46-ed5cb0f54649" containerName="init" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.814264 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5bbb7417-1020-4257-bd46-ed5cb0f54649" containerName="init" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.814432 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bbb7417-1020-4257-bd46-ed5cb0f54649" containerName="dnsmasq-dns" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.814450 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e870e8ba-007a-48cd-bfbb-fa8051fabf33" containerName="swift-ring-rebalance" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.815127 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.821735 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kjzxl"] Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.917479 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9521-account-create-update-hmkwn"] Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.920842 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.922908 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.926415 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9521-account-create-update-hmkwn"] Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.961682 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r57kq\" (UniqueName: \"kubernetes.io/projected/2c17c928-65ee-4727-99c4-328cbc8ed2d5-kube-api-access-r57kq\") pod \"cinder-db-create-kjzxl\" (UID: \"2c17c928-65ee-4727-99c4-328cbc8ed2d5\") " pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:14 crc kubenswrapper[4947]: I1203 09:02:14.961764 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c17c928-65ee-4727-99c4-328cbc8ed2d5-operator-scripts\") pod \"cinder-db-create-kjzxl\" (UID: \"2c17c928-65ee-4727-99c4-328cbc8ed2d5\") " pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.063838 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f02dbcd4-a088-488a-ac51-3aa3015c81ff-operator-scripts\") pod \"cinder-9521-account-create-update-hmkwn\" (UID: \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\") " pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.064099 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r57kq\" (UniqueName: \"kubernetes.io/projected/2c17c928-65ee-4727-99c4-328cbc8ed2d5-kube-api-access-r57kq\") pod \"cinder-db-create-kjzxl\" (UID: 
\"2c17c928-65ee-4727-99c4-328cbc8ed2d5\") " pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.064294 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c17c928-65ee-4727-99c4-328cbc8ed2d5-operator-scripts\") pod \"cinder-db-create-kjzxl\" (UID: \"2c17c928-65ee-4727-99c4-328cbc8ed2d5\") " pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.064355 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqcpj\" (UniqueName: \"kubernetes.io/projected/f02dbcd4-a088-488a-ac51-3aa3015c81ff-kube-api-access-dqcpj\") pod \"cinder-9521-account-create-update-hmkwn\" (UID: \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\") " pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.065172 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c17c928-65ee-4727-99c4-328cbc8ed2d5-operator-scripts\") pod \"cinder-db-create-kjzxl\" (UID: \"2c17c928-65ee-4727-99c4-328cbc8ed2d5\") " pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.082392 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r57kq\" (UniqueName: \"kubernetes.io/projected/2c17c928-65ee-4727-99c4-328cbc8ed2d5-kube-api-access-r57kq\") pod \"cinder-db-create-kjzxl\" (UID: \"2c17c928-65ee-4727-99c4-328cbc8ed2d5\") " pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.144675 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.166588 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqcpj\" (UniqueName: \"kubernetes.io/projected/f02dbcd4-a088-488a-ac51-3aa3015c81ff-kube-api-access-dqcpj\") pod \"cinder-9521-account-create-update-hmkwn\" (UID: \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\") " pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.166677 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f02dbcd4-a088-488a-ac51-3aa3015c81ff-operator-scripts\") pod \"cinder-9521-account-create-update-hmkwn\" (UID: \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\") " pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.167479 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f02dbcd4-a088-488a-ac51-3aa3015c81ff-operator-scripts\") pod \"cinder-9521-account-create-update-hmkwn\" (UID: \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\") " pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.183578 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqcpj\" (UniqueName: \"kubernetes.io/projected/f02dbcd4-a088-488a-ac51-3aa3015c81ff-kube-api-access-dqcpj\") pod \"cinder-9521-account-create-update-hmkwn\" (UID: \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\") " pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.241470 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.563703 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-kjzxl"] Dec 03 09:02:15 crc kubenswrapper[4947]: W1203 09:02:15.564241 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c17c928_65ee_4727_99c4_328cbc8ed2d5.slice/crio-7f6eb08a6861688372ae2ccb35babe906ea1da8207a6eefef6ea5d4cae6b322b WatchSource:0}: Error finding container 7f6eb08a6861688372ae2ccb35babe906ea1da8207a6eefef6ea5d4cae6b322b: Status 404 returned error can't find the container with id 7f6eb08a6861688372ae2ccb35babe906ea1da8207a6eefef6ea5d4cae6b322b Dec 03 09:02:15 crc kubenswrapper[4947]: I1203 09:02:15.672132 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9521-account-create-update-hmkwn"] Dec 03 09:02:16 crc kubenswrapper[4947]: I1203 09:02:16.260214 4947 generic.go:334] "Generic (PLEG): container finished" podID="f02dbcd4-a088-488a-ac51-3aa3015c81ff" containerID="31433968ff91a8bb07a413ae78d7b236c76244fc27f60747006f6cc92bc4ca2d" exitCode=0 Dec 03 09:02:16 crc kubenswrapper[4947]: I1203 09:02:16.260295 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9521-account-create-update-hmkwn" event={"ID":"f02dbcd4-a088-488a-ac51-3aa3015c81ff","Type":"ContainerDied","Data":"31433968ff91a8bb07a413ae78d7b236c76244fc27f60747006f6cc92bc4ca2d"} Dec 03 09:02:16 crc kubenswrapper[4947]: I1203 09:02:16.260326 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9521-account-create-update-hmkwn" event={"ID":"f02dbcd4-a088-488a-ac51-3aa3015c81ff","Type":"ContainerStarted","Data":"56f4969f7f169879233d398fac8cc6163c55da5125c745d46984bb86a40c9e83"} Dec 03 09:02:16 crc kubenswrapper[4947]: I1203 09:02:16.262143 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="2c17c928-65ee-4727-99c4-328cbc8ed2d5" containerID="f39ee615a6da237f990e6d0610b3f674b26516b59945590708bb9516180097b5" exitCode=0 Dec 03 09:02:16 crc kubenswrapper[4947]: I1203 09:02:16.262163 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kjzxl" event={"ID":"2c17c928-65ee-4727-99c4-328cbc8ed2d5","Type":"ContainerDied","Data":"f39ee615a6da237f990e6d0610b3f674b26516b59945590708bb9516180097b5"} Dec 03 09:02:16 crc kubenswrapper[4947]: I1203 09:02:16.262177 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kjzxl" event={"ID":"2c17c928-65ee-4727-99c4-328cbc8ed2d5","Type":"ContainerStarted","Data":"7f6eb08a6861688372ae2ccb35babe906ea1da8207a6eefef6ea5d4cae6b322b"} Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.704628 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.710172 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.814317 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqcpj\" (UniqueName: \"kubernetes.io/projected/f02dbcd4-a088-488a-ac51-3aa3015c81ff-kube-api-access-dqcpj\") pod \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\" (UID: \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\") " Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.814646 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f02dbcd4-a088-488a-ac51-3aa3015c81ff-operator-scripts\") pod \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\" (UID: \"f02dbcd4-a088-488a-ac51-3aa3015c81ff\") " Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.814680 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r57kq\" (UniqueName: \"kubernetes.io/projected/2c17c928-65ee-4727-99c4-328cbc8ed2d5-kube-api-access-r57kq\") pod \"2c17c928-65ee-4727-99c4-328cbc8ed2d5\" (UID: \"2c17c928-65ee-4727-99c4-328cbc8ed2d5\") " Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.814707 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c17c928-65ee-4727-99c4-328cbc8ed2d5-operator-scripts\") pod \"2c17c928-65ee-4727-99c4-328cbc8ed2d5\" (UID: \"2c17c928-65ee-4727-99c4-328cbc8ed2d5\") " Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.815322 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c17c928-65ee-4727-99c4-328cbc8ed2d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c17c928-65ee-4727-99c4-328cbc8ed2d5" (UID: "2c17c928-65ee-4727-99c4-328cbc8ed2d5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.817245 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f02dbcd4-a088-488a-ac51-3aa3015c81ff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f02dbcd4-a088-488a-ac51-3aa3015c81ff" (UID: "f02dbcd4-a088-488a-ac51-3aa3015c81ff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.821069 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f02dbcd4-a088-488a-ac51-3aa3015c81ff-kube-api-access-dqcpj" (OuterVolumeSpecName: "kube-api-access-dqcpj") pod "f02dbcd4-a088-488a-ac51-3aa3015c81ff" (UID: "f02dbcd4-a088-488a-ac51-3aa3015c81ff"). InnerVolumeSpecName "kube-api-access-dqcpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.821155 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c17c928-65ee-4727-99c4-328cbc8ed2d5-kube-api-access-r57kq" (OuterVolumeSpecName: "kube-api-access-r57kq") pod "2c17c928-65ee-4727-99c4-328cbc8ed2d5" (UID: "2c17c928-65ee-4727-99c4-328cbc8ed2d5"). InnerVolumeSpecName "kube-api-access-r57kq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.916433 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f02dbcd4-a088-488a-ac51-3aa3015c81ff-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.916467 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r57kq\" (UniqueName: \"kubernetes.io/projected/2c17c928-65ee-4727-99c4-328cbc8ed2d5-kube-api-access-r57kq\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.916477 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c17c928-65ee-4727-99c4-328cbc8ed2d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:17 crc kubenswrapper[4947]: I1203 09:02:17.916485 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqcpj\" (UniqueName: \"kubernetes.io/projected/f02dbcd4-a088-488a-ac51-3aa3015c81ff-kube-api-access-dqcpj\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:18 crc kubenswrapper[4947]: I1203 09:02:18.284357 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-kjzxl" Dec 03 09:02:18 crc kubenswrapper[4947]: I1203 09:02:18.284344 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-kjzxl" event={"ID":"2c17c928-65ee-4727-99c4-328cbc8ed2d5","Type":"ContainerDied","Data":"7f6eb08a6861688372ae2ccb35babe906ea1da8207a6eefef6ea5d4cae6b322b"} Dec 03 09:02:18 crc kubenswrapper[4947]: I1203 09:02:18.284910 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f6eb08a6861688372ae2ccb35babe906ea1da8207a6eefef6ea5d4cae6b322b" Dec 03 09:02:18 crc kubenswrapper[4947]: I1203 09:02:18.286504 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9521-account-create-update-hmkwn" event={"ID":"f02dbcd4-a088-488a-ac51-3aa3015c81ff","Type":"ContainerDied","Data":"56f4969f7f169879233d398fac8cc6163c55da5125c745d46984bb86a40c9e83"} Dec 03 09:02:18 crc kubenswrapper[4947]: I1203 09:02:18.286547 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56f4969f7f169879233d398fac8cc6163c55da5125c745d46984bb86a40c9e83" Dec 03 09:02:18 crc kubenswrapper[4947]: I1203 09:02:18.286732 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9521-account-create-update-hmkwn" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.220988 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xcxvl"] Dec 03 09:02:20 crc kubenswrapper[4947]: E1203 09:02:20.221393 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f02dbcd4-a088-488a-ac51-3aa3015c81ff" containerName="mariadb-account-create-update" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.221407 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f02dbcd4-a088-488a-ac51-3aa3015c81ff" containerName="mariadb-account-create-update" Dec 03 09:02:20 crc kubenswrapper[4947]: E1203 09:02:20.221437 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c17c928-65ee-4727-99c4-328cbc8ed2d5" containerName="mariadb-database-create" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.221445 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c17c928-65ee-4727-99c4-328cbc8ed2d5" containerName="mariadb-database-create" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.221669 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f02dbcd4-a088-488a-ac51-3aa3015c81ff" containerName="mariadb-account-create-update" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.221708 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c17c928-65ee-4727-99c4-328cbc8ed2d5" containerName="mariadb-database-create" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.222559 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.225735 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m7rnc" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.226198 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.226384 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.230656 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xcxvl"] Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.260566 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-scripts\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.261026 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-combined-ca-bundle\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.261057 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82kvr\" (UniqueName: \"kubernetes.io/projected/4a9f981c-80a7-4c92-afe4-5edabea09911-kube-api-access-82kvr\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.261446 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-db-sync-config-data\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.261543 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-config-data\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.261655 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a9f981c-80a7-4c92-afe4-5edabea09911-etc-machine-id\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.363189 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-scripts\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.363264 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-combined-ca-bundle\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.363285 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82kvr\" (UniqueName: 
\"kubernetes.io/projected/4a9f981c-80a7-4c92-afe4-5edabea09911-kube-api-access-82kvr\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.363372 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-db-sync-config-data\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.363424 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-config-data\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.363467 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a9f981c-80a7-4c92-afe4-5edabea09911-etc-machine-id\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.363577 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a9f981c-80a7-4c92-afe4-5edabea09911-etc-machine-id\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.368808 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-db-sync-config-data\") pod \"cinder-db-sync-xcxvl\" (UID: 
\"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.369097 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-scripts\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.369397 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-combined-ca-bundle\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.369542 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-config-data\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.384242 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82kvr\" (UniqueName: \"kubernetes.io/projected/4a9f981c-80a7-4c92-afe4-5edabea09911-kube-api-access-82kvr\") pod \"cinder-db-sync-xcxvl\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:20 crc kubenswrapper[4947]: I1203 09:02:20.546704 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:21 crc kubenswrapper[4947]: W1203 09:02:21.087337 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a9f981c_80a7_4c92_afe4_5edabea09911.slice/crio-48a46ee160740ba8b8e3971911f4ee86fec2fb449ab6923950ca45059a352927 WatchSource:0}: Error finding container 48a46ee160740ba8b8e3971911f4ee86fec2fb449ab6923950ca45059a352927: Status 404 returned error can't find the container with id 48a46ee160740ba8b8e3971911f4ee86fec2fb449ab6923950ca45059a352927 Dec 03 09:02:21 crc kubenswrapper[4947]: I1203 09:02:21.119468 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xcxvl"] Dec 03 09:02:21 crc kubenswrapper[4947]: I1203 09:02:21.314072 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xcxvl" event={"ID":"4a9f981c-80a7-4c92-afe4-5edabea09911","Type":"ContainerStarted","Data":"48a46ee160740ba8b8e3971911f4ee86fec2fb449ab6923950ca45059a352927"} Dec 03 09:02:25 crc kubenswrapper[4947]: I1203 09:02:25.084029 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:02:25 crc kubenswrapper[4947]: E1203 09:02:25.085401 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:02:37 crc kubenswrapper[4947]: I1203 09:02:37.087396 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:02:37 crc kubenswrapper[4947]: E1203 09:02:37.088936 4947 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:02:41 crc kubenswrapper[4947]: E1203 09:02:41.335884 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:65066e8ca260a75886ae57f157049605" Dec 03 09:02:41 crc kubenswrapper[4947]: E1203 09:02:41.336410 4947 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:65066e8ca260a75886ae57f157049605" Dec 03 09:02:41 crc kubenswrapper[4947]: E1203 09:02:41.336608 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82kvr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xcxvl_openstack(4a9f981c-80a7-4c92-afe4-5edabea09911): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 09:02:41 crc kubenswrapper[4947]: E1203 09:02:41.338037 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xcxvl" podUID="4a9f981c-80a7-4c92-afe4-5edabea09911" Dec 03 09:02:41 crc kubenswrapper[4947]: E1203 09:02:41.513297 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-cinder-api:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/cinder-db-sync-xcxvl" podUID="4a9f981c-80a7-4c92-afe4-5edabea09911" Dec 03 09:02:51 crc kubenswrapper[4947]: I1203 09:02:51.082874 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:02:51 crc kubenswrapper[4947]: E1203 09:02:51.084336 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:02:53 crc kubenswrapper[4947]: I1203 09:02:53.617590 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xcxvl" 
event={"ID":"4a9f981c-80a7-4c92-afe4-5edabea09911","Type":"ContainerStarted","Data":"c2869799ad7199eb2bcba21cc6484d0e297f037df9fb8ef5096be7a6cea44915"} Dec 03 09:02:53 crc kubenswrapper[4947]: I1203 09:02:53.648948 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xcxvl" podStartSLOduration=2.44072515 podStartE2EDuration="33.648923932s" podCreationTimestamp="2025-12-03 09:02:20 +0000 UTC" firstStartedPulling="2025-12-03 09:02:21.089475111 +0000 UTC m=+8002.350429537" lastFinishedPulling="2025-12-03 09:02:52.297673893 +0000 UTC m=+8033.558628319" observedRunningTime="2025-12-03 09:02:53.637301998 +0000 UTC m=+8034.898256484" watchObservedRunningTime="2025-12-03 09:02:53.648923932 +0000 UTC m=+8034.909878368" Dec 03 09:02:55 crc kubenswrapper[4947]: I1203 09:02:55.636406 4947 generic.go:334] "Generic (PLEG): container finished" podID="4a9f981c-80a7-4c92-afe4-5edabea09911" containerID="c2869799ad7199eb2bcba21cc6484d0e297f037df9fb8ef5096be7a6cea44915" exitCode=0 Dec 03 09:02:55 crc kubenswrapper[4947]: I1203 09:02:55.636528 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xcxvl" event={"ID":"4a9f981c-80a7-4c92-afe4-5edabea09911","Type":"ContainerDied","Data":"c2869799ad7199eb2bcba21cc6484d0e297f037df9fb8ef5096be7a6cea44915"} Dec 03 09:02:56 crc kubenswrapper[4947]: I1203 09:02:56.963418 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.050247 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-config-data\") pod \"4a9f981c-80a7-4c92-afe4-5edabea09911\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.050300 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82kvr\" (UniqueName: \"kubernetes.io/projected/4a9f981c-80a7-4c92-afe4-5edabea09911-kube-api-access-82kvr\") pod \"4a9f981c-80a7-4c92-afe4-5edabea09911\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.050362 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-db-sync-config-data\") pod \"4a9f981c-80a7-4c92-afe4-5edabea09911\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.050381 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a9f981c-80a7-4c92-afe4-5edabea09911-etc-machine-id\") pod \"4a9f981c-80a7-4c92-afe4-5edabea09911\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.050445 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-combined-ca-bundle\") pod \"4a9f981c-80a7-4c92-afe4-5edabea09911\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.050557 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-scripts\") pod \"4a9f981c-80a7-4c92-afe4-5edabea09911\" (UID: \"4a9f981c-80a7-4c92-afe4-5edabea09911\") " Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.050674 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a9f981c-80a7-4c92-afe4-5edabea09911-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4a9f981c-80a7-4c92-afe4-5edabea09911" (UID: "4a9f981c-80a7-4c92-afe4-5edabea09911"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.051254 4947 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a9f981c-80a7-4c92-afe4-5edabea09911-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.055730 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4a9f981c-80a7-4c92-afe4-5edabea09911" (UID: "4a9f981c-80a7-4c92-afe4-5edabea09911"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.055762 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a9f981c-80a7-4c92-afe4-5edabea09911-kube-api-access-82kvr" (OuterVolumeSpecName: "kube-api-access-82kvr") pod "4a9f981c-80a7-4c92-afe4-5edabea09911" (UID: "4a9f981c-80a7-4c92-afe4-5edabea09911"). InnerVolumeSpecName "kube-api-access-82kvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.056732 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-scripts" (OuterVolumeSpecName: "scripts") pod "4a9f981c-80a7-4c92-afe4-5edabea09911" (UID: "4a9f981c-80a7-4c92-afe4-5edabea09911"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.080542 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a9f981c-80a7-4c92-afe4-5edabea09911" (UID: "4a9f981c-80a7-4c92-afe4-5edabea09911"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.100326 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-config-data" (OuterVolumeSpecName: "config-data") pod "4a9f981c-80a7-4c92-afe4-5edabea09911" (UID: "4a9f981c-80a7-4c92-afe4-5edabea09911"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.152229 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.152276 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.152288 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.152299 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82kvr\" (UniqueName: \"kubernetes.io/projected/4a9f981c-80a7-4c92-afe4-5edabea09911-kube-api-access-82kvr\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.152312 4947 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a9f981c-80a7-4c92-afe4-5edabea09911-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.654820 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xcxvl" event={"ID":"4a9f981c-80a7-4c92-afe4-5edabea09911","Type":"ContainerDied","Data":"48a46ee160740ba8b8e3971911f4ee86fec2fb449ab6923950ca45059a352927"} Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.654863 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48a46ee160740ba8b8e3971911f4ee86fec2fb449ab6923950ca45059a352927" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.654891 4947 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xcxvl" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.977616 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56cbd99d57-8tnp7"] Dec 03 09:02:57 crc kubenswrapper[4947]: E1203 09:02:57.978050 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a9f981c-80a7-4c92-afe4-5edabea09911" containerName="cinder-db-sync" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.978066 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a9f981c-80a7-4c92-afe4-5edabea09911" containerName="cinder-db-sync" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.978343 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a9f981c-80a7-4c92-afe4-5edabea09911" containerName="cinder-db-sync" Dec 03 09:02:57 crc kubenswrapper[4947]: I1203 09:02:57.979765 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.004892 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56cbd99d57-8tnp7"] Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.068130 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-sb\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.068200 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s84p\" (UniqueName: \"kubernetes.io/projected/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-kube-api-access-7s84p\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " 
pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.068251 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-dns-svc\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.068314 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-nb\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.068381 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-config\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.169741 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-config\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.170398 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-sb\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " 
pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.170479 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s84p\" (UniqueName: \"kubernetes.io/projected/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-kube-api-access-7s84p\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.170634 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-dns-svc\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.170815 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-config\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.171119 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-nb\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.171399 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-dns-svc\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 
09:02:58.171600 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-sb\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.171873 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-nb\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.194573 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s84p\" (UniqueName: \"kubernetes.io/projected/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-kube-api-access-7s84p\") pod \"dnsmasq-dns-56cbd99d57-8tnp7\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.214878 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.216584 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.219480 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m7rnc" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.219923 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.220362 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.220432 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.231336 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.272332 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46143c9f-b587-4c9c-85ce-70c8780fbc87-logs\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.272630 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.272684 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc 
kubenswrapper[4947]: I1203 09:02:58.272701 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data-custom\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.272755 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-scripts\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.272791 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46143c9f-b587-4c9c-85ce-70c8780fbc87-etc-machine-id\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.272819 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krgzd\" (UniqueName: \"kubernetes.io/projected/46143c9f-b587-4c9c-85ce-70c8780fbc87-kube-api-access-krgzd\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.303120 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.376357 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krgzd\" (UniqueName: \"kubernetes.io/projected/46143c9f-b587-4c9c-85ce-70c8780fbc87-kube-api-access-krgzd\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.376439 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46143c9f-b587-4c9c-85ce-70c8780fbc87-logs\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.376473 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.376546 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.376570 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data-custom\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.376655 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-scripts\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.376714 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46143c9f-b587-4c9c-85ce-70c8780fbc87-etc-machine-id\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.376843 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46143c9f-b587-4c9c-85ce-70c8780fbc87-etc-machine-id\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.377646 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46143c9f-b587-4c9c-85ce-70c8780fbc87-logs\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.384476 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.384610 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.386653 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-scripts\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.398047 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data-custom\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.404080 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krgzd\" (UniqueName: \"kubernetes.io/projected/46143c9f-b587-4c9c-85ce-70c8780fbc87-kube-api-access-krgzd\") pod \"cinder-api-0\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.566714 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.791387 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56cbd99d57-8tnp7"] Dec 03 09:02:58 crc kubenswrapper[4947]: I1203 09:02:58.987856 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 09:02:58 crc kubenswrapper[4947]: W1203 09:02:58.995405 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46143c9f_b587_4c9c_85ce_70c8780fbc87.slice/crio-c88754abb311635bf20c886ae7a48a546a700baa73e89353842cf60f01a1a196 WatchSource:0}: Error finding container c88754abb311635bf20c886ae7a48a546a700baa73e89353842cf60f01a1a196: Status 404 returned error can't find the container with id c88754abb311635bf20c886ae7a48a546a700baa73e89353842cf60f01a1a196 Dec 03 09:02:59 crc kubenswrapper[4947]: I1203 09:02:59.680105 4947 generic.go:334] "Generic (PLEG): container finished" podID="b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" containerID="56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054" exitCode=0 Dec 03 09:02:59 crc kubenswrapper[4947]: I1203 09:02:59.680204 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" event={"ID":"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64","Type":"ContainerDied","Data":"56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054"} Dec 03 09:02:59 crc kubenswrapper[4947]: I1203 09:02:59.680237 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" event={"ID":"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64","Type":"ContainerStarted","Data":"01b2833bbb42e34f1aa08f97b98ec5c62531b63e361f5d9b126dcb470a51b137"} Dec 03 09:02:59 crc kubenswrapper[4947]: I1203 09:02:59.683548 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"46143c9f-b587-4c9c-85ce-70c8780fbc87","Type":"ContainerStarted","Data":"c88754abb311635bf20c886ae7a48a546a700baa73e89353842cf60f01a1a196"} Dec 03 09:03:00 crc kubenswrapper[4947]: I1203 09:03:00.696385 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" event={"ID":"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64","Type":"ContainerStarted","Data":"ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca"} Dec 03 09:03:00 crc kubenswrapper[4947]: I1203 09:03:00.696870 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:03:00 crc kubenswrapper[4947]: I1203 09:03:00.698903 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46143c9f-b587-4c9c-85ce-70c8780fbc87","Type":"ContainerStarted","Data":"8c5f7b20e44ebba8cce5c75b43e002c4af847e70ca8be82b2f7461cbf264ff46"} Dec 03 09:03:00 crc kubenswrapper[4947]: I1203 09:03:00.698932 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46143c9f-b587-4c9c-85ce-70c8780fbc87","Type":"ContainerStarted","Data":"5b4a32d406d166f1285874a8bc32eb5fa91a3c2e3fb0a4cb08914279a423f539"} Dec 03 09:03:00 crc kubenswrapper[4947]: I1203 09:03:00.699019 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 09:03:00 crc kubenswrapper[4947]: I1203 09:03:00.715517 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" podStartSLOduration=3.715485432 podStartE2EDuration="3.715485432s" podCreationTimestamp="2025-12-03 09:02:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:03:00.714092024 +0000 UTC m=+8041.975046450" watchObservedRunningTime="2025-12-03 09:03:00.715485432 +0000 UTC m=+8041.976439858" Dec 03 09:03:00 crc 
kubenswrapper[4947]: I1203 09:03:00.734409 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.734393543 podStartE2EDuration="2.734393543s" podCreationTimestamp="2025-12-03 09:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:03:00.732711018 +0000 UTC m=+8041.993665434" watchObservedRunningTime="2025-12-03 09:03:00.734393543 +0000 UTC m=+8041.995347969" Dec 03 09:03:03 crc kubenswrapper[4947]: I1203 09:03:03.083241 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:03:03 crc kubenswrapper[4947]: I1203 09:03:03.734381 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"90c58260a4ecaf4766384be2b94412a301c15894ed8a02e64bee5f8c77d3f403"} Dec 03 09:03:08 crc kubenswrapper[4947]: I1203 09:03:08.305825 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:03:08 crc kubenswrapper[4947]: I1203 09:03:08.371958 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf689ffdc-g58ps"] Dec 03 09:03:08 crc kubenswrapper[4947]: I1203 09:03:08.372293 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" podUID="1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" containerName="dnsmasq-dns" containerID="cri-o://3b414abcdfb3daa4db937f7c84768642c7bdac8b03bf29133ee1ab425b48846f" gracePeriod=10 Dec 03 09:03:08 crc kubenswrapper[4947]: I1203 09:03:08.784603 4947 generic.go:334] "Generic (PLEG): container finished" podID="1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" 
containerID="3b414abcdfb3daa4db937f7c84768642c7bdac8b03bf29133ee1ab425b48846f" exitCode=0 Dec 03 09:03:08 crc kubenswrapper[4947]: I1203 09:03:08.784679 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" event={"ID":"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f","Type":"ContainerDied","Data":"3b414abcdfb3daa4db937f7c84768642c7bdac8b03bf29133ee1ab425b48846f"} Dec 03 09:03:08 crc kubenswrapper[4947]: I1203 09:03:08.908772 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.100108 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-dns-svc\") pod \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.100175 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-nb\") pod \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.100199 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-sb\") pod \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.100242 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-config\") pod \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " Dec 03 09:03:09 crc 
kubenswrapper[4947]: I1203 09:03:09.100320 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89znx\" (UniqueName: \"kubernetes.io/projected/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-kube-api-access-89znx\") pod \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\" (UID: \"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f\") " Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.122977 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-kube-api-access-89znx" (OuterVolumeSpecName: "kube-api-access-89znx") pod "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" (UID: "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f"). InnerVolumeSpecName "kube-api-access-89znx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.159289 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" (UID: "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.166827 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" (UID: "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.170976 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-config" (OuterVolumeSpecName: "config") pod "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" (UID: "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.188028 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" (UID: "1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.202145 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.202187 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.202198 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.202206 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.202215 
4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89znx\" (UniqueName: \"kubernetes.io/projected/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f-kube-api-access-89znx\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.795818 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" event={"ID":"1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f","Type":"ContainerDied","Data":"0a767f6f53d8f7730a29f9ad7ab3ad4d270284a7f3587233c521991619a7729f"} Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.795898 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf689ffdc-g58ps" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.796137 4947 scope.go:117] "RemoveContainer" containerID="3b414abcdfb3daa4db937f7c84768642c7bdac8b03bf29133ee1ab425b48846f" Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.837357 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf689ffdc-g58ps"] Dec 03 09:03:09 crc kubenswrapper[4947]: I1203 09:03:09.845602 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cf689ffdc-g58ps"] Dec 03 09:03:10 crc kubenswrapper[4947]: I1203 09:03:10.216100 4947 scope.go:117] "RemoveContainer" containerID="8f9db970955d772b66446fd7ddbe2eb1939570cc8e9ee3429c90d39992c36bdf" Dec 03 09:03:10 crc kubenswrapper[4947]: I1203 09:03:10.483085 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 09:03:11 crc kubenswrapper[4947]: I1203 09:03:11.093981 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" path="/var/lib/kubelet/pods/1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f/volumes" Dec 03 09:03:12 crc kubenswrapper[4947]: I1203 09:03:12.883469 4947 scope.go:117] "RemoveContainer" 
containerID="b7e5220e376e445cb5c0539a007dad481d8a330665f4caaf9623a1397719ceb4" Dec 03 09:03:12 crc kubenswrapper[4947]: I1203 09:03:12.921700 4947 scope.go:117] "RemoveContainer" containerID="7cfb3dadbed4c99cdef3765c4aa1651e016194728817eef87f18a8e4fcbb2528" Dec 03 09:03:12 crc kubenswrapper[4947]: I1203 09:03:12.967188 4947 scope.go:117] "RemoveContainer" containerID="8312780366e14dabe54cbc01f62c65fb2b382984e7edadef950f5b9fead43ed3" Dec 03 09:03:13 crc kubenswrapper[4947]: I1203 09:03:13.000093 4947 scope.go:117] "RemoveContainer" containerID="be209a575278e2f465cbf6a02c21dfd76a5a2085fa6ff333faa202b7bcaca4d8" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.376204 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 09:03:27 crc kubenswrapper[4947]: E1203 09:03:27.377374 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" containerName="init" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.377390 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" containerName="init" Dec 03 09:03:27 crc kubenswrapper[4947]: E1203 09:03:27.377420 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" containerName="dnsmasq-dns" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.377427 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" containerName="dnsmasq-dns" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.377670 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc67b75-6adc-43e7-9df0-4a6ff5a70f8f" containerName="dnsmasq-dns" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.378855 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.380653 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.405270 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.556028 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.556088 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-scripts\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.556119 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.556198 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.556263 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqnqb\" (UniqueName: \"kubernetes.io/projected/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-kube-api-access-jqnqb\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.556316 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.658195 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.658275 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqnqb\" (UniqueName: \"kubernetes.io/projected/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-kube-api-access-jqnqb\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.658317 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.658384 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.658401 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-scripts\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.658420 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.658444 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.663795 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-scripts\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.664635 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " 
pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.666224 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.668709 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.676866 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqnqb\" (UniqueName: \"kubernetes.io/projected/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-kube-api-access-jqnqb\") pod \"cinder-scheduler-0\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:27 crc kubenswrapper[4947]: I1203 09:03:27.698814 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 09:03:28 crc kubenswrapper[4947]: I1203 09:03:28.173731 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 09:03:29 crc kubenswrapper[4947]: I1203 09:03:29.013636 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2e6ea2b-b80a-467e-8180-ac735fa47dc3","Type":"ContainerStarted","Data":"369a7dd6795be6355106a94e9256e3da23b48a702d1dd52a7862464e9a88659e"} Dec 03 09:03:29 crc kubenswrapper[4947]: I1203 09:03:29.178159 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 09:03:29 crc kubenswrapper[4947]: I1203 09:03:29.178698 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerName="cinder-api-log" containerID="cri-o://5b4a32d406d166f1285874a8bc32eb5fa91a3c2e3fb0a4cb08914279a423f539" gracePeriod=30 Dec 03 09:03:29 crc kubenswrapper[4947]: I1203 09:03:29.178906 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerName="cinder-api" containerID="cri-o://8c5f7b20e44ebba8cce5c75b43e002c4af847e70ca8be82b2f7461cbf264ff46" gracePeriod=30 Dec 03 09:03:30 crc kubenswrapper[4947]: I1203 09:03:30.024184 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2e6ea2b-b80a-467e-8180-ac735fa47dc3","Type":"ContainerStarted","Data":"c2cfa19a5ab6448031b5c44ab76a47e3e7465dd0f9d31221bc35fc695dfad8ba"} Dec 03 09:03:30 crc kubenswrapper[4947]: I1203 09:03:30.024564 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2e6ea2b-b80a-467e-8180-ac735fa47dc3","Type":"ContainerStarted","Data":"3b0f4d1dfa347c5ee6fd3e1f8a53bdb3eb55a0bd48499cf6c2e7ab12ac90e8dd"} Dec 03 09:03:30 crc 
kubenswrapper[4947]: I1203 09:03:30.026931 4947 generic.go:334] "Generic (PLEG): container finished" podID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerID="5b4a32d406d166f1285874a8bc32eb5fa91a3c2e3fb0a4cb08914279a423f539" exitCode=143 Dec 03 09:03:30 crc kubenswrapper[4947]: I1203 09:03:30.026993 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46143c9f-b587-4c9c-85ce-70c8780fbc87","Type":"ContainerDied","Data":"5b4a32d406d166f1285874a8bc32eb5fa91a3c2e3fb0a4cb08914279a423f539"} Dec 03 09:03:30 crc kubenswrapper[4947]: I1203 09:03:30.060765 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.787650249 podStartE2EDuration="3.060740916s" podCreationTimestamp="2025-12-03 09:03:27 +0000 UTC" firstStartedPulling="2025-12-03 09:03:28.183135359 +0000 UTC m=+8069.444089785" lastFinishedPulling="2025-12-03 09:03:28.456226026 +0000 UTC m=+8069.717180452" observedRunningTime="2025-12-03 09:03:30.04610037 +0000 UTC m=+8071.307054826" watchObservedRunningTime="2025-12-03 09:03:30.060740916 +0000 UTC m=+8071.321695372" Dec 03 09:03:32 crc kubenswrapper[4947]: I1203 09:03:32.698938 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.054432 4947 generic.go:334] "Generic (PLEG): container finished" podID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerID="8c5f7b20e44ebba8cce5c75b43e002c4af847e70ca8be82b2f7461cbf264ff46" exitCode=0 Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.054542 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"46143c9f-b587-4c9c-85ce-70c8780fbc87","Type":"ContainerDied","Data":"8c5f7b20e44ebba8cce5c75b43e002c4af847e70ca8be82b2f7461cbf264ff46"} Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.054755 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"46143c9f-b587-4c9c-85ce-70c8780fbc87","Type":"ContainerDied","Data":"c88754abb311635bf20c886ae7a48a546a700baa73e89353842cf60f01a1a196"} Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.054772 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c88754abb311635bf20c886ae7a48a546a700baa73e89353842cf60f01a1a196" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.122884 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.163840 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data\") pod \"46143c9f-b587-4c9c-85ce-70c8780fbc87\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.164002 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46143c9f-b587-4c9c-85ce-70c8780fbc87-etc-machine-id\") pod \"46143c9f-b587-4c9c-85ce-70c8780fbc87\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.164034 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-combined-ca-bundle\") pod \"46143c9f-b587-4c9c-85ce-70c8780fbc87\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.164105 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46143c9f-b587-4c9c-85ce-70c8780fbc87-logs\") pod \"46143c9f-b587-4c9c-85ce-70c8780fbc87\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " Dec 03 09:03:33 crc 
kubenswrapper[4947]: I1203 09:03:33.164136 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krgzd\" (UniqueName: \"kubernetes.io/projected/46143c9f-b587-4c9c-85ce-70c8780fbc87-kube-api-access-krgzd\") pod \"46143c9f-b587-4c9c-85ce-70c8780fbc87\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.164239 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data-custom\") pod \"46143c9f-b587-4c9c-85ce-70c8780fbc87\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.164285 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-scripts\") pod \"46143c9f-b587-4c9c-85ce-70c8780fbc87\" (UID: \"46143c9f-b587-4c9c-85ce-70c8780fbc87\") " Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.168654 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46143c9f-b587-4c9c-85ce-70c8780fbc87-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "46143c9f-b587-4c9c-85ce-70c8780fbc87" (UID: "46143c9f-b587-4c9c-85ce-70c8780fbc87"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.170133 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46143c9f-b587-4c9c-85ce-70c8780fbc87-logs" (OuterVolumeSpecName: "logs") pod "46143c9f-b587-4c9c-85ce-70c8780fbc87" (UID: "46143c9f-b587-4c9c-85ce-70c8780fbc87"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.173516 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "46143c9f-b587-4c9c-85ce-70c8780fbc87" (UID: "46143c9f-b587-4c9c-85ce-70c8780fbc87"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.191875 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-scripts" (OuterVolumeSpecName: "scripts") pod "46143c9f-b587-4c9c-85ce-70c8780fbc87" (UID: "46143c9f-b587-4c9c-85ce-70c8780fbc87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.201959 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46143c9f-b587-4c9c-85ce-70c8780fbc87-kube-api-access-krgzd" (OuterVolumeSpecName: "kube-api-access-krgzd") pod "46143c9f-b587-4c9c-85ce-70c8780fbc87" (UID: "46143c9f-b587-4c9c-85ce-70c8780fbc87"). InnerVolumeSpecName "kube-api-access-krgzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.228811 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46143c9f-b587-4c9c-85ce-70c8780fbc87" (UID: "46143c9f-b587-4c9c-85ce-70c8780fbc87"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.252675 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data" (OuterVolumeSpecName: "config-data") pod "46143c9f-b587-4c9c-85ce-70c8780fbc87" (UID: "46143c9f-b587-4c9c-85ce-70c8780fbc87"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.266470 4947 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/46143c9f-b587-4c9c-85ce-70c8780fbc87-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.266519 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.266532 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46143c9f-b587-4c9c-85ce-70c8780fbc87-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.266546 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krgzd\" (UniqueName: \"kubernetes.io/projected/46143c9f-b587-4c9c-85ce-70c8780fbc87-kube-api-access-krgzd\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.266560 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.266571 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:33 crc kubenswrapper[4947]: I1203 09:03:33.266581 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46143c9f-b587-4c9c-85ce-70c8780fbc87-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.062246 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.099954 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.117462 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.139446 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 09:03:34 crc kubenswrapper[4947]: E1203 09:03:34.140015 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerName="cinder-api-log" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.140048 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerName="cinder-api-log" Dec 03 09:03:34 crc kubenswrapper[4947]: E1203 09:03:34.140076 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerName="cinder-api" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.140084 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerName="cinder-api" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.140316 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerName="cinder-api" Dec 03 09:03:34 crc kubenswrapper[4947]: 
I1203 09:03:34.140342 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="46143c9f-b587-4c9c-85ce-70c8780fbc87" containerName="cinder-api-log" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.142161 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.144892 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.145428 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.180097 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgp7\" (UniqueName: \"kubernetes.io/projected/7e617284-a25d-439b-ac2d-5b795a63ea06-kube-api-access-hhgp7\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.180142 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.180173 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.180537 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7e617284-a25d-439b-ac2d-5b795a63ea06-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.180637 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-scripts\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.180765 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-config-data\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.180802 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e617284-a25d-439b-ac2d-5b795a63ea06-logs\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.282702 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgp7\" (UniqueName: \"kubernetes.io/projected/7e617284-a25d-439b-ac2d-5b795a63ea06-kube-api-access-hhgp7\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.282764 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 
03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.282799 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.282851 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e617284-a25d-439b-ac2d-5b795a63ea06-etc-machine-id\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.282874 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-scripts\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.282907 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-config-data\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.282923 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e617284-a25d-439b-ac2d-5b795a63ea06-logs\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.283055 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7e617284-a25d-439b-ac2d-5b795a63ea06-etc-machine-id\") pod 
\"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.283394 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e617284-a25d-439b-ac2d-5b795a63ea06-logs\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.286551 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-scripts\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.286994 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-config-data-custom\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.292080 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.292370 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e617284-a25d-439b-ac2d-5b795a63ea06-config-data\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.300078 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgp7\" (UniqueName: 
\"kubernetes.io/projected/7e617284-a25d-439b-ac2d-5b795a63ea06-kube-api-access-hhgp7\") pod \"cinder-api-0\" (UID: \"7e617284-a25d-439b-ac2d-5b795a63ea06\") " pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.462032 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 09:03:34 crc kubenswrapper[4947]: I1203 09:03:34.950338 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 09:03:35 crc kubenswrapper[4947]: I1203 09:03:35.073707 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e617284-a25d-439b-ac2d-5b795a63ea06","Type":"ContainerStarted","Data":"11059dada0320f59c954558c221d987e25b798471f6fd5a3072c396f59132ccc"} Dec 03 09:03:35 crc kubenswrapper[4947]: I1203 09:03:35.095077 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46143c9f-b587-4c9c-85ce-70c8780fbc87" path="/var/lib/kubelet/pods/46143c9f-b587-4c9c-85ce-70c8780fbc87/volumes" Dec 03 09:03:36 crc kubenswrapper[4947]: I1203 09:03:36.083223 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e617284-a25d-439b-ac2d-5b795a63ea06","Type":"ContainerStarted","Data":"063bc1679256d81cd198225b6f6ee4ec7a8ec1d4311f96fabce9b6a82fd00518"} Dec 03 09:03:37 crc kubenswrapper[4947]: I1203 09:03:37.102145 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"7e617284-a25d-439b-ac2d-5b795a63ea06","Type":"ContainerStarted","Data":"d84c99e14089e7eaeb5826d39a016067df7ce9bcd3be994854bf8da74fa34587"} Dec 03 09:03:37 crc kubenswrapper[4947]: I1203 09:03:37.102430 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 09:03:37 crc kubenswrapper[4947]: I1203 09:03:37.134982 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" 
podStartSLOduration=3.134965783 podStartE2EDuration="3.134965783s" podCreationTimestamp="2025-12-03 09:03:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:03:37.120570263 +0000 UTC m=+8078.381524739" watchObservedRunningTime="2025-12-03 09:03:37.134965783 +0000 UTC m=+8078.395920209" Dec 03 09:03:37 crc kubenswrapper[4947]: I1203 09:03:37.911537 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 09:03:37 crc kubenswrapper[4947]: I1203 09:03:37.947599 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 09:03:38 crc kubenswrapper[4947]: I1203 09:03:38.122219 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerName="cinder-scheduler" containerID="cri-o://3b0f4d1dfa347c5ee6fd3e1f8a53bdb3eb55a0bd48499cf6c2e7ab12ac90e8dd" gracePeriod=30 Dec 03 09:03:38 crc kubenswrapper[4947]: I1203 09:03:38.122611 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerName="probe" containerID="cri-o://c2cfa19a5ab6448031b5c44ab76a47e3e7465dd0f9d31221bc35fc695dfad8ba" gracePeriod=30 Dec 03 09:03:39 crc kubenswrapper[4947]: I1203 09:03:39.137701 4947 generic.go:334] "Generic (PLEG): container finished" podID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerID="c2cfa19a5ab6448031b5c44ab76a47e3e7465dd0f9d31221bc35fc695dfad8ba" exitCode=0 Dec 03 09:03:39 crc kubenswrapper[4947]: I1203 09:03:39.137808 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2e6ea2b-b80a-467e-8180-ac735fa47dc3","Type":"ContainerDied","Data":"c2cfa19a5ab6448031b5c44ab76a47e3e7465dd0f9d31221bc35fc695dfad8ba"} Dec 03 09:03:40 crc 
kubenswrapper[4947]: I1203 09:03:40.147348 4947 generic.go:334] "Generic (PLEG): container finished" podID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerID="3b0f4d1dfa347c5ee6fd3e1f8a53bdb3eb55a0bd48499cf6c2e7ab12ac90e8dd" exitCode=0 Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.147393 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2e6ea2b-b80a-467e-8180-ac735fa47dc3","Type":"ContainerDied","Data":"3b0f4d1dfa347c5ee6fd3e1f8a53bdb3eb55a0bd48499cf6c2e7ab12ac90e8dd"} Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.318338 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.444728 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-scripts\") pod \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.444812 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqnqb\" (UniqueName: \"kubernetes.io/projected/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-kube-api-access-jqnqb\") pod \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.444837 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data\") pod \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.444860 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data-custom\") pod \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.444885 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-combined-ca-bundle\") pod \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.444904 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-etc-machine-id\") pod \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\" (UID: \"e2e6ea2b-b80a-467e-8180-ac735fa47dc3\") " Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.445501 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e2e6ea2b-b80a-467e-8180-ac735fa47dc3" (UID: "e2e6ea2b-b80a-467e-8180-ac735fa47dc3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.450479 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e2e6ea2b-b80a-467e-8180-ac735fa47dc3" (UID: "e2e6ea2b-b80a-467e-8180-ac735fa47dc3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.451727 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-scripts" (OuterVolumeSpecName: "scripts") pod "e2e6ea2b-b80a-467e-8180-ac735fa47dc3" (UID: "e2e6ea2b-b80a-467e-8180-ac735fa47dc3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.451737 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-kube-api-access-jqnqb" (OuterVolumeSpecName: "kube-api-access-jqnqb") pod "e2e6ea2b-b80a-467e-8180-ac735fa47dc3" (UID: "e2e6ea2b-b80a-467e-8180-ac735fa47dc3"). InnerVolumeSpecName "kube-api-access-jqnqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.524309 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2e6ea2b-b80a-467e-8180-ac735fa47dc3" (UID: "e2e6ea2b-b80a-467e-8180-ac735fa47dc3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.546980 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.547004 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqnqb\" (UniqueName: \"kubernetes.io/projected/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-kube-api-access-jqnqb\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.547013 4947 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.547023 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.547030 4947 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.559842 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data" (OuterVolumeSpecName: "config-data") pod "e2e6ea2b-b80a-467e-8180-ac735fa47dc3" (UID: "e2e6ea2b-b80a-467e-8180-ac735fa47dc3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:03:40 crc kubenswrapper[4947]: I1203 09:03:40.648247 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2e6ea2b-b80a-467e-8180-ac735fa47dc3-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.158932 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e2e6ea2b-b80a-467e-8180-ac735fa47dc3","Type":"ContainerDied","Data":"369a7dd6795be6355106a94e9256e3da23b48a702d1dd52a7862464e9a88659e"} Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.159036 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.159362 4947 scope.go:117] "RemoveContainer" containerID="c2cfa19a5ab6448031b5c44ab76a47e3e7465dd0f9d31221bc35fc695dfad8ba" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.200530 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.203184 4947 scope.go:117] "RemoveContainer" containerID="3b0f4d1dfa347c5ee6fd3e1f8a53bdb3eb55a0bd48499cf6c2e7ab12ac90e8dd" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.220397 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.232667 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 09:03:41 crc kubenswrapper[4947]: E1203 09:03:41.233264 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerName="probe" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.233299 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerName="probe" Dec 03 
09:03:41 crc kubenswrapper[4947]: E1203 09:03:41.233343 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerName="cinder-scheduler" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.233353 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerName="cinder-scheduler" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.233622 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerName="probe" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.233650 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" containerName="cinder-scheduler" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.234954 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.237129 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.244350 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.260212 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e73b990b-2218-4b3e-9f09-3200fc4d668d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.260328 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-config-data\") pod \"cinder-scheduler-0\" (UID: 
\"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.260344 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-scripts\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.260383 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.260587 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565jd\" (UniqueName: \"kubernetes.io/projected/e73b990b-2218-4b3e-9f09-3200fc4d668d-kube-api-access-565jd\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.260677 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.361698 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e73b990b-2218-4b3e-9f09-3200fc4d668d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " 
pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.361761 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-config-data\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.361782 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-scripts\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.361826 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.361908 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-565jd\" (UniqueName: \"kubernetes.io/projected/e73b990b-2218-4b3e-9f09-3200fc4d668d-kube-api-access-565jd\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.361953 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.362596 4947 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e73b990b-2218-4b3e-9f09-3200fc4d668d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.366186 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.366763 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.367022 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-config-data\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.367356 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e73b990b-2218-4b3e-9f09-3200fc4d668d-scripts\") pod \"cinder-scheduler-0\" (UID: \"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.378907 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565jd\" (UniqueName: \"kubernetes.io/projected/e73b990b-2218-4b3e-9f09-3200fc4d668d-kube-api-access-565jd\") pod \"cinder-scheduler-0\" (UID: 
\"e73b990b-2218-4b3e-9f09-3200fc4d668d\") " pod="openstack/cinder-scheduler-0" Dec 03 09:03:41 crc kubenswrapper[4947]: I1203 09:03:41.569002 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 09:03:42 crc kubenswrapper[4947]: I1203 09:03:42.049856 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 09:03:42 crc kubenswrapper[4947]: I1203 09:03:42.174677 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e73b990b-2218-4b3e-9f09-3200fc4d668d","Type":"ContainerStarted","Data":"291ade27749bd15e12a624e45b5b4fb91e52b3f58c92c3edfad86cce5065d41e"} Dec 03 09:03:43 crc kubenswrapper[4947]: I1203 09:03:43.098403 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e6ea2b-b80a-467e-8180-ac735fa47dc3" path="/var/lib/kubelet/pods/e2e6ea2b-b80a-467e-8180-ac735fa47dc3/volumes" Dec 03 09:03:43 crc kubenswrapper[4947]: I1203 09:03:43.183281 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e73b990b-2218-4b3e-9f09-3200fc4d668d","Type":"ContainerStarted","Data":"f5d67ee564cedbeb8dde1b2b7e0b9980e9311e8115ff4d11e423e17fc8040fb4"} Dec 03 09:03:44 crc kubenswrapper[4947]: I1203 09:03:44.193326 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e73b990b-2218-4b3e-9f09-3200fc4d668d","Type":"ContainerStarted","Data":"2e299b95569c02d2f3fded7813aaf0216d28dc600533ebd6a34859d8a7ec8af4"} Dec 03 09:03:44 crc kubenswrapper[4947]: I1203 09:03:44.229060 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.229039596 podStartE2EDuration="3.229039596s" podCreationTimestamp="2025-12-03 09:03:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
09:03:44.225904662 +0000 UTC m=+8085.486859088" watchObservedRunningTime="2025-12-03 09:03:44.229039596 +0000 UTC m=+8085.489994022" Dec 03 09:03:46 crc kubenswrapper[4947]: I1203 09:03:46.569800 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 09:03:46 crc kubenswrapper[4947]: I1203 09:03:46.720980 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 09:03:51 crc kubenswrapper[4947]: I1203 09:03:51.786787 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.218511 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-bn5wz"] Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.220029 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.233448 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bn5wz"] Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.314013 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lgwd\" (UniqueName: \"kubernetes.io/projected/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-kube-api-access-9lgwd\") pod \"glance-db-create-bn5wz\" (UID: \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\") " pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.314314 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-operator-scripts\") pod \"glance-db-create-bn5wz\" (UID: \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\") " pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 
09:03:54.329795 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c4e1-account-create-update-c52zf"] Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.330942 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.332703 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.340507 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c4e1-account-create-update-c52zf"] Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.416120 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/714cfec2-f79e-43ee-8b81-54bef9aae50f-operator-scripts\") pod \"glance-c4e1-account-create-update-c52zf\" (UID: \"714cfec2-f79e-43ee-8b81-54bef9aae50f\") " pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.416219 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-operator-scripts\") pod \"glance-db-create-bn5wz\" (UID: \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\") " pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.416283 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lgwd\" (UniqueName: \"kubernetes.io/projected/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-kube-api-access-9lgwd\") pod \"glance-db-create-bn5wz\" (UID: \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\") " pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.416349 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxsbf\" (UniqueName: \"kubernetes.io/projected/714cfec2-f79e-43ee-8b81-54bef9aae50f-kube-api-access-pxsbf\") pod \"glance-c4e1-account-create-update-c52zf\" (UID: \"714cfec2-f79e-43ee-8b81-54bef9aae50f\") " pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.417211 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-operator-scripts\") pod \"glance-db-create-bn5wz\" (UID: \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\") " pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.434407 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lgwd\" (UniqueName: \"kubernetes.io/projected/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-kube-api-access-9lgwd\") pod \"glance-db-create-bn5wz\" (UID: \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\") " pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.518423 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxsbf\" (UniqueName: \"kubernetes.io/projected/714cfec2-f79e-43ee-8b81-54bef9aae50f-kube-api-access-pxsbf\") pod \"glance-c4e1-account-create-update-c52zf\" (UID: \"714cfec2-f79e-43ee-8b81-54bef9aae50f\") " pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.518581 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/714cfec2-f79e-43ee-8b81-54bef9aae50f-operator-scripts\") pod \"glance-c4e1-account-create-update-c52zf\" (UID: \"714cfec2-f79e-43ee-8b81-54bef9aae50f\") " pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 
09:03:54.521113 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/714cfec2-f79e-43ee-8b81-54bef9aae50f-operator-scripts\") pod \"glance-c4e1-account-create-update-c52zf\" (UID: \"714cfec2-f79e-43ee-8b81-54bef9aae50f\") " pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.541353 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.545211 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxsbf\" (UniqueName: \"kubernetes.io/projected/714cfec2-f79e-43ee-8b81-54bef9aae50f-kube-api-access-pxsbf\") pod \"glance-c4e1-account-create-update-c52zf\" (UID: \"714cfec2-f79e-43ee-8b81-54bef9aae50f\") " pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.650201 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:54 crc kubenswrapper[4947]: I1203 09:03:54.885059 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bn5wz"] Dec 03 09:03:55 crc kubenswrapper[4947]: I1203 09:03:55.185901 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c4e1-account-create-update-c52zf"] Dec 03 09:03:55 crc kubenswrapper[4947]: W1203 09:03:55.190146 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod714cfec2_f79e_43ee_8b81_54bef9aae50f.slice/crio-6418f78536edd5d09cd8cd852e1ae63174f9c94f60358a5f2e7a60ecae811a05 WatchSource:0}: Error finding container 6418f78536edd5d09cd8cd852e1ae63174f9c94f60358a5f2e7a60ecae811a05: Status 404 returned error can't find the container with id 6418f78536edd5d09cd8cd852e1ae63174f9c94f60358a5f2e7a60ecae811a05 Dec 03 09:03:55 crc kubenswrapper[4947]: I1203 09:03:55.300373 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c4e1-account-create-update-c52zf" event={"ID":"714cfec2-f79e-43ee-8b81-54bef9aae50f","Type":"ContainerStarted","Data":"6418f78536edd5d09cd8cd852e1ae63174f9c94f60358a5f2e7a60ecae811a05"} Dec 03 09:03:55 crc kubenswrapper[4947]: I1203 09:03:55.302006 4947 generic.go:334] "Generic (PLEG): container finished" podID="6ca62b55-e3c3-42b8-b98b-8147bef1e7db" containerID="32f43fd9f17ce976e5b9032aaaf005518a768ffed6edadcf037c3da77372c212" exitCode=0 Dec 03 09:03:55 crc kubenswrapper[4947]: I1203 09:03:55.302043 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bn5wz" event={"ID":"6ca62b55-e3c3-42b8-b98b-8147bef1e7db","Type":"ContainerDied","Data":"32f43fd9f17ce976e5b9032aaaf005518a768ffed6edadcf037c3da77372c212"} Dec 03 09:03:55 crc kubenswrapper[4947]: I1203 09:03:55.302065 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bn5wz" 
event={"ID":"6ca62b55-e3c3-42b8-b98b-8147bef1e7db","Type":"ContainerStarted","Data":"296d8a24eb4ac6851de42d8209ed2d59c1aa329660aafb0b9b27f4723e7c56d7"} Dec 03 09:03:56 crc kubenswrapper[4947]: I1203 09:03:56.329262 4947 generic.go:334] "Generic (PLEG): container finished" podID="714cfec2-f79e-43ee-8b81-54bef9aae50f" containerID="4b3e08d8388cbeb73d533373158c34d6309e2693962201be40d8c493aaea17b7" exitCode=0 Dec 03 09:03:56 crc kubenswrapper[4947]: I1203 09:03:56.332759 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c4e1-account-create-update-c52zf" event={"ID":"714cfec2-f79e-43ee-8b81-54bef9aae50f","Type":"ContainerDied","Data":"4b3e08d8388cbeb73d533373158c34d6309e2693962201be40d8c493aaea17b7"} Dec 03 09:03:56 crc kubenswrapper[4947]: I1203 09:03:56.707075 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:56 crc kubenswrapper[4947]: I1203 09:03:56.759869 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lgwd\" (UniqueName: \"kubernetes.io/projected/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-kube-api-access-9lgwd\") pod \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\" (UID: \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\") " Dec 03 09:03:56 crc kubenswrapper[4947]: I1203 09:03:56.760017 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-operator-scripts\") pod \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\" (UID: \"6ca62b55-e3c3-42b8-b98b-8147bef1e7db\") " Dec 03 09:03:56 crc kubenswrapper[4947]: I1203 09:03:56.760790 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ca62b55-e3c3-42b8-b98b-8147bef1e7db" (UID: "6ca62b55-e3c3-42b8-b98b-8147bef1e7db"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:03:56 crc kubenswrapper[4947]: I1203 09:03:56.772105 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-kube-api-access-9lgwd" (OuterVolumeSpecName: "kube-api-access-9lgwd") pod "6ca62b55-e3c3-42b8-b98b-8147bef1e7db" (UID: "6ca62b55-e3c3-42b8-b98b-8147bef1e7db"). InnerVolumeSpecName "kube-api-access-9lgwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:03:56 crc kubenswrapper[4947]: I1203 09:03:56.862280 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lgwd\" (UniqueName: \"kubernetes.io/projected/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-kube-api-access-9lgwd\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:56 crc kubenswrapper[4947]: I1203 09:03:56.862310 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ca62b55-e3c3-42b8-b98b-8147bef1e7db-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.341351 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bn5wz" event={"ID":"6ca62b55-e3c3-42b8-b98b-8147bef1e7db","Type":"ContainerDied","Data":"296d8a24eb4ac6851de42d8209ed2d59c1aa329660aafb0b9b27f4723e7c56d7"} Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.341403 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="296d8a24eb4ac6851de42d8209ed2d59c1aa329660aafb0b9b27f4723e7c56d7" Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.341554 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bn5wz" Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.750538 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.778614 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/714cfec2-f79e-43ee-8b81-54bef9aae50f-operator-scripts\") pod \"714cfec2-f79e-43ee-8b81-54bef9aae50f\" (UID: \"714cfec2-f79e-43ee-8b81-54bef9aae50f\") " Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.778803 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxsbf\" (UniqueName: \"kubernetes.io/projected/714cfec2-f79e-43ee-8b81-54bef9aae50f-kube-api-access-pxsbf\") pod \"714cfec2-f79e-43ee-8b81-54bef9aae50f\" (UID: \"714cfec2-f79e-43ee-8b81-54bef9aae50f\") " Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.779062 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/714cfec2-f79e-43ee-8b81-54bef9aae50f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "714cfec2-f79e-43ee-8b81-54bef9aae50f" (UID: "714cfec2-f79e-43ee-8b81-54bef9aae50f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.779579 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/714cfec2-f79e-43ee-8b81-54bef9aae50f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.793831 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714cfec2-f79e-43ee-8b81-54bef9aae50f-kube-api-access-pxsbf" (OuterVolumeSpecName: "kube-api-access-pxsbf") pod "714cfec2-f79e-43ee-8b81-54bef9aae50f" (UID: "714cfec2-f79e-43ee-8b81-54bef9aae50f"). InnerVolumeSpecName "kube-api-access-pxsbf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:03:57 crc kubenswrapper[4947]: I1203 09:03:57.880899 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxsbf\" (UniqueName: \"kubernetes.io/projected/714cfec2-f79e-43ee-8b81-54bef9aae50f-kube-api-access-pxsbf\") on node \"crc\" DevicePath \"\"" Dec 03 09:03:58 crc kubenswrapper[4947]: I1203 09:03:58.354269 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c4e1-account-create-update-c52zf" event={"ID":"714cfec2-f79e-43ee-8b81-54bef9aae50f","Type":"ContainerDied","Data":"6418f78536edd5d09cd8cd852e1ae63174f9c94f60358a5f2e7a60ecae811a05"} Dec 03 09:03:58 crc kubenswrapper[4947]: I1203 09:03:58.354316 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6418f78536edd5d09cd8cd852e1ae63174f9c94f60358a5f2e7a60ecae811a05" Dec 03 09:03:58 crc kubenswrapper[4947]: I1203 09:03:58.354329 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c4e1-account-create-update-c52zf" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.567597 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hk2ws"] Dec 03 09:03:59 crc kubenswrapper[4947]: E1203 09:03:59.568366 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ca62b55-e3c3-42b8-b98b-8147bef1e7db" containerName="mariadb-database-create" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.568381 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ca62b55-e3c3-42b8-b98b-8147bef1e7db" containerName="mariadb-database-create" Dec 03 09:03:59 crc kubenswrapper[4947]: E1203 09:03:59.568420 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="714cfec2-f79e-43ee-8b81-54bef9aae50f" containerName="mariadb-account-create-update" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.568428 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="714cfec2-f79e-43ee-8b81-54bef9aae50f" containerName="mariadb-account-create-update" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.568663 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="714cfec2-f79e-43ee-8b81-54bef9aae50f" containerName="mariadb-account-create-update" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.568684 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ca62b55-e3c3-42b8-b98b-8147bef1e7db" containerName="mariadb-database-create" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.569458 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.574226 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xr42w" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.575129 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.584133 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hk2ws"] Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.613915 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-db-sync-config-data\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.614022 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-combined-ca-bundle\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 
09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.614487 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-config-data\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.615103 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prs9s\" (UniqueName: \"kubernetes.io/projected/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-kube-api-access-prs9s\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.717220 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-db-sync-config-data\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.717310 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-combined-ca-bundle\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.717350 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-config-data\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.717405 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-prs9s\" (UniqueName: \"kubernetes.io/projected/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-kube-api-access-prs9s\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.722985 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-db-sync-config-data\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.723284 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-combined-ca-bundle\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.732012 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-config-data\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.751209 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prs9s\" (UniqueName: \"kubernetes.io/projected/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-kube-api-access-prs9s\") pod \"glance-db-sync-hk2ws\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " pod="openstack/glance-db-sync-hk2ws" Dec 03 09:03:59 crc kubenswrapper[4947]: I1203 09:03:59.887654 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hk2ws" Dec 03 09:04:00 crc kubenswrapper[4947]: I1203 09:04:00.399692 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hk2ws"] Dec 03 09:04:00 crc kubenswrapper[4947]: W1203 09:04:00.404062 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3e13f5e_62e2_4404_90a8_e5d7e47322c6.slice/crio-426b088de4a35d9053128aea5ac57a6c9251b3db6d1c9ec5ca4647cdc99064d4 WatchSource:0}: Error finding container 426b088de4a35d9053128aea5ac57a6c9251b3db6d1c9ec5ca4647cdc99064d4: Status 404 returned error can't find the container with id 426b088de4a35d9053128aea5ac57a6c9251b3db6d1c9ec5ca4647cdc99064d4 Dec 03 09:04:01 crc kubenswrapper[4947]: I1203 09:04:01.394686 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hk2ws" event={"ID":"b3e13f5e-62e2-4404-90a8-e5d7e47322c6","Type":"ContainerStarted","Data":"426b088de4a35d9053128aea5ac57a6c9251b3db6d1c9ec5ca4647cdc99064d4"} Dec 03 09:04:13 crc kubenswrapper[4947]: I1203 09:04:13.118862 4947 scope.go:117] "RemoveContainer" containerID="e748cd90bfa674d6a7540867fa744d3b98638934c547607bcc3c871e1ab835af" Dec 03 09:04:17 crc kubenswrapper[4947]: I1203 09:04:17.573481 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hk2ws" event={"ID":"b3e13f5e-62e2-4404-90a8-e5d7e47322c6","Type":"ContainerStarted","Data":"af483dbc11a06b127181b4ec8975febdf31408555cc20de47bd159ec4139f7dd"} Dec 03 09:04:17 crc kubenswrapper[4947]: I1203 09:04:17.590614 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hk2ws" podStartSLOduration=2.698066456 podStartE2EDuration="18.590597574s" podCreationTimestamp="2025-12-03 09:03:59 +0000 UTC" firstStartedPulling="2025-12-03 09:04:00.406324097 +0000 UTC m=+8101.667278523" lastFinishedPulling="2025-12-03 09:04:16.298855215 +0000 UTC 
m=+8117.559809641" observedRunningTime="2025-12-03 09:04:17.587996695 +0000 UTC m=+8118.848951141" watchObservedRunningTime="2025-12-03 09:04:17.590597574 +0000 UTC m=+8118.851552000" Dec 03 09:04:21 crc kubenswrapper[4947]: I1203 09:04:21.623566 4947 generic.go:334] "Generic (PLEG): container finished" podID="b3e13f5e-62e2-4404-90a8-e5d7e47322c6" containerID="af483dbc11a06b127181b4ec8975febdf31408555cc20de47bd159ec4139f7dd" exitCode=0 Dec 03 09:04:21 crc kubenswrapper[4947]: I1203 09:04:21.623663 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hk2ws" event={"ID":"b3e13f5e-62e2-4404-90a8-e5d7e47322c6","Type":"ContainerDied","Data":"af483dbc11a06b127181b4ec8975febdf31408555cc20de47bd159ec4139f7dd"} Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.031487 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hk2ws" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.161083 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-db-sync-config-data\") pod \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.161354 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-config-data\") pod \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.161393 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-combined-ca-bundle\") pod \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " Dec 03 
09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.161432 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prs9s\" (UniqueName: \"kubernetes.io/projected/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-kube-api-access-prs9s\") pod \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\" (UID: \"b3e13f5e-62e2-4404-90a8-e5d7e47322c6\") " Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.167763 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b3e13f5e-62e2-4404-90a8-e5d7e47322c6" (UID: "b3e13f5e-62e2-4404-90a8-e5d7e47322c6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.176803 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-kube-api-access-prs9s" (OuterVolumeSpecName: "kube-api-access-prs9s") pod "b3e13f5e-62e2-4404-90a8-e5d7e47322c6" (UID: "b3e13f5e-62e2-4404-90a8-e5d7e47322c6"). InnerVolumeSpecName "kube-api-access-prs9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.186770 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3e13f5e-62e2-4404-90a8-e5d7e47322c6" (UID: "b3e13f5e-62e2-4404-90a8-e5d7e47322c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.212758 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-config-data" (OuterVolumeSpecName: "config-data") pod "b3e13f5e-62e2-4404-90a8-e5d7e47322c6" (UID: "b3e13f5e-62e2-4404-90a8-e5d7e47322c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.263978 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.264440 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.265535 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prs9s\" (UniqueName: \"kubernetes.io/projected/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-kube-api-access-prs9s\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.265561 4947 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b3e13f5e-62e2-4404-90a8-e5d7e47322c6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.642411 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hk2ws" event={"ID":"b3e13f5e-62e2-4404-90a8-e5d7e47322c6","Type":"ContainerDied","Data":"426b088de4a35d9053128aea5ac57a6c9251b3db6d1c9ec5ca4647cdc99064d4"} Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.642685 4947 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="426b088de4a35d9053128aea5ac57a6c9251b3db6d1c9ec5ca4647cdc99064d4" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.642450 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hk2ws" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.985225 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 09:04:23 crc kubenswrapper[4947]: E1203 09:04:23.989742 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3e13f5e-62e2-4404-90a8-e5d7e47322c6" containerName="glance-db-sync" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.989766 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3e13f5e-62e2-4404-90a8-e5d7e47322c6" containerName="glance-db-sync" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.990018 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3e13f5e-62e2-4404-90a8-e5d7e47322c6" containerName="glance-db-sync" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.990942 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.993484 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.993674 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xr42w" Dec 03 09:04:23 crc kubenswrapper[4947]: I1203 09:04:23.993670 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.000344 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.087088 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-config-data\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.087255 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-logs\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.087306 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 
09:04:24.087550 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.087579 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqvz6\" (UniqueName: \"kubernetes.io/projected/df01b422-b8a7-4373-966c-76ff6aec224d-kube-api-access-nqvz6\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.087622 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-scripts\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.189620 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-config-data\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.189717 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-logs\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 
09:04:24.189743 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.189835 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.189862 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqvz6\" (UniqueName: \"kubernetes.io/projected/df01b422-b8a7-4373-966c-76ff6aec224d-kube-api-access-nqvz6\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.189884 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-scripts\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.190721 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.191081 4947 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-logs\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.195706 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-scripts\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.206025 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-config-data\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.207358 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.243085 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqvz6\" (UniqueName: \"kubernetes.io/projected/df01b422-b8a7-4373-966c-76ff6aec224d-kube-api-access-nqvz6\") pod \"glance-default-external-api-0\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") " pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.259350 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6df8bf884f-5lqn4"] Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.261122 
4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.278457 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6df8bf884f-5lqn4"] Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.308579 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.316631 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.318475 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.320394 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.354424 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.395103 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-dns-svc\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.395158 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-nb\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc 
kubenswrapper[4947]: I1203 09:04:24.395203 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-sb\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.395260 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-config\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.395281 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zlxj\" (UniqueName: \"kubernetes.io/projected/e1d86fda-5f36-48f2-b250-c399085f94ae-kube-api-access-8zlxj\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.498768 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-dns-svc\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.498820 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-nb\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc 
kubenswrapper[4947]: I1203 09:04:24.498853 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-logs\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.498875 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-config-data\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.498905 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-sb\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.498971 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.498995 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-config\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.499013 
4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zlxj\" (UniqueName: \"kubernetes.io/projected/e1d86fda-5f36-48f2-b250-c399085f94ae-kube-api-access-8zlxj\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.499070 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.499169 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-scripts\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.499211 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5kk7\" (UniqueName: \"kubernetes.io/projected/761c7d03-8af8-4b0a-8fa4-68569918ec05-kube-api-access-x5kk7\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.500562 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-nb\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.500644 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-dns-svc\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.501314 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-config\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.501958 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-sb\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.540668 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zlxj\" (UniqueName: \"kubernetes.io/projected/e1d86fda-5f36-48f2-b250-c399085f94ae-kube-api-access-8zlxj\") pod \"dnsmasq-dns-6df8bf884f-5lqn4\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.603240 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-scripts\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.603294 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5kk7\" (UniqueName: \"kubernetes.io/projected/761c7d03-8af8-4b0a-8fa4-68569918ec05-kube-api-access-x5kk7\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.603323 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-logs\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.603338 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-config-data\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.603383 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.603428 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.604123 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-logs\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.608108 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-scripts\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.608399 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.608577 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-config-data\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.609901 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.628801 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5kk7\" (UniqueName: \"kubernetes.io/projected/761c7d03-8af8-4b0a-8fa4-68569918ec05-kube-api-access-x5kk7\") pod \"glance-default-internal-api-0\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") " pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.629033 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.729555 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:24 crc kubenswrapper[4947]: I1203 09:04:24.989857 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 09:04:25 crc kubenswrapper[4947]: I1203 09:04:25.130383 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6df8bf884f-5lqn4"]
Dec 03 09:04:25 crc kubenswrapper[4947]: W1203 09:04:25.133615 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1d86fda_5f36_48f2_b250_c399085f94ae.slice/crio-e61b4defdcb3425e8a927f10ca3f07b0fa70e8851a4abf89b5e4d15dd63704cf WatchSource:0}: Error finding container e61b4defdcb3425e8a927f10ca3f07b0fa70e8851a4abf89b5e4d15dd63704cf: Status 404 returned error can't find the container with id e61b4defdcb3425e8a927f10ca3f07b0fa70e8851a4abf89b5e4d15dd63704cf
Dec 03 09:04:25 crc kubenswrapper[4947]: I1203 09:04:25.393073 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 09:04:25 crc kubenswrapper[4947]: W1203 09:04:25.404413 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod761c7d03_8af8_4b0a_8fa4_68569918ec05.slice/crio-1b07034489ee129beac1192c2cf1ff305a62d867648ce13a13c45603890958e6 WatchSource:0}: Error finding container 1b07034489ee129beac1192c2cf1ff305a62d867648ce13a13c45603890958e6: Status 404 returned error can't find the container with id 1b07034489ee129beac1192c2cf1ff305a62d867648ce13a13c45603890958e6
Dec 03 09:04:25 crc kubenswrapper[4947]: I1203 09:04:25.481343 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 09:04:25 crc kubenswrapper[4947]: I1203 09:04:25.657748 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df01b422-b8a7-4373-966c-76ff6aec224d","Type":"ContainerStarted","Data":"7c42ea1cdbcba40719a5a59bc91ab01c04e0352a275b3b97fbe1caae95551317"}
Dec 03 09:04:25 crc kubenswrapper[4947]: I1203 09:04:25.659149 4947 generic.go:334] "Generic (PLEG): container finished" podID="e1d86fda-5f36-48f2-b250-c399085f94ae" containerID="a9029b8e902c917b9e1e9d43d07fb55951e7fe78b9c3fe0adc957e96ac354ded" exitCode=0
Dec 03 09:04:25 crc kubenswrapper[4947]: I1203 09:04:25.659220 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" event={"ID":"e1d86fda-5f36-48f2-b250-c399085f94ae","Type":"ContainerDied","Data":"a9029b8e902c917b9e1e9d43d07fb55951e7fe78b9c3fe0adc957e96ac354ded"}
Dec 03 09:04:25 crc kubenswrapper[4947]: I1203 09:04:25.659240 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" event={"ID":"e1d86fda-5f36-48f2-b250-c399085f94ae","Type":"ContainerStarted","Data":"e61b4defdcb3425e8a927f10ca3f07b0fa70e8851a4abf89b5e4d15dd63704cf"}
Dec 03 09:04:25 crc kubenswrapper[4947]: I1203 09:04:25.665040 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"761c7d03-8af8-4b0a-8fa4-68569918ec05","Type":"ContainerStarted","Data":"1b07034489ee129beac1192c2cf1ff305a62d867648ce13a13c45603890958e6"}
Dec 03 09:04:26 crc kubenswrapper[4947]: I1203 09:04:26.675932 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df01b422-b8a7-4373-966c-76ff6aec224d","Type":"ContainerStarted","Data":"3beb70c60e68bf929872d622dda84b695434e1b415aedf90decdfbc8636d723b"}
Dec 03 09:04:26 crc kubenswrapper[4947]: I1203 09:04:26.676575 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df01b422-b8a7-4373-966c-76ff6aec224d","Type":"ContainerStarted","Data":"fac27623464064e5eaad635bafab3fd40e58eb39881c635dfce7570a7e291923"}
Dec 03 09:04:26 crc kubenswrapper[4947]: I1203 09:04:26.676271 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="df01b422-b8a7-4373-966c-76ff6aec224d" containerName="glance-log" containerID="cri-o://fac27623464064e5eaad635bafab3fd40e58eb39881c635dfce7570a7e291923" gracePeriod=30
Dec 03 09:04:26 crc kubenswrapper[4947]: I1203 09:04:26.677067 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="df01b422-b8a7-4373-966c-76ff6aec224d" containerName="glance-httpd" containerID="cri-o://3beb70c60e68bf929872d622dda84b695434e1b415aedf90decdfbc8636d723b" gracePeriod=30
Dec 03 09:04:26 crc kubenswrapper[4947]: I1203 09:04:26.682976 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" event={"ID":"e1d86fda-5f36-48f2-b250-c399085f94ae","Type":"ContainerStarted","Data":"6bd838e415cfd1332088426231560de65e5f9e0c8c47f679e1c4389b55758871"}
Dec 03 09:04:26 crc kubenswrapper[4947]: I1203 09:04:26.684551 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4"
Dec 03 09:04:26 crc kubenswrapper[4947]: I1203 09:04:26.685422 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"761c7d03-8af8-4b0a-8fa4-68569918ec05","Type":"ContainerStarted","Data":"706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858"}
Dec 03 09:04:26 crc kubenswrapper[4947]: I1203 09:04:26.740345 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.730486814 podStartE2EDuration="3.730486814s" podCreationTimestamp="2025-12-03 09:04:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:04:26.725035186 +0000 UTC m=+8127.985989622" watchObservedRunningTime="2025-12-03 09:04:26.730486814 +0000 UTC m=+8127.991441240"
Dec 03 09:04:26 crc kubenswrapper[4947]: I1203 09:04:26.760022 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" podStartSLOduration=2.760001482 podStartE2EDuration="2.760001482s" podCreationTimestamp="2025-12-03 09:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:04:26.751795321 +0000 UTC m=+8128.012749747" watchObservedRunningTime="2025-12-03 09:04:26.760001482 +0000 UTC m=+8128.020955908"
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.697239 4947 generic.go:334] "Generic (PLEG): container finished" podID="df01b422-b8a7-4373-966c-76ff6aec224d" containerID="3beb70c60e68bf929872d622dda84b695434e1b415aedf90decdfbc8636d723b" exitCode=0
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.697530 4947 generic.go:334] "Generic (PLEG): container finished" podID="df01b422-b8a7-4373-966c-76ff6aec224d" containerID="fac27623464064e5eaad635bafab3fd40e58eb39881c635dfce7570a7e291923" exitCode=143
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.697323 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df01b422-b8a7-4373-966c-76ff6aec224d","Type":"ContainerDied","Data":"3beb70c60e68bf929872d622dda84b695434e1b415aedf90decdfbc8636d723b"}
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.697645 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df01b422-b8a7-4373-966c-76ff6aec224d","Type":"ContainerDied","Data":"fac27623464064e5eaad635bafab3fd40e58eb39881c635dfce7570a7e291923"}
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.697663 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"df01b422-b8a7-4373-966c-76ff6aec224d","Type":"ContainerDied","Data":"7c42ea1cdbcba40719a5a59bc91ab01c04e0352a275b3b97fbe1caae95551317"}
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.697677 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c42ea1cdbcba40719a5a59bc91ab01c04e0352a275b3b97fbe1caae95551317"
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.701452 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"761c7d03-8af8-4b0a-8fa4-68569918ec05","Type":"ContainerStarted","Data":"8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513"}
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.718352 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.757142 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.757116694 podStartE2EDuration="3.757116694s" podCreationTimestamp="2025-12-03 09:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:04:27.729950888 +0000 UTC m=+8128.990905324" watchObservedRunningTime="2025-12-03 09:04:27.757116694 +0000 UTC m=+8129.018071130"
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.775278 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.887719 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-logs\") pod \"df01b422-b8a7-4373-966c-76ff6aec224d\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") "
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.887764 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-config-data\") pod \"df01b422-b8a7-4373-966c-76ff6aec224d\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") "
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.887799 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqvz6\" (UniqueName: \"kubernetes.io/projected/df01b422-b8a7-4373-966c-76ff6aec224d-kube-api-access-nqvz6\") pod \"df01b422-b8a7-4373-966c-76ff6aec224d\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") "
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.887817 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-httpd-run\") pod \"df01b422-b8a7-4373-966c-76ff6aec224d\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") "
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.887855 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-scripts\") pod \"df01b422-b8a7-4373-966c-76ff6aec224d\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") "
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.887981 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-combined-ca-bundle\") pod \"df01b422-b8a7-4373-966c-76ff6aec224d\" (UID: \"df01b422-b8a7-4373-966c-76ff6aec224d\") "
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.888680 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df01b422-b8a7-4373-966c-76ff6aec224d" (UID: "df01b422-b8a7-4373-966c-76ff6aec224d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.889747 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-logs" (OuterVolumeSpecName: "logs") pod "df01b422-b8a7-4373-966c-76ff6aec224d" (UID: "df01b422-b8a7-4373-966c-76ff6aec224d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.893339 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-scripts" (OuterVolumeSpecName: "scripts") pod "df01b422-b8a7-4373-966c-76ff6aec224d" (UID: "df01b422-b8a7-4373-966c-76ff6aec224d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.920859 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df01b422-b8a7-4373-966c-76ff6aec224d-kube-api-access-nqvz6" (OuterVolumeSpecName: "kube-api-access-nqvz6") pod "df01b422-b8a7-4373-966c-76ff6aec224d" (UID: "df01b422-b8a7-4373-966c-76ff6aec224d"). InnerVolumeSpecName "kube-api-access-nqvz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.928582 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df01b422-b8a7-4373-966c-76ff6aec224d" (UID: "df01b422-b8a7-4373-966c-76ff6aec224d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.953421 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-config-data" (OuterVolumeSpecName: "config-data") pod "df01b422-b8a7-4373-966c-76ff6aec224d" (UID: "df01b422-b8a7-4373-966c-76ff6aec224d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.989980 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-logs\") on node \"crc\" DevicePath \"\""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.990010 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.990024 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqvz6\" (UniqueName: \"kubernetes.io/projected/df01b422-b8a7-4373-966c-76ff6aec224d-kube-api-access-nqvz6\") on node \"crc\" DevicePath \"\""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.990037 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df01b422-b8a7-4373-966c-76ff6aec224d-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.990047 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 09:04:27 crc kubenswrapper[4947]: I1203 09:04:27.990056 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df01b422-b8a7-4373-966c-76ff6aec224d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.710309 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.748299 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.762699 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.774053 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 09:04:28 crc kubenswrapper[4947]: E1203 09:04:28.774548 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df01b422-b8a7-4373-966c-76ff6aec224d" containerName="glance-httpd"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.774566 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="df01b422-b8a7-4373-966c-76ff6aec224d" containerName="glance-httpd"
Dec 03 09:04:28 crc kubenswrapper[4947]: E1203 09:04:28.774606 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df01b422-b8a7-4373-966c-76ff6aec224d" containerName="glance-log"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.774612 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="df01b422-b8a7-4373-966c-76ff6aec224d" containerName="glance-log"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.774795 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="df01b422-b8a7-4373-966c-76ff6aec224d" containerName="glance-log"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.774817 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="df01b422-b8a7-4373-966c-76ff6aec224d" containerName="glance-httpd"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.775769 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.782075 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.783786 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.905102 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.905386 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxdfd\" (UniqueName: \"kubernetes.io/projected/7b5cf332-361f-4e92-97e2-575f850e1782-kube-api-access-rxdfd\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.905432 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.905538 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-logs\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.905588 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:28 crc kubenswrapper[4947]: I1203 09:04:28.905681 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.006992 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.007102 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxdfd\" (UniqueName: \"kubernetes.io/projected/7b5cf332-361f-4e92-97e2-575f850e1782-kube-api-access-rxdfd\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.007124 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.007148 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-logs\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.007166 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.007195 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.007810 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-logs\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.007868 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.011824 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-scripts\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.012153 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-config-data\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.020783 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.025269 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxdfd\" (UniqueName: \"kubernetes.io/projected/7b5cf332-361f-4e92-97e2-575f850e1782-kube-api-access-rxdfd\") pod \"glance-default-external-api-0\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.097770 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df01b422-b8a7-4373-966c-76ff6aec224d" path="/var/lib/kubelet/pods/df01b422-b8a7-4373-966c-76ff6aec224d/volumes"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.099219 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.637310 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 03 09:04:29 crc kubenswrapper[4947]: W1203 09:04:29.647195 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b5cf332_361f_4e92_97e2_575f850e1782.slice/crio-4e429bcd3a8eb0b1f61156fb7cbe45a1ae2d7dbf2073c36cfa336f5e4b0acea1 WatchSource:0}: Error finding container 4e429bcd3a8eb0b1f61156fb7cbe45a1ae2d7dbf2073c36cfa336f5e4b0acea1: Status 404 returned error can't find the container with id 4e429bcd3a8eb0b1f61156fb7cbe45a1ae2d7dbf2073c36cfa336f5e4b0acea1
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.725324 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5cf332-361f-4e92-97e2-575f850e1782","Type":"ContainerStarted","Data":"4e429bcd3a8eb0b1f61156fb7cbe45a1ae2d7dbf2073c36cfa336f5e4b0acea1"}
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.725474 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerName="glance-log" containerID="cri-o://706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858" gracePeriod=30
Dec 03 09:04:29 crc kubenswrapper[4947]: I1203 09:04:29.725947 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerName="glance-httpd" containerID="cri-o://8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513" gracePeriod=30
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.375919 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.531312 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-scripts\") pod \"761c7d03-8af8-4b0a-8fa4-68569918ec05\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") "
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.531857 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-httpd-run\") pod \"761c7d03-8af8-4b0a-8fa4-68569918ec05\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") "
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.531940 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5kk7\" (UniqueName: \"kubernetes.io/projected/761c7d03-8af8-4b0a-8fa4-68569918ec05-kube-api-access-x5kk7\") pod \"761c7d03-8af8-4b0a-8fa4-68569918ec05\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") "
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.531984 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-config-data\") pod \"761c7d03-8af8-4b0a-8fa4-68569918ec05\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") "
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.532044 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-combined-ca-bundle\") pod \"761c7d03-8af8-4b0a-8fa4-68569918ec05\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") "
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.532104 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-logs\") pod \"761c7d03-8af8-4b0a-8fa4-68569918ec05\" (UID: \"761c7d03-8af8-4b0a-8fa4-68569918ec05\") "
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.532261 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "761c7d03-8af8-4b0a-8fa4-68569918ec05" (UID: "761c7d03-8af8-4b0a-8fa4-68569918ec05"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.532527 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-logs" (OuterVolumeSpecName: "logs") pod "761c7d03-8af8-4b0a-8fa4-68569918ec05" (UID: "761c7d03-8af8-4b0a-8fa4-68569918ec05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.532947 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-httpd-run\") on node \"crc\" DevicePath \"\""
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.532970 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/761c7d03-8af8-4b0a-8fa4-68569918ec05-logs\") on node \"crc\" DevicePath \"\""
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.536568 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761c7d03-8af8-4b0a-8fa4-68569918ec05-kube-api-access-x5kk7" (OuterVolumeSpecName: "kube-api-access-x5kk7") pod "761c7d03-8af8-4b0a-8fa4-68569918ec05" (UID: "761c7d03-8af8-4b0a-8fa4-68569918ec05"). InnerVolumeSpecName "kube-api-access-x5kk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.539043 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-scripts" (OuterVolumeSpecName: "scripts") pod "761c7d03-8af8-4b0a-8fa4-68569918ec05" (UID: "761c7d03-8af8-4b0a-8fa4-68569918ec05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.561458 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "761c7d03-8af8-4b0a-8fa4-68569918ec05" (UID: "761c7d03-8af8-4b0a-8fa4-68569918ec05"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.575586 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-config-data" (OuterVolumeSpecName: "config-data") pod "761c7d03-8af8-4b0a-8fa4-68569918ec05" (UID: "761c7d03-8af8-4b0a-8fa4-68569918ec05"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.634739 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.634788 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.634802 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5kk7\" (UniqueName: \"kubernetes.io/projected/761c7d03-8af8-4b0a-8fa4-68569918ec05-kube-api-access-x5kk7\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.634815 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/761c7d03-8af8-4b0a-8fa4-68569918ec05-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.740928 4947 generic.go:334] "Generic (PLEG): container finished" podID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerID="8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513" exitCode=0 Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.740967 4947 generic.go:334] "Generic (PLEG): container finished" podID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerID="706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858" exitCode=143 Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.741096 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"761c7d03-8af8-4b0a-8fa4-68569918ec05","Type":"ContainerDied","Data":"8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513"} Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 
09:04:30.741131 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"761c7d03-8af8-4b0a-8fa4-68569918ec05","Type":"ContainerDied","Data":"706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858"} Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.741150 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"761c7d03-8af8-4b0a-8fa4-68569918ec05","Type":"ContainerDied","Data":"1b07034489ee129beac1192c2cf1ff305a62d867648ce13a13c45603890958e6"} Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.741200 4947 scope.go:117] "RemoveContainer" containerID="8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.741386 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.748713 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5cf332-361f-4e92-97e2-575f850e1782","Type":"ContainerStarted","Data":"b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663"} Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.748753 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5cf332-361f-4e92-97e2-575f850e1782","Type":"ContainerStarted","Data":"228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07"} Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.775159 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.775136996 podStartE2EDuration="2.775136996s" podCreationTimestamp="2025-12-03 09:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 09:04:30.766712658 +0000 UTC m=+8132.027667084" watchObservedRunningTime="2025-12-03 09:04:30.775136996 +0000 UTC m=+8132.036091422" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.779747 4947 scope.go:117] "RemoveContainer" containerID="706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.797460 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.810073 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.819520 4947 scope.go:117] "RemoveContainer" containerID="8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513" Dec 03 09:04:30 crc kubenswrapper[4947]: E1203 09:04:30.820046 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513\": container with ID starting with 8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513 not found: ID does not exist" containerID="8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.820084 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513"} err="failed to get container status \"8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513\": rpc error: code = NotFound desc = could not find container \"8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513\": container with ID starting with 8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513 not found: ID does not exist" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.820110 4947 scope.go:117] 
"RemoveContainer" containerID="706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858" Dec 03 09:04:30 crc kubenswrapper[4947]: E1203 09:04:30.821099 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858\": container with ID starting with 706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858 not found: ID does not exist" containerID="706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.821133 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858"} err="failed to get container status \"706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858\": rpc error: code = NotFound desc = could not find container \"706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858\": container with ID starting with 706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858 not found: ID does not exist" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.821152 4947 scope.go:117] "RemoveContainer" containerID="8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.821415 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513"} err="failed to get container status \"8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513\": rpc error: code = NotFound desc = could not find container \"8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513\": container with ID starting with 8ac10e0542ee6e6fc91192146a0ccaeb3a1f1aa07e3a572e1dc3942995b8f513 not found: ID does not exist" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.821444 4947 
scope.go:117] "RemoveContainer" containerID="706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.821647 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858"} err="failed to get container status \"706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858\": rpc error: code = NotFound desc = could not find container \"706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858\": container with ID starting with 706374c2a4e6b5b2255aeea28684d308cff0848ac527343fd80c81dae0ea8858 not found: ID does not exist" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.826904 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:04:30 crc kubenswrapper[4947]: E1203 09:04:30.827316 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerName="glance-httpd" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.827335 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerName="glance-httpd" Dec 03 09:04:30 crc kubenswrapper[4947]: E1203 09:04:30.827357 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerName="glance-log" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.827364 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerName="glance-log" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.827613 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerName="glance-httpd" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.827641 4947 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="761c7d03-8af8-4b0a-8fa4-68569918ec05" containerName="glance-log" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.828718 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.831081 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.839933 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.942244 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.942329 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.942401 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrww\" (UniqueName: \"kubernetes.io/projected/f62866b1-13eb-4f40-9be3-168fae89746a-kube-api-access-mvrww\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.942420 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.942466 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:30 crc kubenswrapper[4947]: I1203 09:04:30.942799 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.045004 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.045111 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.045178 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.045229 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrww\" (UniqueName: \"kubernetes.io/projected/f62866b1-13eb-4f40-9be3-168fae89746a-kube-api-access-mvrww\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.045252 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.045284 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.045782 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.046006 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.050298 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.050653 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.050703 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.066273 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrww\" (UniqueName: \"kubernetes.io/projected/f62866b1-13eb-4f40-9be3-168fae89746a-kube-api-access-mvrww\") pod \"glance-default-internal-api-0\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.095820 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761c7d03-8af8-4b0a-8fa4-68569918ec05" path="/var/lib/kubelet/pods/761c7d03-8af8-4b0a-8fa4-68569918ec05/volumes" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.147564 4947 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.689684 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:04:31 crc kubenswrapper[4947]: I1203 09:04:31.764734 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62866b1-13eb-4f40-9be3-168fae89746a","Type":"ContainerStarted","Data":"2f142053a94faa0d43149efd12c97f338acec5df471901bab25e45845b32ec57"} Dec 03 09:04:32 crc kubenswrapper[4947]: I1203 09:04:32.783068 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62866b1-13eb-4f40-9be3-168fae89746a","Type":"ContainerStarted","Data":"3f02f4ca39c082bd13b86c307f0171b08ad3d32035f76189a46d15637932dc87"} Dec 03 09:04:33 crc kubenswrapper[4947]: I1203 09:04:33.794806 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62866b1-13eb-4f40-9be3-168fae89746a","Type":"ContainerStarted","Data":"a25e5fa470e9b2747046dc6fb3121e3d33d66549e440913c334a3f4509f0a763"} Dec 03 09:04:33 crc kubenswrapper[4947]: I1203 09:04:33.816873 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.816852549 podStartE2EDuration="3.816852549s" podCreationTimestamp="2025-12-03 09:04:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:04:33.816148681 +0000 UTC m=+8135.077103117" watchObservedRunningTime="2025-12-03 09:04:33.816852549 +0000 UTC m=+8135.077806985" Dec 03 09:04:34 crc kubenswrapper[4947]: I1203 09:04:34.630777 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:04:34 crc kubenswrapper[4947]: 
I1203 09:04:34.704963 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56cbd99d57-8tnp7"] Dec 03 09:04:34 crc kubenswrapper[4947]: I1203 09:04:34.705199 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" podUID="b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" containerName="dnsmasq-dns" containerID="cri-o://ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca" gracePeriod=10 Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.709793 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.815110 4947 generic.go:334] "Generic (PLEG): container finished" podID="b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" containerID="ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca" exitCode=0 Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.815150 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" event={"ID":"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64","Type":"ContainerDied","Data":"ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca"} Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.815178 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" event={"ID":"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64","Type":"ContainerDied","Data":"01b2833bbb42e34f1aa08f97b98ec5c62531b63e361f5d9b126dcb470a51b137"} Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.815184 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56cbd99d57-8tnp7" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.815197 4947 scope.go:117] "RemoveContainer" containerID="ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.834415 4947 scope.go:117] "RemoveContainer" containerID="56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.854206 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-nb\") pod \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.854290 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s84p\" (UniqueName: \"kubernetes.io/projected/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-kube-api-access-7s84p\") pod \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.854316 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-sb\") pod \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.855266 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-config\") pod \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.855393 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-dns-svc\") pod \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\" (UID: \"b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64\") " Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.856947 4947 scope.go:117] "RemoveContainer" containerID="ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca" Dec 03 09:04:35 crc kubenswrapper[4947]: E1203 09:04:35.857380 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca\": container with ID starting with ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca not found: ID does not exist" containerID="ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.857427 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca"} err="failed to get container status \"ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca\": rpc error: code = NotFound desc = could not find container \"ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca\": container with ID starting with ac00a368496b2036e7ce63fded5e5b14e49199f61b9d07fbeacb8c831ee9d4ca not found: ID does not exist" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.857453 4947 scope.go:117] "RemoveContainer" containerID="56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054" Dec 03 09:04:35 crc kubenswrapper[4947]: E1203 09:04:35.857967 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054\": container with ID starting with 56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054 not found: ID does not exist" 
containerID="56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.858019 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054"} err="failed to get container status \"56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054\": rpc error: code = NotFound desc = could not find container \"56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054\": container with ID starting with 56906a5680e3db102c21703b68397be45729af3fa2b348f0311420fe14090054 not found: ID does not exist" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.860569 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-kube-api-access-7s84p" (OuterVolumeSpecName: "kube-api-access-7s84p") pod "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" (UID: "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64"). InnerVolumeSpecName "kube-api-access-7s84p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.895796 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" (UID: "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.895818 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-config" (OuterVolumeSpecName: "config") pod "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" (UID: "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.898882 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" (UID: "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.899575 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" (UID: "b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.958431 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.958470 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s84p\" (UniqueName: \"kubernetes.io/projected/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-kube-api-access-7s84p\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.958483 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.958560 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-config\") on node \"crc\" DevicePath \"\"" Dec 03 
09:04:35 crc kubenswrapper[4947]: I1203 09:04:35.958597 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:36 crc kubenswrapper[4947]: I1203 09:04:36.153299 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56cbd99d57-8tnp7"] Dec 03 09:04:36 crc kubenswrapper[4947]: I1203 09:04:36.160927 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56cbd99d57-8tnp7"] Dec 03 09:04:37 crc kubenswrapper[4947]: I1203 09:04:37.108825 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" path="/var/lib/kubelet/pods/b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64/volumes" Dec 03 09:04:39 crc kubenswrapper[4947]: I1203 09:04:39.099862 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 09:04:39 crc kubenswrapper[4947]: I1203 09:04:39.100180 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 09:04:39 crc kubenswrapper[4947]: I1203 09:04:39.128470 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 09:04:39 crc kubenswrapper[4947]: I1203 09:04:39.141771 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 09:04:39 crc kubenswrapper[4947]: I1203 09:04:39.849005 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 09:04:39 crc kubenswrapper[4947]: I1203 09:04:39.849058 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 09:04:41 crc kubenswrapper[4947]: I1203 09:04:41.148073 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:41 crc kubenswrapper[4947]: I1203 09:04:41.148708 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:41 crc kubenswrapper[4947]: I1203 09:04:41.198026 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:41 crc kubenswrapper[4947]: I1203 09:04:41.206458 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:41 crc kubenswrapper[4947]: I1203 09:04:41.865403 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:41 crc kubenswrapper[4947]: I1203 09:04:41.865943 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:42 crc kubenswrapper[4947]: I1203 09:04:42.893308 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 09:04:42 crc kubenswrapper[4947]: I1203 09:04:42.893440 4947 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 09:04:42 crc kubenswrapper[4947]: I1203 09:04:42.909230 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 09:04:44 crc kubenswrapper[4947]: I1203 09:04:44.020230 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 09:04:44 crc kubenswrapper[4947]: I1203 09:04:44.021538 4947 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 09:04:44 crc kubenswrapper[4947]: I1203 09:04:44.033328 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.232260 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-mbcvh"] Dec 03 09:04:50 crc kubenswrapper[4947]: E1203 09:04:50.234516 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" containerName="init" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.234601 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" containerName="init" Dec 03 09:04:50 crc kubenswrapper[4947]: E1203 09:04:50.234673 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" containerName="dnsmasq-dns" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.234731 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" containerName="dnsmasq-dns" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.234994 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c8d27f-7d52-4b59-a1e5-e0e182f9ea64" containerName="dnsmasq-dns" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.235736 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.242833 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ef07-account-create-update-vlczq"] Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.244907 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.246880 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.251009 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mbcvh"] Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.271668 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ef07-account-create-update-vlczq"] Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.332627 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx748\" (UniqueName: \"kubernetes.io/projected/8f4b85c2-6f17-4b45-adaf-7600a962658a-kube-api-access-wx748\") pod \"placement-ef07-account-create-update-vlczq\" (UID: \"8f4b85c2-6f17-4b45-adaf-7600a962658a\") " pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.332695 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f4b85c2-6f17-4b45-adaf-7600a962658a-operator-scripts\") pod \"placement-ef07-account-create-update-vlczq\" (UID: \"8f4b85c2-6f17-4b45-adaf-7600a962658a\") " pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.332975 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skljf\" (UniqueName: \"kubernetes.io/projected/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-kube-api-access-skljf\") pod \"placement-db-create-mbcvh\" (UID: \"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\") " pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.333209 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-operator-scripts\") pod \"placement-db-create-mbcvh\" (UID: \"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\") " pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.434459 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx748\" (UniqueName: \"kubernetes.io/projected/8f4b85c2-6f17-4b45-adaf-7600a962658a-kube-api-access-wx748\") pod \"placement-ef07-account-create-update-vlczq\" (UID: \"8f4b85c2-6f17-4b45-adaf-7600a962658a\") " pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.434734 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f4b85c2-6f17-4b45-adaf-7600a962658a-operator-scripts\") pod \"placement-ef07-account-create-update-vlczq\" (UID: \"8f4b85c2-6f17-4b45-adaf-7600a962658a\") " pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.434855 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skljf\" (UniqueName: \"kubernetes.io/projected/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-kube-api-access-skljf\") pod \"placement-db-create-mbcvh\" (UID: \"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\") " pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.434930 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-operator-scripts\") pod \"placement-db-create-mbcvh\" (UID: \"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\") " pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 
09:04:50.435755 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f4b85c2-6f17-4b45-adaf-7600a962658a-operator-scripts\") pod \"placement-ef07-account-create-update-vlczq\" (UID: \"8f4b85c2-6f17-4b45-adaf-7600a962658a\") " pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.435819 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-operator-scripts\") pod \"placement-db-create-mbcvh\" (UID: \"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\") " pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.458939 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx748\" (UniqueName: \"kubernetes.io/projected/8f4b85c2-6f17-4b45-adaf-7600a962658a-kube-api-access-wx748\") pod \"placement-ef07-account-create-update-vlczq\" (UID: \"8f4b85c2-6f17-4b45-adaf-7600a962658a\") " pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.464097 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skljf\" (UniqueName: \"kubernetes.io/projected/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-kube-api-access-skljf\") pod \"placement-db-create-mbcvh\" (UID: \"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\") " pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.578871 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.588455 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.946248 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ef07-account-create-update-vlczq"] Dec 03 09:04:50 crc kubenswrapper[4947]: I1203 09:04:50.955562 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ef07-account-create-update-vlczq" event={"ID":"8f4b85c2-6f17-4b45-adaf-7600a962658a","Type":"ContainerStarted","Data":"21bc44f39206869bf81f7f31968260cfc1d4ed9680a4252fff598ef207bfd347"} Dec 03 09:04:51 crc kubenswrapper[4947]: I1203 09:04:51.197520 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-mbcvh"] Dec 03 09:04:51 crc kubenswrapper[4947]: I1203 09:04:51.967368 4947 generic.go:334] "Generic (PLEG): container finished" podID="d8da5d95-8a33-43fe-b1a1-23888f1c8a3b" containerID="fbc39a2968baf632859177722f2b34ce11d613f0d3220f1bf64f257b52b4a1b3" exitCode=0 Dec 03 09:04:51 crc kubenswrapper[4947]: I1203 09:04:51.967456 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mbcvh" event={"ID":"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b","Type":"ContainerDied","Data":"fbc39a2968baf632859177722f2b34ce11d613f0d3220f1bf64f257b52b4a1b3"} Dec 03 09:04:51 crc kubenswrapper[4947]: I1203 09:04:51.967931 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mbcvh" event={"ID":"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b","Type":"ContainerStarted","Data":"289d0150fff867ed5111841c97683d1c1ed73356f2fbab45b35bc55eb8cb55c3"} Dec 03 09:04:51 crc kubenswrapper[4947]: I1203 09:04:51.971308 4947 generic.go:334] "Generic (PLEG): container finished" podID="8f4b85c2-6f17-4b45-adaf-7600a962658a" containerID="e90162baaca1f0b5bad71f0f8f5a4ed750636334bd672b786e9f6185f7e70e5b" exitCode=0 Dec 03 09:04:51 crc kubenswrapper[4947]: I1203 09:04:51.971342 4947 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/placement-ef07-account-create-update-vlczq" event={"ID":"8f4b85c2-6f17-4b45-adaf-7600a962658a","Type":"ContainerDied","Data":"e90162baaca1f0b5bad71f0f8f5a4ed750636334bd672b786e9f6185f7e70e5b"} Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.463217 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.470984 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.600639 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wx748\" (UniqueName: \"kubernetes.io/projected/8f4b85c2-6f17-4b45-adaf-7600a962658a-kube-api-access-wx748\") pod \"8f4b85c2-6f17-4b45-adaf-7600a962658a\" (UID: \"8f4b85c2-6f17-4b45-adaf-7600a962658a\") " Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.600694 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f4b85c2-6f17-4b45-adaf-7600a962658a-operator-scripts\") pod \"8f4b85c2-6f17-4b45-adaf-7600a962658a\" (UID: \"8f4b85c2-6f17-4b45-adaf-7600a962658a\") " Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.600773 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skljf\" (UniqueName: \"kubernetes.io/projected/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-kube-api-access-skljf\") pod \"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\" (UID: \"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\") " Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.600901 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-operator-scripts\") pod 
\"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\" (UID: \"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b\") " Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.601733 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f4b85c2-6f17-4b45-adaf-7600a962658a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f4b85c2-6f17-4b45-adaf-7600a962658a" (UID: "8f4b85c2-6f17-4b45-adaf-7600a962658a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.601882 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8da5d95-8a33-43fe-b1a1-23888f1c8a3b" (UID: "d8da5d95-8a33-43fe-b1a1-23888f1c8a3b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.609200 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-kube-api-access-skljf" (OuterVolumeSpecName: "kube-api-access-skljf") pod "d8da5d95-8a33-43fe-b1a1-23888f1c8a3b" (UID: "d8da5d95-8a33-43fe-b1a1-23888f1c8a3b"). InnerVolumeSpecName "kube-api-access-skljf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.610763 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f4b85c2-6f17-4b45-adaf-7600a962658a-kube-api-access-wx748" (OuterVolumeSpecName: "kube-api-access-wx748") pod "8f4b85c2-6f17-4b45-adaf-7600a962658a" (UID: "8f4b85c2-6f17-4b45-adaf-7600a962658a"). InnerVolumeSpecName "kube-api-access-wx748". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.703281 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skljf\" (UniqueName: \"kubernetes.io/projected/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-kube-api-access-skljf\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.703803 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.703882 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wx748\" (UniqueName: \"kubernetes.io/projected/8f4b85c2-6f17-4b45-adaf-7600a962658a-kube-api-access-wx748\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.703991 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f4b85c2-6f17-4b45-adaf-7600a962658a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.991044 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ef07-account-create-update-vlczq" event={"ID":"8f4b85c2-6f17-4b45-adaf-7600a962658a","Type":"ContainerDied","Data":"21bc44f39206869bf81f7f31968260cfc1d4ed9680a4252fff598ef207bfd347"} Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.991087 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21bc44f39206869bf81f7f31968260cfc1d4ed9680a4252fff598ef207bfd347" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.991143 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ef07-account-create-update-vlczq" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.993467 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-mbcvh" event={"ID":"d8da5d95-8a33-43fe-b1a1-23888f1c8a3b","Type":"ContainerDied","Data":"289d0150fff867ed5111841c97683d1c1ed73356f2fbab45b35bc55eb8cb55c3"} Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.993558 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="289d0150fff867ed5111841c97683d1c1ed73356f2fbab45b35bc55eb8cb55c3" Dec 03 09:04:53 crc kubenswrapper[4947]: I1203 09:04:53.993581 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-mbcvh" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.555696 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86559d7c89-9459n"] Dec 03 09:04:55 crc kubenswrapper[4947]: E1203 09:04:55.556336 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8da5d95-8a33-43fe-b1a1-23888f1c8a3b" containerName="mariadb-database-create" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.556349 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8da5d95-8a33-43fe-b1a1-23888f1c8a3b" containerName="mariadb-database-create" Dec 03 09:04:55 crc kubenswrapper[4947]: E1203 09:04:55.556372 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f4b85c2-6f17-4b45-adaf-7600a962658a" containerName="mariadb-account-create-update" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.556378 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f4b85c2-6f17-4b45-adaf-7600a962658a" containerName="mariadb-account-create-update" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.556599 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f4b85c2-6f17-4b45-adaf-7600a962658a" 
containerName="mariadb-account-create-update" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.556614 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8da5d95-8a33-43fe-b1a1-23888f1c8a3b" containerName="mariadb-database-create" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.557595 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.574066 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86559d7c89-9459n"] Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.583885 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-vmh4b"] Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.592514 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.596833 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.596911 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.597114 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6lskg" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.621028 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-vmh4b"] Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.642201 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4l5b\" (UniqueName: \"kubernetes.io/projected/22679f47-5751-4f6c-a404-93a57d494ebc-kube-api-access-s4l5b\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " 
pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.642250 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-config\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.642288 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-sb\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.642434 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-nb\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.642476 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-dns-svc\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744293 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-config\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " 
pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744384 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-sb\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744513 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-combined-ca-bundle\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744544 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8be5d69-561e-41cc-bf19-f216936f8c9b-logs\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744575 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzwnf\" (UniqueName: \"kubernetes.io/projected/d8be5d69-561e-41cc-bf19-f216936f8c9b-kube-api-access-qzwnf\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744597 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-nb\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " 
pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744630 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-dns-svc\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744669 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-config-data\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744710 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-scripts\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.744749 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4l5b\" (UniqueName: \"kubernetes.io/projected/22679f47-5751-4f6c-a404-93a57d494ebc-kube-api-access-s4l5b\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.745305 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-config\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: 
I1203 09:04:55.745854 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-nb\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.747853 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-sb\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.748785 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-dns-svc\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.770505 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4l5b\" (UniqueName: \"kubernetes.io/projected/22679f47-5751-4f6c-a404-93a57d494ebc-kube-api-access-s4l5b\") pod \"dnsmasq-dns-86559d7c89-9459n\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.846301 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-config-data\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.846374 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-scripts\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.846480 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-combined-ca-bundle\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.846525 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8be5d69-561e-41cc-bf19-f216936f8c9b-logs\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.846547 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzwnf\" (UniqueName: \"kubernetes.io/projected/d8be5d69-561e-41cc-bf19-f216936f8c9b-kube-api-access-qzwnf\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.847307 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8be5d69-561e-41cc-bf19-f216936f8c9b-logs\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.850640 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-config-data\") pod \"placement-db-sync-vmh4b\" (UID: 
\"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.850721 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-combined-ca-bundle\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.850868 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-scripts\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.863936 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzwnf\" (UniqueName: \"kubernetes.io/projected/d8be5d69-561e-41cc-bf19-f216936f8c9b-kube-api-access-qzwnf\") pod \"placement-db-sync-vmh4b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.875637 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:55 crc kubenswrapper[4947]: I1203 09:04:55.911293 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-vmh4b" Dec 03 09:04:56 crc kubenswrapper[4947]: I1203 09:04:56.351701 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86559d7c89-9459n"] Dec 03 09:04:56 crc kubenswrapper[4947]: I1203 09:04:56.425558 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-vmh4b"] Dec 03 09:04:56 crc kubenswrapper[4947]: W1203 09:04:56.427405 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8be5d69_561e_41cc_bf19_f216936f8c9b.slice/crio-75a3fd30fb7fbf7416cdd21048c00aa7453bf16ca5447049b3113d698bed4c7f WatchSource:0}: Error finding container 75a3fd30fb7fbf7416cdd21048c00aa7453bf16ca5447049b3113d698bed4c7f: Status 404 returned error can't find the container with id 75a3fd30fb7fbf7416cdd21048c00aa7453bf16ca5447049b3113d698bed4c7f Dec 03 09:04:57 crc kubenswrapper[4947]: I1203 09:04:57.034431 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86559d7c89-9459n" event={"ID":"22679f47-5751-4f6c-a404-93a57d494ebc","Type":"ContainerStarted","Data":"fba60e7c2c585a65fdbbdeade94d713bdf4124f64ec57429283004672a30d631"} Dec 03 09:04:57 crc kubenswrapper[4947]: I1203 09:04:57.036814 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vmh4b" event={"ID":"d8be5d69-561e-41cc-bf19-f216936f8c9b","Type":"ContainerStarted","Data":"75a3fd30fb7fbf7416cdd21048c00aa7453bf16ca5447049b3113d698bed4c7f"} Dec 03 09:04:58 crc kubenswrapper[4947]: I1203 09:04:58.047473 4947 generic.go:334] "Generic (PLEG): container finished" podID="22679f47-5751-4f6c-a404-93a57d494ebc" containerID="d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554" exitCode=0 Dec 03 09:04:58 crc kubenswrapper[4947]: I1203 09:04:58.047592 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86559d7c89-9459n" 
event={"ID":"22679f47-5751-4f6c-a404-93a57d494ebc","Type":"ContainerDied","Data":"d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554"} Dec 03 09:04:59 crc kubenswrapper[4947]: I1203 09:04:59.060108 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86559d7c89-9459n" event={"ID":"22679f47-5751-4f6c-a404-93a57d494ebc","Type":"ContainerStarted","Data":"2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813"} Dec 03 09:04:59 crc kubenswrapper[4947]: I1203 09:04:59.060373 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:04:59 crc kubenswrapper[4947]: I1203 09:04:59.079714 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86559d7c89-9459n" podStartSLOduration=4.079698809 podStartE2EDuration="4.079698809s" podCreationTimestamp="2025-12-03 09:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:04:59.079550245 +0000 UTC m=+8160.340504711" watchObservedRunningTime="2025-12-03 09:04:59.079698809 +0000 UTC m=+8160.340653235" Dec 03 09:05:02 crc kubenswrapper[4947]: I1203 09:05:02.094170 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vmh4b" event={"ID":"d8be5d69-561e-41cc-bf19-f216936f8c9b","Type":"ContainerStarted","Data":"945b8d0c53b24f98426655fb060f883839c9bc6cd2098b6510cbdccfd213329b"} Dec 03 09:05:02 crc kubenswrapper[4947]: I1203 09:05:02.113412 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-vmh4b" podStartSLOduration=1.881898262 podStartE2EDuration="7.113395856s" podCreationTimestamp="2025-12-03 09:04:55 +0000 UTC" firstStartedPulling="2025-12-03 09:04:56.429423234 +0000 UTC m=+8157.690377660" lastFinishedPulling="2025-12-03 09:05:01.660920788 +0000 UTC m=+8162.921875254" 
observedRunningTime="2025-12-03 09:05:02.109376617 +0000 UTC m=+8163.370331063" watchObservedRunningTime="2025-12-03 09:05:02.113395856 +0000 UTC m=+8163.374350282" Dec 03 09:05:04 crc kubenswrapper[4947]: I1203 09:05:04.114605 4947 generic.go:334] "Generic (PLEG): container finished" podID="d8be5d69-561e-41cc-bf19-f216936f8c9b" containerID="945b8d0c53b24f98426655fb060f883839c9bc6cd2098b6510cbdccfd213329b" exitCode=0 Dec 03 09:05:04 crc kubenswrapper[4947]: I1203 09:05:04.115076 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vmh4b" event={"ID":"d8be5d69-561e-41cc-bf19-f216936f8c9b","Type":"ContainerDied","Data":"945b8d0c53b24f98426655fb060f883839c9bc6cd2098b6510cbdccfd213329b"} Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.537962 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-vmh4b" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.649649 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-combined-ca-bundle\") pod \"d8be5d69-561e-41cc-bf19-f216936f8c9b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.650052 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzwnf\" (UniqueName: \"kubernetes.io/projected/d8be5d69-561e-41cc-bf19-f216936f8c9b-kube-api-access-qzwnf\") pod \"d8be5d69-561e-41cc-bf19-f216936f8c9b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.650142 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-config-data\") pod \"d8be5d69-561e-41cc-bf19-f216936f8c9b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " Dec 03 
09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.650230 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-scripts\") pod \"d8be5d69-561e-41cc-bf19-f216936f8c9b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.650357 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8be5d69-561e-41cc-bf19-f216936f8c9b-logs\") pod \"d8be5d69-561e-41cc-bf19-f216936f8c9b\" (UID: \"d8be5d69-561e-41cc-bf19-f216936f8c9b\") " Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.651071 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8be5d69-561e-41cc-bf19-f216936f8c9b-logs" (OuterVolumeSpecName: "logs") pod "d8be5d69-561e-41cc-bf19-f216936f8c9b" (UID: "d8be5d69-561e-41cc-bf19-f216936f8c9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.652185 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8be5d69-561e-41cc-bf19-f216936f8c9b-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.655957 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8be5d69-561e-41cc-bf19-f216936f8c9b-kube-api-access-qzwnf" (OuterVolumeSpecName: "kube-api-access-qzwnf") pod "d8be5d69-561e-41cc-bf19-f216936f8c9b" (UID: "d8be5d69-561e-41cc-bf19-f216936f8c9b"). InnerVolumeSpecName "kube-api-access-qzwnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.661252 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-scripts" (OuterVolumeSpecName: "scripts") pod "d8be5d69-561e-41cc-bf19-f216936f8c9b" (UID: "d8be5d69-561e-41cc-bf19-f216936f8c9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.682465 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8be5d69-561e-41cc-bf19-f216936f8c9b" (UID: "d8be5d69-561e-41cc-bf19-f216936f8c9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.683078 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-config-data" (OuterVolumeSpecName: "config-data") pod "d8be5d69-561e-41cc-bf19-f216936f8c9b" (UID: "d8be5d69-561e-41cc-bf19-f216936f8c9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.755563 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.755604 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.755628 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzwnf\" (UniqueName: \"kubernetes.io/projected/d8be5d69-561e-41cc-bf19-f216936f8c9b-kube-api-access-qzwnf\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.755642 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8be5d69-561e-41cc-bf19-f216936f8c9b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.877697 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.937669 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6df8bf884f-5lqn4"] Dec 03 09:05:05 crc kubenswrapper[4947]: I1203 09:05:05.937911 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" podUID="e1d86fda-5f36-48f2-b250-c399085f94ae" containerName="dnsmasq-dns" containerID="cri-o://6bd838e415cfd1332088426231560de65e5f9e0c8c47f679e1c4389b55758871" gracePeriod=10 Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.140299 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-vmh4b" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.141243 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-vmh4b" event={"ID":"d8be5d69-561e-41cc-bf19-f216936f8c9b","Type":"ContainerDied","Data":"75a3fd30fb7fbf7416cdd21048c00aa7453bf16ca5447049b3113d698bed4c7f"} Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.141347 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75a3fd30fb7fbf7416cdd21048c00aa7453bf16ca5447049b3113d698bed4c7f" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.152595 4947 generic.go:334] "Generic (PLEG): container finished" podID="e1d86fda-5f36-48f2-b250-c399085f94ae" containerID="6bd838e415cfd1332088426231560de65e5f9e0c8c47f679e1c4389b55758871" exitCode=0 Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.152650 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" event={"ID":"e1d86fda-5f36-48f2-b250-c399085f94ae","Type":"ContainerDied","Data":"6bd838e415cfd1332088426231560de65e5f9e0c8c47f679e1c4389b55758871"} Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.262594 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-59fd9c98dd-nkt77"] Dec 03 09:05:06 crc kubenswrapper[4947]: E1203 09:05:06.263104 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8be5d69-561e-41cc-bf19-f216936f8c9b" containerName="placement-db-sync" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.263122 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8be5d69-561e-41cc-bf19-f216936f8c9b" containerName="placement-db-sync" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.263341 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8be5d69-561e-41cc-bf19-f216936f8c9b" containerName="placement-db-sync" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.264639 4947 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.268149 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.268450 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.279278 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6lskg" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.288903 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59fd9c98dd-nkt77"] Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.377940 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-config-data\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.378063 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwqtx\" (UniqueName: \"kubernetes.io/projected/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-kube-api-access-zwqtx\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.378135 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-scripts\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 
crc kubenswrapper[4947]: I1203 09:05:06.378156 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-combined-ca-bundle\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.378263 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-logs\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.482119 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-logs\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.482198 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-config-data\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.482248 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwqtx\" (UniqueName: \"kubernetes.io/projected/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-kube-api-access-zwqtx\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.482306 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-scripts\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.482327 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-combined-ca-bundle\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.482680 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-logs\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.485807 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-scripts\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.486112 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-combined-ca-bundle\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.498630 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.499423 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-config-data\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.507321 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwqtx\" (UniqueName: \"kubernetes.io/projected/49837ac9-0624-4e5a-ad0a-8ed6eed35e8d-kube-api-access-zwqtx\") pod \"placement-59fd9c98dd-nkt77\" (UID: \"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d\") " pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.584274 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-config\") pod \"e1d86fda-5f36-48f2-b250-c399085f94ae\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.584380 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-dns-svc\") pod \"e1d86fda-5f36-48f2-b250-c399085f94ae\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.584402 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-sb\") pod \"e1d86fda-5f36-48f2-b250-c399085f94ae\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.584475 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-nb\") pod \"e1d86fda-5f36-48f2-b250-c399085f94ae\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.584513 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zlxj\" (UniqueName: \"kubernetes.io/projected/e1d86fda-5f36-48f2-b250-c399085f94ae-kube-api-access-8zlxj\") pod \"e1d86fda-5f36-48f2-b250-c399085f94ae\" (UID: \"e1d86fda-5f36-48f2-b250-c399085f94ae\") " Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.588781 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1d86fda-5f36-48f2-b250-c399085f94ae-kube-api-access-8zlxj" (OuterVolumeSpecName: "kube-api-access-8zlxj") pod "e1d86fda-5f36-48f2-b250-c399085f94ae" (UID: "e1d86fda-5f36-48f2-b250-c399085f94ae"). InnerVolumeSpecName "kube-api-access-8zlxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.600051 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.687776 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zlxj\" (UniqueName: \"kubernetes.io/projected/e1d86fda-5f36-48f2-b250-c399085f94ae-kube-api-access-8zlxj\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.715906 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-config" (OuterVolumeSpecName: "config") pod "e1d86fda-5f36-48f2-b250-c399085f94ae" (UID: "e1d86fda-5f36-48f2-b250-c399085f94ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.744078 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1d86fda-5f36-48f2-b250-c399085f94ae" (UID: "e1d86fda-5f36-48f2-b250-c399085f94ae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.767181 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1d86fda-5f36-48f2-b250-c399085f94ae" (UID: "e1d86fda-5f36-48f2-b250-c399085f94ae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.783328 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1d86fda-5f36-48f2-b250-c399085f94ae" (UID: "e1d86fda-5f36-48f2-b250-c399085f94ae"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.790690 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.791155 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.791243 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:06 crc kubenswrapper[4947]: I1203 09:05:06.791331 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1d86fda-5f36-48f2-b250-c399085f94ae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:07 crc kubenswrapper[4947]: I1203 09:05:07.164304 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" event={"ID":"e1d86fda-5f36-48f2-b250-c399085f94ae","Type":"ContainerDied","Data":"e61b4defdcb3425e8a927f10ca3f07b0fa70e8851a4abf89b5e4d15dd63704cf"} Dec 03 09:05:07 crc kubenswrapper[4947]: I1203 09:05:07.164628 4947 scope.go:117] "RemoveContainer" containerID="6bd838e415cfd1332088426231560de65e5f9e0c8c47f679e1c4389b55758871" Dec 03 09:05:07 crc kubenswrapper[4947]: I1203 09:05:07.164365 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6df8bf884f-5lqn4" Dec 03 09:05:07 crc kubenswrapper[4947]: I1203 09:05:07.190142 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6df8bf884f-5lqn4"] Dec 03 09:05:07 crc kubenswrapper[4947]: I1203 09:05:07.199254 4947 scope.go:117] "RemoveContainer" containerID="a9029b8e902c917b9e1e9d43d07fb55951e7fe78b9c3fe0adc957e96ac354ded" Dec 03 09:05:07 crc kubenswrapper[4947]: I1203 09:05:07.202964 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6df8bf884f-5lqn4"] Dec 03 09:05:07 crc kubenswrapper[4947]: I1203 09:05:07.222561 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59fd9c98dd-nkt77"] Dec 03 09:05:07 crc kubenswrapper[4947]: W1203 09:05:07.229010 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49837ac9_0624_4e5a_ad0a_8ed6eed35e8d.slice/crio-4d00d7379e90b839f05e88f1f00b3df6ff3647d6e631269e7a7b61a3a947cc14 WatchSource:0}: Error finding container 4d00d7379e90b839f05e88f1f00b3df6ff3647d6e631269e7a7b61a3a947cc14: Status 404 returned error can't find the container with id 4d00d7379e90b839f05e88f1f00b3df6ff3647d6e631269e7a7b61a3a947cc14 Dec 03 09:05:08 crc kubenswrapper[4947]: I1203 09:05:08.176048 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59fd9c98dd-nkt77" event={"ID":"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d","Type":"ContainerStarted","Data":"1383af68b4c1b44881116d175d3216e932a92f3ec3eb3819e02daaa9e9a7f586"} Dec 03 09:05:08 crc kubenswrapper[4947]: I1203 09:05:08.176370 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59fd9c98dd-nkt77" event={"ID":"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d","Type":"ContainerStarted","Data":"3b3cfed9c906ccf602bc90dd6b50609849e0180b626348132db8ab08f0bd6923"} Dec 03 09:05:08 crc kubenswrapper[4947]: I1203 09:05:08.176388 4947 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:08 crc kubenswrapper[4947]: I1203 09:05:08.176397 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59fd9c98dd-nkt77" event={"ID":"49837ac9-0624-4e5a-ad0a-8ed6eed35e8d","Type":"ContainerStarted","Data":"4d00d7379e90b839f05e88f1f00b3df6ff3647d6e631269e7a7b61a3a947cc14"} Dec 03 09:05:08 crc kubenswrapper[4947]: I1203 09:05:08.198929 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-59fd9c98dd-nkt77" podStartSLOduration=2.198913721 podStartE2EDuration="2.198913721s" podCreationTimestamp="2025-12-03 09:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:05:08.192756833 +0000 UTC m=+8169.453711309" watchObservedRunningTime="2025-12-03 09:05:08.198913721 +0000 UTC m=+8169.459868147" Dec 03 09:05:09 crc kubenswrapper[4947]: I1203 09:05:09.094756 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1d86fda-5f36-48f2-b250-c399085f94ae" path="/var/lib/kubelet/pods/e1d86fda-5f36-48f2-b250-c399085f94ae/volumes" Dec 03 09:05:09 crc kubenswrapper[4947]: I1203 09:05:09.185336 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:30 crc kubenswrapper[4947]: I1203 09:05:30.087175 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:05:30 crc kubenswrapper[4947]: I1203 09:05:30.087995 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:05:38 crc kubenswrapper[4947]: I1203 09:05:38.256940 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:38 crc kubenswrapper[4947]: I1203 09:05:38.262572 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59fd9c98dd-nkt77" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.575762 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dkwjq"] Dec 03 09:05:42 crc kubenswrapper[4947]: E1203 09:05:42.576768 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d86fda-5f36-48f2-b250-c399085f94ae" containerName="dnsmasq-dns" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.576787 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d86fda-5f36-48f2-b250-c399085f94ae" containerName="dnsmasq-dns" Dec 03 09:05:42 crc kubenswrapper[4947]: E1203 09:05:42.576832 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1d86fda-5f36-48f2-b250-c399085f94ae" containerName="init" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.576840 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1d86fda-5f36-48f2-b250-c399085f94ae" containerName="init" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.577051 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1d86fda-5f36-48f2-b250-c399085f94ae" containerName="dnsmasq-dns" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.578624 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.604746 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dkwjq"] Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.688544 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-catalog-content\") pod \"certified-operators-dkwjq\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.688606 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74ch\" (UniqueName: \"kubernetes.io/projected/b13e23d7-279d-4d38-b1fd-8bb06aee282e-kube-api-access-w74ch\") pod \"certified-operators-dkwjq\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.688806 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-utilities\") pod \"certified-operators-dkwjq\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.790548 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-utilities\") pod \"certified-operators-dkwjq\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.790613 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-catalog-content\") pod \"certified-operators-dkwjq\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.790639 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w74ch\" (UniqueName: \"kubernetes.io/projected/b13e23d7-279d-4d38-b1fd-8bb06aee282e-kube-api-access-w74ch\") pod \"certified-operators-dkwjq\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.791184 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-utilities\") pod \"certified-operators-dkwjq\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.791449 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-catalog-content\") pod \"certified-operators-dkwjq\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.812092 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74ch\" (UniqueName: \"kubernetes.io/projected/b13e23d7-279d-4d38-b1fd-8bb06aee282e-kube-api-access-w74ch\") pod \"certified-operators-dkwjq\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:42 crc kubenswrapper[4947]: I1203 09:05:42.902655 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:43 crc kubenswrapper[4947]: I1203 09:05:43.491335 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dkwjq"] Dec 03 09:05:43 crc kubenswrapper[4947]: I1203 09:05:43.528879 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkwjq" event={"ID":"b13e23d7-279d-4d38-b1fd-8bb06aee282e","Type":"ContainerStarted","Data":"0b699be864a34d4451785ed4fedee96e01cd6bb4c0b56060a5ba0c2973cdf7da"} Dec 03 09:05:44 crc kubenswrapper[4947]: I1203 09:05:44.543154 4947 generic.go:334] "Generic (PLEG): container finished" podID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerID="3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58" exitCode=0 Dec 03 09:05:44 crc kubenswrapper[4947]: I1203 09:05:44.543244 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkwjq" event={"ID":"b13e23d7-279d-4d38-b1fd-8bb06aee282e","Type":"ContainerDied","Data":"3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58"} Dec 03 09:05:44 crc kubenswrapper[4947]: I1203 09:05:44.547590 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:05:46 crc kubenswrapper[4947]: I1203 09:05:46.563019 4947 generic.go:334] "Generic (PLEG): container finished" podID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerID="ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde" exitCode=0 Dec 03 09:05:46 crc kubenswrapper[4947]: I1203 09:05:46.563312 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkwjq" event={"ID":"b13e23d7-279d-4d38-b1fd-8bb06aee282e","Type":"ContainerDied","Data":"ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde"} Dec 03 09:05:47 crc kubenswrapper[4947]: I1203 09:05:47.578104 4947 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-dkwjq" event={"ID":"b13e23d7-279d-4d38-b1fd-8bb06aee282e","Type":"ContainerStarted","Data":"f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29"} Dec 03 09:05:47 crc kubenswrapper[4947]: I1203 09:05:47.605574 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dkwjq" podStartSLOduration=3.03661145 podStartE2EDuration="5.605557977s" podCreationTimestamp="2025-12-03 09:05:42 +0000 UTC" firstStartedPulling="2025-12-03 09:05:44.546988637 +0000 UTC m=+8205.807943063" lastFinishedPulling="2025-12-03 09:05:47.115935154 +0000 UTC m=+8208.376889590" observedRunningTime="2025-12-03 09:05:47.599738549 +0000 UTC m=+8208.860692975" watchObservedRunningTime="2025-12-03 09:05:47.605557977 +0000 UTC m=+8208.866512403" Dec 03 09:05:52 crc kubenswrapper[4947]: I1203 09:05:52.903639 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:52 crc kubenswrapper[4947]: I1203 09:05:52.904165 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:52 crc kubenswrapper[4947]: I1203 09:05:52.969125 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:53 crc kubenswrapper[4947]: I1203 09:05:53.679476 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:53 crc kubenswrapper[4947]: I1203 09:05:53.721965 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dkwjq"] Dec 03 09:05:55 crc kubenswrapper[4947]: I1203 09:05:55.648061 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dkwjq" 
podUID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerName="registry-server" containerID="cri-o://f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29" gracePeriod=2 Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.168705 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.265095 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-utilities\") pod \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.265424 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w74ch\" (UniqueName: \"kubernetes.io/projected/b13e23d7-279d-4d38-b1fd-8bb06aee282e-kube-api-access-w74ch\") pod \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.265474 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-catalog-content\") pod \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\" (UID: \"b13e23d7-279d-4d38-b1fd-8bb06aee282e\") " Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.268300 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-utilities" (OuterVolumeSpecName: "utilities") pod "b13e23d7-279d-4d38-b1fd-8bb06aee282e" (UID: "b13e23d7-279d-4d38-b1fd-8bb06aee282e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.272876 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13e23d7-279d-4d38-b1fd-8bb06aee282e-kube-api-access-w74ch" (OuterVolumeSpecName: "kube-api-access-w74ch") pod "b13e23d7-279d-4d38-b1fd-8bb06aee282e" (UID: "b13e23d7-279d-4d38-b1fd-8bb06aee282e"). InnerVolumeSpecName "kube-api-access-w74ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.321229 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b13e23d7-279d-4d38-b1fd-8bb06aee282e" (UID: "b13e23d7-279d-4d38-b1fd-8bb06aee282e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.368044 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.368347 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w74ch\" (UniqueName: \"kubernetes.io/projected/b13e23d7-279d-4d38-b1fd-8bb06aee282e-kube-api-access-w74ch\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.368463 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b13e23d7-279d-4d38-b1fd-8bb06aee282e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.660484 4947 generic.go:334] "Generic (PLEG): container finished" podID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" 
containerID="f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29" exitCode=0 Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.660574 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkwjq" event={"ID":"b13e23d7-279d-4d38-b1fd-8bb06aee282e","Type":"ContainerDied","Data":"f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29"} Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.660595 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dkwjq" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.660613 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dkwjq" event={"ID":"b13e23d7-279d-4d38-b1fd-8bb06aee282e","Type":"ContainerDied","Data":"0b699be864a34d4451785ed4fedee96e01cd6bb4c0b56060a5ba0c2973cdf7da"} Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.660644 4947 scope.go:117] "RemoveContainer" containerID="f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.697393 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dkwjq"] Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.705795 4947 scope.go:117] "RemoveContainer" containerID="ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.705904 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dkwjq"] Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.732332 4947 scope.go:117] "RemoveContainer" containerID="3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.778680 4947 scope.go:117] "RemoveContainer" containerID="f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29" Dec 03 
09:05:56 crc kubenswrapper[4947]: E1203 09:05:56.779517 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29\": container with ID starting with f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29 not found: ID does not exist" containerID="f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.779612 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29"} err="failed to get container status \"f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29\": rpc error: code = NotFound desc = could not find container \"f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29\": container with ID starting with f9edd6795d716c3a747877ce7838cfa9b8d47d220f342ae45c2c7bb90cd44d29 not found: ID does not exist" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.779657 4947 scope.go:117] "RemoveContainer" containerID="ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde" Dec 03 09:05:56 crc kubenswrapper[4947]: E1203 09:05:56.780311 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde\": container with ID starting with ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde not found: ID does not exist" containerID="ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.780363 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde"} err="failed to get container status 
\"ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde\": rpc error: code = NotFound desc = could not find container \"ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde\": container with ID starting with ff6807a477c75178b2b6478faca019f6df6dafb03140c6cf916a3878cc574fde not found: ID does not exist" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.780396 4947 scope.go:117] "RemoveContainer" containerID="3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58" Dec 03 09:05:56 crc kubenswrapper[4947]: E1203 09:05:56.780817 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58\": container with ID starting with 3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58 not found: ID does not exist" containerID="3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58" Dec 03 09:05:56 crc kubenswrapper[4947]: I1203 09:05:56.780865 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58"} err="failed to get container status \"3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58\": rpc error: code = NotFound desc = could not find container \"3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58\": container with ID starting with 3bac10b2cd6fc3c6c654b311a91c0955c536c94636667ae0c25d66b096bc5c58 not found: ID does not exist" Dec 03 09:05:57 crc kubenswrapper[4947]: I1203 09:05:57.123877 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" path="/var/lib/kubelet/pods/b13e23d7-279d-4d38-b1fd-8bb06aee282e/volumes" Dec 03 09:06:00 crc kubenswrapper[4947]: I1203 09:06:00.086392 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:06:00 crc kubenswrapper[4947]: I1203 09:06:00.086769 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.402325 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7csgl"] Dec 03 09:06:02 crc kubenswrapper[4947]: E1203 09:06:02.403081 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerName="registry-server" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.403184 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerName="registry-server" Dec 03 09:06:02 crc kubenswrapper[4947]: E1203 09:06:02.403229 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerName="extract-content" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.403237 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerName="extract-content" Dec 03 09:06:02 crc kubenswrapper[4947]: E1203 09:06:02.403264 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerName="extract-utilities" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.403273 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerName="extract-utilities" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.403536 
4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b13e23d7-279d-4d38-b1fd-8bb06aee282e" containerName="registry-server" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.404306 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.412279 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7csgl"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.480085 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qncj\" (UniqueName: \"kubernetes.io/projected/a614093c-3c22-4f0d-916b-9719904fc295-kube-api-access-7qncj\") pod \"nova-api-db-create-7csgl\" (UID: \"a614093c-3c22-4f0d-916b-9719904fc295\") " pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.480249 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a614093c-3c22-4f0d-916b-9719904fc295-operator-scripts\") pod \"nova-api-db-create-7csgl\" (UID: \"a614093c-3c22-4f0d-916b-9719904fc295\") " pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.492902 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-m87jz"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.493997 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.516388 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-m87jz"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.582724 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a614093c-3c22-4f0d-916b-9719904fc295-operator-scripts\") pod \"nova-api-db-create-7csgl\" (UID: \"a614093c-3c22-4f0d-916b-9719904fc295\") " pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.583117 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-operator-scripts\") pod \"nova-cell0-db-create-m87jz\" (UID: \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\") " pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.583230 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh965\" (UniqueName: \"kubernetes.io/projected/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-kube-api-access-xh965\") pod \"nova-cell0-db-create-m87jz\" (UID: \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\") " pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.583265 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qncj\" (UniqueName: \"kubernetes.io/projected/a614093c-3c22-4f0d-916b-9719904fc295-kube-api-access-7qncj\") pod \"nova-api-db-create-7csgl\" (UID: \"a614093c-3c22-4f0d-916b-9719904fc295\") " pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.583687 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/a614093c-3c22-4f0d-916b-9719904fc295-operator-scripts\") pod \"nova-api-db-create-7csgl\" (UID: \"a614093c-3c22-4f0d-916b-9719904fc295\") " pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.618684 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4ec2-account-create-update-f7xgv"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.620682 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.622103 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qncj\" (UniqueName: \"kubernetes.io/projected/a614093c-3c22-4f0d-916b-9719904fc295-kube-api-access-7qncj\") pod \"nova-api-db-create-7csgl\" (UID: \"a614093c-3c22-4f0d-916b-9719904fc295\") " pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.622939 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.638773 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4ec2-account-create-update-f7xgv"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.684927 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3130440-e339-4ca1-9fc6-d4feeac9ae93-operator-scripts\") pod \"nova-api-4ec2-account-create-update-f7xgv\" (UID: \"e3130440-e339-4ca1-9fc6-d4feeac9ae93\") " pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.685002 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh965\" (UniqueName: 
\"kubernetes.io/projected/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-kube-api-access-xh965\") pod \"nova-cell0-db-create-m87jz\" (UID: \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\") " pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.685153 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-operator-scripts\") pod \"nova-cell0-db-create-m87jz\" (UID: \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\") " pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.685185 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pws6m\" (UniqueName: \"kubernetes.io/projected/e3130440-e339-4ca1-9fc6-d4feeac9ae93-kube-api-access-pws6m\") pod \"nova-api-4ec2-account-create-update-f7xgv\" (UID: \"e3130440-e339-4ca1-9fc6-d4feeac9ae93\") " pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.686141 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-operator-scripts\") pod \"nova-cell0-db-create-m87jz\" (UID: \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\") " pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.704988 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell2-db-create-qgb8g"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.705438 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh965\" (UniqueName: \"kubernetes.io/projected/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-kube-api-access-xh965\") pod \"nova-cell0-db-create-m87jz\" (UID: \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\") " pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:02 
crc kubenswrapper[4947]: I1203 09:06:02.706293 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.712334 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-db-create-qgb8g"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.724001 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.786578 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-operator-scripts\") pod \"nova-cell2-db-create-qgb8g\" (UID: \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\") " pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.786682 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdl5\" (UniqueName: \"kubernetes.io/projected/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-kube-api-access-mfdl5\") pod \"nova-cell2-db-create-qgb8g\" (UID: \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\") " pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.786738 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pws6m\" (UniqueName: \"kubernetes.io/projected/e3130440-e339-4ca1-9fc6-d4feeac9ae93-kube-api-access-pws6m\") pod \"nova-api-4ec2-account-create-update-f7xgv\" (UID: \"e3130440-e339-4ca1-9fc6-d4feeac9ae93\") " pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.786799 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e3130440-e339-4ca1-9fc6-d4feeac9ae93-operator-scripts\") pod \"nova-api-4ec2-account-create-update-f7xgv\" (UID: \"e3130440-e339-4ca1-9fc6-d4feeac9ae93\") " pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.787706 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3130440-e339-4ca1-9fc6-d4feeac9ae93-operator-scripts\") pod \"nova-api-4ec2-account-create-update-f7xgv\" (UID: \"e3130440-e339-4ca1-9fc6-d4feeac9ae93\") " pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.816898 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.834919 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pws6m\" (UniqueName: \"kubernetes.io/projected/e3130440-e339-4ca1-9fc6-d4feeac9ae93-kube-api-access-pws6m\") pod \"nova-api-4ec2-account-create-update-f7xgv\" (UID: \"e3130440-e339-4ca1-9fc6-d4feeac9ae93\") " pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.860089 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ec73-account-create-update-zq27k"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.861384 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.865142 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.868600 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ec73-account-create-update-zq27k"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.889789 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdl5\" (UniqueName: \"kubernetes.io/projected/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-kube-api-access-mfdl5\") pod \"nova-cell2-db-create-qgb8g\" (UID: \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\") " pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.890146 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-operator-scripts\") pod \"nova-cell2-db-create-qgb8g\" (UID: \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\") " pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.890828 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-operator-scripts\") pod \"nova-cell2-db-create-qgb8g\" (UID: \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\") " pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.924100 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell3-db-create-ntsgx"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.925328 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.934280 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdl5\" (UniqueName: \"kubernetes.io/projected/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-kube-api-access-mfdl5\") pod \"nova-cell2-db-create-qgb8g\" (UID: \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\") " pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.961360 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-db-create-ntsgx"] Dec 03 09:06:02 crc kubenswrapper[4947]: I1203 09:06:02.965902 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.000595 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-operator-scripts\") pod \"nova-cell3-db-create-ntsgx\" (UID: \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\") " pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.001152 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98cbb\" (UniqueName: \"kubernetes.io/projected/c2f0e73d-81cd-4359-bba3-8fe0147be087-kube-api-access-98cbb\") pod \"nova-cell0-ec73-account-create-update-zq27k\" (UID: \"c2f0e73d-81cd-4359-bba3-8fe0147be087\") " pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.001322 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f0e73d-81cd-4359-bba3-8fe0147be087-operator-scripts\") pod 
\"nova-cell0-ec73-account-create-update-zq27k\" (UID: \"c2f0e73d-81cd-4359-bba3-8fe0147be087\") " pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.001485 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmjs\" (UniqueName: \"kubernetes.io/projected/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-kube-api-access-9mmjs\") pod \"nova-cell3-db-create-ntsgx\" (UID: \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\") " pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.035422 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-6cpvn"] Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.036956 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.054718 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6cpvn"] Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.085461 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.102962 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqvsq\" (UniqueName: \"kubernetes.io/projected/7076b626-52f4-442b-9bc3-96d8a747ddaa-kube-api-access-kqvsq\") pod \"nova-cell1-db-create-6cpvn\" (UID: \"7076b626-52f4-442b-9bc3-96d8a747ddaa\") " pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.103236 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98cbb\" (UniqueName: \"kubernetes.io/projected/c2f0e73d-81cd-4359-bba3-8fe0147be087-kube-api-access-98cbb\") pod \"nova-cell0-ec73-account-create-update-zq27k\" (UID: \"c2f0e73d-81cd-4359-bba3-8fe0147be087\") " pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.103421 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f0e73d-81cd-4359-bba3-8fe0147be087-operator-scripts\") pod \"nova-cell0-ec73-account-create-update-zq27k\" (UID: \"c2f0e73d-81cd-4359-bba3-8fe0147be087\") " pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.103640 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mmjs\" (UniqueName: \"kubernetes.io/projected/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-kube-api-access-9mmjs\") pod \"nova-cell3-db-create-ntsgx\" (UID: \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\") " pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.103780 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-operator-scripts\") pod \"nova-cell3-db-create-ntsgx\" (UID: \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\") " pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.103953 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7076b626-52f4-442b-9bc3-96d8a747ddaa-operator-scripts\") pod \"nova-cell1-db-create-6cpvn\" (UID: \"7076b626-52f4-442b-9bc3-96d8a747ddaa\") " pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.105139 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f0e73d-81cd-4359-bba3-8fe0147be087-operator-scripts\") pod \"nova-cell0-ec73-account-create-update-zq27k\" (UID: \"c2f0e73d-81cd-4359-bba3-8fe0147be087\") " pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.105805 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-operator-scripts\") pod \"nova-cell3-db-create-ntsgx\" (UID: \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\") " pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.133643 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3f1a-account-create-update-xt86d"] Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.135142 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.140747 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.143907 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mmjs\" (UniqueName: \"kubernetes.io/projected/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-kube-api-access-9mmjs\") pod \"nova-cell3-db-create-ntsgx\" (UID: \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\") " pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.155044 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98cbb\" (UniqueName: \"kubernetes.io/projected/c2f0e73d-81cd-4359-bba3-8fe0147be087-kube-api-access-98cbb\") pod \"nova-cell0-ec73-account-create-update-zq27k\" (UID: \"c2f0e73d-81cd-4359-bba3-8fe0147be087\") " pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.159169 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3f1a-account-create-update-xt86d"] Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.216694 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.217046 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-operator-scripts\") pod \"nova-cell1-3f1a-account-create-update-xt86d\" (UID: \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\") " pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.217132 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drprh\" (UniqueName: \"kubernetes.io/projected/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-kube-api-access-drprh\") pod \"nova-cell1-3f1a-account-create-update-xt86d\" (UID: \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\") " pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.217812 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7076b626-52f4-442b-9bc3-96d8a747ddaa-operator-scripts\") pod \"nova-cell1-db-create-6cpvn\" (UID: \"7076b626-52f4-442b-9bc3-96d8a747ddaa\") " pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.217887 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqvsq\" (UniqueName: \"kubernetes.io/projected/7076b626-52f4-442b-9bc3-96d8a747ddaa-kube-api-access-kqvsq\") pod \"nova-cell1-db-create-6cpvn\" (UID: \"7076b626-52f4-442b-9bc3-96d8a747ddaa\") " pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.218430 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7076b626-52f4-442b-9bc3-96d8a747ddaa-operator-scripts\") pod \"nova-cell1-db-create-6cpvn\" (UID: \"7076b626-52f4-442b-9bc3-96d8a747ddaa\") " pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.225104 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell2-87f1-account-create-update-lkw4s"] Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.226873 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.229916 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-db-secret" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.235024 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqvsq\" (UniqueName: \"kubernetes.io/projected/7076b626-52f4-442b-9bc3-96d8a747ddaa-kube-api-access-kqvsq\") pod \"nova-cell1-db-create-6cpvn\" (UID: \"7076b626-52f4-442b-9bc3-96d8a747ddaa\") " pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.247456 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-87f1-account-create-update-lkw4s"] Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.311375 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.319712 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7f2d65-0a76-45f8-add3-9e040c04d500-operator-scripts\") pod \"nova-cell2-87f1-account-create-update-lkw4s\" (UID: \"0a7f2d65-0a76-45f8-add3-9e040c04d500\") " pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.319852 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-operator-scripts\") pod \"nova-cell1-3f1a-account-create-update-xt86d\" (UID: \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\") " pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.319906 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drprh\" (UniqueName: \"kubernetes.io/projected/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-kube-api-access-drprh\") pod \"nova-cell1-3f1a-account-create-update-xt86d\" (UID: \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\") " pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.319955 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm5rn\" (UniqueName: \"kubernetes.io/projected/0a7f2d65-0a76-45f8-add3-9e040c04d500-kube-api-access-dm5rn\") pod \"nova-cell2-87f1-account-create-update-lkw4s\" (UID: \"0a7f2d65-0a76-45f8-add3-9e040c04d500\") " pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.320812 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-operator-scripts\") pod \"nova-cell1-3f1a-account-create-update-xt86d\" (UID: \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\") " pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.354658 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drprh\" (UniqueName: \"kubernetes.io/projected/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-kube-api-access-drprh\") pod \"nova-cell1-3f1a-account-create-update-xt86d\" (UID: \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\") " pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.364326 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.425137 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm5rn\" (UniqueName: \"kubernetes.io/projected/0a7f2d65-0a76-45f8-add3-9e040c04d500-kube-api-access-dm5rn\") pod \"nova-cell2-87f1-account-create-update-lkw4s\" (UID: \"0a7f2d65-0a76-45f8-add3-9e040c04d500\") " pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.425484 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7f2d65-0a76-45f8-add3-9e040c04d500-operator-scripts\") pod \"nova-cell2-87f1-account-create-update-lkw4s\" (UID: \"0a7f2d65-0a76-45f8-add3-9e040c04d500\") " pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.426374 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7f2d65-0a76-45f8-add3-9e040c04d500-operator-scripts\") pod 
\"nova-cell2-87f1-account-create-update-lkw4s\" (UID: \"0a7f2d65-0a76-45f8-add3-9e040c04d500\") " pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.437827 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell3-8ab9-account-create-update-222fx"] Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.439108 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.441677 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell3-db-secret" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.446283 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm5rn\" (UniqueName: \"kubernetes.io/projected/0a7f2d65-0a76-45f8-add3-9e040c04d500-kube-api-access-dm5rn\") pod \"nova-cell2-87f1-account-create-update-lkw4s\" (UID: \"0a7f2d65-0a76-45f8-add3-9e040c04d500\") " pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.451454 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-8ab9-account-create-update-222fx"] Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.470772 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.526780 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-operator-scripts\") pod \"nova-cell3-8ab9-account-create-update-222fx\" (UID: \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\") " pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.527272 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6xm2\" (UniqueName: \"kubernetes.io/projected/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-kube-api-access-x6xm2\") pod \"nova-cell3-8ab9-account-create-update-222fx\" (UID: \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\") " pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.542039 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7csgl"] Dec 03 09:06:03 crc kubenswrapper[4947]: W1203 09:06:03.554372 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda614093c_3c22_4f0d_916b_9719904fc295.slice/crio-e3e393087364dca57c5f92405b997d950cf120687b1b49ef9d13ee9c1ffbce6e WatchSource:0}: Error finding container e3e393087364dca57c5f92405b997d950cf120687b1b49ef9d13ee9c1ffbce6e: Status 404 returned error can't find the container with id e3e393087364dca57c5f92405b997d950cf120687b1b49ef9d13ee9c1ffbce6e Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.561409 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.629707 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-operator-scripts\") pod \"nova-cell3-8ab9-account-create-update-222fx\" (UID: \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\") " pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.630375 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6xm2\" (UniqueName: \"kubernetes.io/projected/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-kube-api-access-x6xm2\") pod \"nova-cell3-8ab9-account-create-update-222fx\" (UID: \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\") " pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.632061 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-operator-scripts\") pod \"nova-cell3-8ab9-account-create-update-222fx\" (UID: \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\") " pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.659668 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6xm2\" (UniqueName: \"kubernetes.io/projected/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-kube-api-access-x6xm2\") pod \"nova-cell3-8ab9-account-create-update-222fx\" (UID: \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\") " pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.683155 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4ec2-account-create-update-f7xgv"] Dec 03 09:06:03 crc kubenswrapper[4947]: 
I1203 09:06:03.695675 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-m87jz"] Dec 03 09:06:03 crc kubenswrapper[4947]: W1203 09:06:03.695933 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3130440_e339_4ca1_9fc6_d4feeac9ae93.slice/crio-59edc04392185784fdc85e6571bcd6f0cdefa2b38ed3de37769a77c53615dda6 WatchSource:0}: Error finding container 59edc04392185784fdc85e6571bcd6f0cdefa2b38ed3de37769a77c53615dda6: Status 404 returned error can't find the container with id 59edc04392185784fdc85e6571bcd6f0cdefa2b38ed3de37769a77c53615dda6 Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.741804 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4ec2-account-create-update-f7xgv" event={"ID":"e3130440-e339-4ca1-9fc6-d4feeac9ae93","Type":"ContainerStarted","Data":"59edc04392185784fdc85e6571bcd6f0cdefa2b38ed3de37769a77c53615dda6"} Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.743622 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7csgl" event={"ID":"a614093c-3c22-4f0d-916b-9719904fc295","Type":"ContainerStarted","Data":"e3e393087364dca57c5f92405b997d950cf120687b1b49ef9d13ee9c1ffbce6e"} Dec 03 09:06:03 crc kubenswrapper[4947]: W1203 09:06:03.753532 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2fcdb22_2b64_44ef_9d4b_5d6cffd4cbc0.slice/crio-edeac9aece85dfa185e141bea03aa4a1107af9b20d2b0ba538a40739a31cf26a WatchSource:0}: Error finding container edeac9aece85dfa185e141bea03aa4a1107af9b20d2b0ba538a40739a31cf26a: Status 404 returned error can't find the container with id edeac9aece85dfa185e141bea03aa4a1107af9b20d2b0ba538a40739a31cf26a Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.768600 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:03 crc kubenswrapper[4947]: I1203 09:06:03.862143 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-db-create-qgb8g"] Dec 03 09:06:03 crc kubenswrapper[4947]: W1203 09:06:03.875466 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2ce9dbe_7fd3_4aac_bffd_6688f7d7ebc1.slice/crio-bffe955c0e34eb08830be7a8c05cd08cf4a402a824c16a4f42428809ccb8a1ce WatchSource:0}: Error finding container bffe955c0e34eb08830be7a8c05cd08cf4a402a824c16a4f42428809ccb8a1ce: Status 404 returned error can't find the container with id bffe955c0e34eb08830be7a8c05cd08cf4a402a824c16a4f42428809ccb8a1ce Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.044235 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6cpvn"] Dec 03 09:06:04 crc kubenswrapper[4947]: W1203 09:06:04.050629 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2f0e73d_81cd_4359_bba3_8fe0147be087.slice/crio-531b15791eec7a0c09eb4c395daf508d8fb9694a03d80fcc0d0bf835383f3f17 WatchSource:0}: Error finding container 531b15791eec7a0c09eb4c395daf508d8fb9694a03d80fcc0d0bf835383f3f17: Status 404 returned error can't find the container with id 531b15791eec7a0c09eb4c395daf508d8fb9694a03d80fcc0d0bf835383f3f17 Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.052121 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ec73-account-create-update-zq27k"] Dec 03 09:06:04 crc kubenswrapper[4947]: W1203 09:06:04.054203 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ab80fcd_7771_41f9_a64c_08e0cacf63c5.slice/crio-76447a67da4b5b1cc408286343a48ffc7f8573e6c5243e08df7af2857f4017a9 WatchSource:0}: 
Error finding container 76447a67da4b5b1cc408286343a48ffc7f8573e6c5243e08df7af2857f4017a9: Status 404 returned error can't find the container with id 76447a67da4b5b1cc408286343a48ffc7f8573e6c5243e08df7af2857f4017a9 Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.060914 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-db-create-ntsgx"] Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.258446 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3f1a-account-create-update-xt86d"] Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.379431 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-8ab9-account-create-update-222fx"] Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.396474 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-87f1-account-create-update-lkw4s"] Dec 03 09:06:04 crc kubenswrapper[4947]: W1203 09:06:04.457748 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf96fcab4_55a8_4b1d_8a24_67fbc1cfafd6.slice/crio-0c5887a077387ee212a6951033a5c394d63bde829622e9ae8376029b35e94bf4 WatchSource:0}: Error finding container 0c5887a077387ee212a6951033a5c394d63bde829622e9ae8376029b35e94bf4: Status 404 returned error can't find the container with id 0c5887a077387ee212a6951033a5c394d63bde829622e9ae8376029b35e94bf4 Dec 03 09:06:04 crc kubenswrapper[4947]: W1203 09:06:04.502527 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a7f2d65_0a76_45f8_add3_9e040c04d500.slice/crio-0a4aa4edbadf263c0c1567d5a4e3b9b6a2529910f30bdbe78b40b22e0c49c496 WatchSource:0}: Error finding container 0a4aa4edbadf263c0c1567d5a4e3b9b6a2529910f30bdbe78b40b22e0c49c496: Status 404 returned error can't find the container with id 0a4aa4edbadf263c0c1567d5a4e3b9b6a2529910f30bdbe78b40b22e0c49c496 Dec 
03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.760081 4947 generic.go:334] "Generic (PLEG): container finished" podID="3ab80fcd-7771-41f9-a64c-08e0cacf63c5" containerID="607078157d2ccf487999a75a36f9d90ec1cff093751322c155d4cdd02678468c" exitCode=0 Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.760171 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-db-create-ntsgx" event={"ID":"3ab80fcd-7771-41f9-a64c-08e0cacf63c5","Type":"ContainerDied","Data":"607078157d2ccf487999a75a36f9d90ec1cff093751322c155d4cdd02678468c"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.760205 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-db-create-ntsgx" event={"ID":"3ab80fcd-7771-41f9-a64c-08e0cacf63c5","Type":"ContainerStarted","Data":"76447a67da4b5b1cc408286343a48ffc7f8573e6c5243e08df7af2857f4017a9"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.770107 4947 generic.go:334] "Generic (PLEG): container finished" podID="c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1" containerID="b20486fd2e1e682323f829f2f1bab9f228faf7a8c65f5cbf27ae7617b13432ba" exitCode=0 Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.770216 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-db-create-qgb8g" event={"ID":"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1","Type":"ContainerDied","Data":"b20486fd2e1e682323f829f2f1bab9f228faf7a8c65f5cbf27ae7617b13432ba"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.770250 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-db-create-qgb8g" event={"ID":"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1","Type":"ContainerStarted","Data":"bffe955c0e34eb08830be7a8c05cd08cf4a402a824c16a4f42428809ccb8a1ce"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.780959 4947 generic.go:334] "Generic (PLEG): container finished" podID="a614093c-3c22-4f0d-916b-9719904fc295" containerID="2fc06c8352a3a245cf483849bc980b7345b7534cd4630f5870be486e35180875" 
exitCode=0 Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.781405 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7csgl" event={"ID":"a614093c-3c22-4f0d-916b-9719904fc295","Type":"ContainerDied","Data":"2fc06c8352a3a245cf483849bc980b7345b7534cd4630f5870be486e35180875"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.786624 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-8ab9-account-create-update-222fx" event={"ID":"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6","Type":"ContainerStarted","Data":"0c5887a077387ee212a6951033a5c394d63bde829622e9ae8376029b35e94bf4"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.798762 4947 generic.go:334] "Generic (PLEG): container finished" podID="e3130440-e339-4ca1-9fc6-d4feeac9ae93" containerID="20b347a296676464532d3e136ab9e37b5ba555aa34bf8d2b8b8c5b90ed0805eb" exitCode=0 Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.798990 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4ec2-account-create-update-f7xgv" event={"ID":"e3130440-e339-4ca1-9fc6-d4feeac9ae93","Type":"ContainerDied","Data":"20b347a296676464532d3e136ab9e37b5ba555aa34bf8d2b8b8c5b90ed0805eb"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.808912 4947 generic.go:334] "Generic (PLEG): container finished" podID="c2f0e73d-81cd-4359-bba3-8fe0147be087" containerID="abad442556398d4818dc4ff4b1dc018c706bea8979247a6c73d4a10393b2a850" exitCode=0 Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.808975 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ec73-account-create-update-zq27k" event={"ID":"c2f0e73d-81cd-4359-bba3-8fe0147be087","Type":"ContainerDied","Data":"abad442556398d4818dc4ff4b1dc018c706bea8979247a6c73d4a10393b2a850"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.809035 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ec73-account-create-update-zq27k" 
event={"ID":"c2f0e73d-81cd-4359-bba3-8fe0147be087","Type":"ContainerStarted","Data":"531b15791eec7a0c09eb4c395daf508d8fb9694a03d80fcc0d0bf835383f3f17"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.815875 4947 generic.go:334] "Generic (PLEG): container finished" podID="7076b626-52f4-442b-9bc3-96d8a747ddaa" containerID="ce242123f417fb4772b3b8fed2700290cbc93375e0050f7e7c1fa44eaaa2e808" exitCode=0 Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.815993 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6cpvn" event={"ID":"7076b626-52f4-442b-9bc3-96d8a747ddaa","Type":"ContainerDied","Data":"ce242123f417fb4772b3b8fed2700290cbc93375e0050f7e7c1fa44eaaa2e808"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.816035 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6cpvn" event={"ID":"7076b626-52f4-442b-9bc3-96d8a747ddaa","Type":"ContainerStarted","Data":"d4404650580deee18baf7bf68ce3188a7059fc9530441e9b60901ae328c0294e"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.831601 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" event={"ID":"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc","Type":"ContainerStarted","Data":"003e9b6e885452d692e6fa8b60c1ad9eb458229ca885e1464f039da59a01d8b3"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.833876 4947 generic.go:334] "Generic (PLEG): container finished" podID="d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0" containerID="5c0ffeff5caa73c84b3ca1f78f22d28f34af4891cf0f71350003481dbe43d60d" exitCode=0 Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.833933 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m87jz" event={"ID":"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0","Type":"ContainerDied","Data":"5c0ffeff5caa73c84b3ca1f78f22d28f34af4891cf0f71350003481dbe43d60d"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.833957 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m87jz" event={"ID":"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0","Type":"ContainerStarted","Data":"edeac9aece85dfa185e141bea03aa4a1107af9b20d2b0ba538a40739a31cf26a"} Dec 03 09:06:04 crc kubenswrapper[4947]: I1203 09:06:04.835010 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" event={"ID":"0a7f2d65-0a76-45f8-add3-9e040c04d500","Type":"ContainerStarted","Data":"0a4aa4edbadf263c0c1567d5a4e3b9b6a2529910f30bdbe78b40b22e0c49c496"} Dec 03 09:06:05 crc kubenswrapper[4947]: I1203 09:06:05.853923 4947 generic.go:334] "Generic (PLEG): container finished" podID="82669ed0-9ff7-4ebc-a09f-8e31ef7358bc" containerID="e264b0a91480befbce48dfa8a0e4b87bc6959d1bc6c33e2fcc3fdd0c26b59ace" exitCode=0 Dec 03 09:06:05 crc kubenswrapper[4947]: I1203 09:06:05.854048 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" event={"ID":"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc","Type":"ContainerDied","Data":"e264b0a91480befbce48dfa8a0e4b87bc6959d1bc6c33e2fcc3fdd0c26b59ace"} Dec 03 09:06:05 crc kubenswrapper[4947]: I1203 09:06:05.859361 4947 generic.go:334] "Generic (PLEG): container finished" podID="0a7f2d65-0a76-45f8-add3-9e040c04d500" containerID="4496c4d6d39195088b229a9a1fab081db144b46ba7268d070245ca847b4f3426" exitCode=0 Dec 03 09:06:05 crc kubenswrapper[4947]: I1203 09:06:05.859431 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" event={"ID":"0a7f2d65-0a76-45f8-add3-9e040c04d500","Type":"ContainerDied","Data":"4496c4d6d39195088b229a9a1fab081db144b46ba7268d070245ca847b4f3426"} Dec 03 09:06:05 crc kubenswrapper[4947]: I1203 09:06:05.862012 4947 generic.go:334] "Generic (PLEG): container finished" podID="f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6" containerID="aae611f0757d19d89d16a80fe897296ed226f13ed6f810730f1bd02a6ba9cddc" exitCode=0 
Dec 03 09:06:05 crc kubenswrapper[4947]: I1203 09:06:05.862058 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-8ab9-account-create-update-222fx" event={"ID":"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6","Type":"ContainerDied","Data":"aae611f0757d19d89d16a80fe897296ed226f13ed6f810730f1bd02a6ba9cddc"} Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.263583 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.395919 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f0e73d-81cd-4359-bba3-8fe0147be087-operator-scripts\") pod \"c2f0e73d-81cd-4359-bba3-8fe0147be087\" (UID: \"c2f0e73d-81cd-4359-bba3-8fe0147be087\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.396096 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98cbb\" (UniqueName: \"kubernetes.io/projected/c2f0e73d-81cd-4359-bba3-8fe0147be087-kube-api-access-98cbb\") pod \"c2f0e73d-81cd-4359-bba3-8fe0147be087\" (UID: \"c2f0e73d-81cd-4359-bba3-8fe0147be087\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.403300 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2f0e73d-81cd-4359-bba3-8fe0147be087-kube-api-access-98cbb" (OuterVolumeSpecName: "kube-api-access-98cbb") pod "c2f0e73d-81cd-4359-bba3-8fe0147be087" (UID: "c2f0e73d-81cd-4359-bba3-8fe0147be087"). InnerVolumeSpecName "kube-api-access-98cbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.403973 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2f0e73d-81cd-4359-bba3-8fe0147be087-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2f0e73d-81cd-4359-bba3-8fe0147be087" (UID: "c2f0e73d-81cd-4359-bba3-8fe0147be087"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.497844 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f0e73d-81cd-4359-bba3-8fe0147be087-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.498158 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98cbb\" (UniqueName: \"kubernetes.io/projected/c2f0e73d-81cd-4359-bba3-8fe0147be087-kube-api-access-98cbb\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.580443 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.588259 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.609423 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.623440 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.630795 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.643075 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.701128 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3130440-e339-4ca1-9fc6-d4feeac9ae93-operator-scripts\") pod \"e3130440-e339-4ca1-9fc6-d4feeac9ae93\" (UID: \"e3130440-e339-4ca1-9fc6-d4feeac9ae93\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.701439 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mmjs\" (UniqueName: \"kubernetes.io/projected/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-kube-api-access-9mmjs\") pod \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\" (UID: \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.701574 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qncj\" (UniqueName: \"kubernetes.io/projected/a614093c-3c22-4f0d-916b-9719904fc295-kube-api-access-7qncj\") pod \"a614093c-3c22-4f0d-916b-9719904fc295\" (UID: \"a614093c-3c22-4f0d-916b-9719904fc295\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.701672 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfdl5\" (UniqueName: \"kubernetes.io/projected/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-kube-api-access-mfdl5\") pod \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\" (UID: \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.701748 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pws6m\" (UniqueName: \"kubernetes.io/projected/e3130440-e339-4ca1-9fc6-d4feeac9ae93-kube-api-access-pws6m\") pod 
\"e3130440-e339-4ca1-9fc6-d4feeac9ae93\" (UID: \"e3130440-e339-4ca1-9fc6-d4feeac9ae93\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.701814 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7076b626-52f4-442b-9bc3-96d8a747ddaa-operator-scripts\") pod \"7076b626-52f4-442b-9bc3-96d8a747ddaa\" (UID: \"7076b626-52f4-442b-9bc3-96d8a747ddaa\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.701886 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-operator-scripts\") pod \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\" (UID: \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.701975 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-operator-scripts\") pod \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\" (UID: \"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.702063 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-operator-scripts\") pod \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\" (UID: \"3ab80fcd-7771-41f9-a64c-08e0cacf63c5\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.702101 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3130440-e339-4ca1-9fc6-d4feeac9ae93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3130440-e339-4ca1-9fc6-d4feeac9ae93" (UID: "e3130440-e339-4ca1-9fc6-d4feeac9ae93"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.702210 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a614093c-3c22-4f0d-916b-9719904fc295-operator-scripts\") pod \"a614093c-3c22-4f0d-916b-9719904fc295\" (UID: \"a614093c-3c22-4f0d-916b-9719904fc295\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.702283 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh965\" (UniqueName: \"kubernetes.io/projected/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-kube-api-access-xh965\") pod \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\" (UID: \"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.702365 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqvsq\" (UniqueName: \"kubernetes.io/projected/7076b626-52f4-442b-9bc3-96d8a747ddaa-kube-api-access-kqvsq\") pod \"7076b626-52f4-442b-9bc3-96d8a747ddaa\" (UID: \"7076b626-52f4-442b-9bc3-96d8a747ddaa\") " Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.702428 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7076b626-52f4-442b-9bc3-96d8a747ddaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7076b626-52f4-442b-9bc3-96d8a747ddaa" (UID: "7076b626-52f4-442b-9bc3-96d8a747ddaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.702842 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1" (UID: "c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.702951 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ab80fcd-7771-41f9-a64c-08e0cacf63c5" (UID: "3ab80fcd-7771-41f9-a64c-08e0cacf63c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.703050 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3130440-e339-4ca1-9fc6-d4feeac9ae93-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.703117 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7076b626-52f4-442b-9bc3-96d8a747ddaa-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.703181 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.703109 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0" (UID: "d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.703141 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a614093c-3c22-4f0d-916b-9719904fc295-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a614093c-3c22-4f0d-916b-9719904fc295" (UID: "a614093c-3c22-4f0d-916b-9719904fc295"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.704921 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a614093c-3c22-4f0d-916b-9719904fc295-kube-api-access-7qncj" (OuterVolumeSpecName: "kube-api-access-7qncj") pod "a614093c-3c22-4f0d-916b-9719904fc295" (UID: "a614093c-3c22-4f0d-916b-9719904fc295"). InnerVolumeSpecName "kube-api-access-7qncj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.705645 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-kube-api-access-9mmjs" (OuterVolumeSpecName: "kube-api-access-9mmjs") pod "3ab80fcd-7771-41f9-a64c-08e0cacf63c5" (UID: "3ab80fcd-7771-41f9-a64c-08e0cacf63c5"). InnerVolumeSpecName "kube-api-access-9mmjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.706472 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7076b626-52f4-442b-9bc3-96d8a747ddaa-kube-api-access-kqvsq" (OuterVolumeSpecName: "kube-api-access-kqvsq") pod "7076b626-52f4-442b-9bc3-96d8a747ddaa" (UID: "7076b626-52f4-442b-9bc3-96d8a747ddaa"). InnerVolumeSpecName "kube-api-access-kqvsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.706535 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-kube-api-access-mfdl5" (OuterVolumeSpecName: "kube-api-access-mfdl5") pod "c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1" (UID: "c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1"). InnerVolumeSpecName "kube-api-access-mfdl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.706802 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-kube-api-access-xh965" (OuterVolumeSpecName: "kube-api-access-xh965") pod "d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0" (UID: "d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0"). InnerVolumeSpecName "kube-api-access-xh965". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.707046 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3130440-e339-4ca1-9fc6-d4feeac9ae93-kube-api-access-pws6m" (OuterVolumeSpecName: "kube-api-access-pws6m") pod "e3130440-e339-4ca1-9fc6-d4feeac9ae93" (UID: "e3130440-e339-4ca1-9fc6-d4feeac9ae93"). InnerVolumeSpecName "kube-api-access-pws6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.804951 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mmjs\" (UniqueName: \"kubernetes.io/projected/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-kube-api-access-9mmjs\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.805011 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qncj\" (UniqueName: \"kubernetes.io/projected/a614093c-3c22-4f0d-916b-9719904fc295-kube-api-access-7qncj\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.805023 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfdl5\" (UniqueName: \"kubernetes.io/projected/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1-kube-api-access-mfdl5\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.805035 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pws6m\" (UniqueName: \"kubernetes.io/projected/e3130440-e339-4ca1-9fc6-d4feeac9ae93-kube-api-access-pws6m\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.805047 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.805058 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ab80fcd-7771-41f9-a64c-08e0cacf63c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.805095 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a614093c-3c22-4f0d-916b-9719904fc295-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.805106 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh965\" (UniqueName: \"kubernetes.io/projected/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0-kube-api-access-xh965\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.805118 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqvsq\" (UniqueName: \"kubernetes.io/projected/7076b626-52f4-442b-9bc3-96d8a747ddaa-kube-api-access-kqvsq\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.873304 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-db-create-ntsgx" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.873288 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-db-create-ntsgx" event={"ID":"3ab80fcd-7771-41f9-a64c-08e0cacf63c5","Type":"ContainerDied","Data":"76447a67da4b5b1cc408286343a48ffc7f8573e6c5243e08df7af2857f4017a9"} Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.873432 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76447a67da4b5b1cc408286343a48ffc7f8573e6c5243e08df7af2857f4017a9" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.874859 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-db-create-qgb8g" event={"ID":"c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1","Type":"ContainerDied","Data":"bffe955c0e34eb08830be7a8c05cd08cf4a402a824c16a4f42428809ccb8a1ce"} Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.874886 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bffe955c0e34eb08830be7a8c05cd08cf4a402a824c16a4f42428809ccb8a1ce" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.874896 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-db-create-qgb8g" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.876398 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4ec2-account-create-update-f7xgv" event={"ID":"e3130440-e339-4ca1-9fc6-d4feeac9ae93","Type":"ContainerDied","Data":"59edc04392185784fdc85e6571bcd6f0cdefa2b38ed3de37769a77c53615dda6"} Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.876423 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59edc04392185784fdc85e6571bcd6f0cdefa2b38ed3de37769a77c53615dda6" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.876453 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4ec2-account-create-update-f7xgv" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.877993 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ec73-account-create-update-zq27k" event={"ID":"c2f0e73d-81cd-4359-bba3-8fe0147be087","Type":"ContainerDied","Data":"531b15791eec7a0c09eb4c395daf508d8fb9694a03d80fcc0d0bf835383f3f17"} Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.878016 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ec73-account-create-update-zq27k" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.878031 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="531b15791eec7a0c09eb4c395daf508d8fb9694a03d80fcc0d0bf835383f3f17" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.879256 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6cpvn" event={"ID":"7076b626-52f4-442b-9bc3-96d8a747ddaa","Type":"ContainerDied","Data":"d4404650580deee18baf7bf68ce3188a7059fc9530441e9b60901ae328c0294e"} Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.879270 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-6cpvn" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.879283 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4404650580deee18baf7bf68ce3188a7059fc9530441e9b60901ae328c0294e" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.881502 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-m87jz" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.881484 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-m87jz" event={"ID":"d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0","Type":"ContainerDied","Data":"edeac9aece85dfa185e141bea03aa4a1107af9b20d2b0ba538a40739a31cf26a"} Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.882049 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edeac9aece85dfa185e141bea03aa4a1107af9b20d2b0ba538a40739a31cf26a" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.883460 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7csgl" Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.883542 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7csgl" event={"ID":"a614093c-3c22-4f0d-916b-9719904fc295","Type":"ContainerDied","Data":"e3e393087364dca57c5f92405b997d950cf120687b1b49ef9d13ee9c1ffbce6e"} Dec 03 09:06:06 crc kubenswrapper[4947]: I1203 09:06:06.883577 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3e393087364dca57c5f92405b997d950cf120687b1b49ef9d13ee9c1ffbce6e" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.166290 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.314462 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-operator-scripts\") pod \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\" (UID: \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\") " Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.315477 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6xm2\" (UniqueName: \"kubernetes.io/projected/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-kube-api-access-x6xm2\") pod \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\" (UID: \"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6\") " Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.315795 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6" (UID: "f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.316356 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.320809 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-kube-api-access-x6xm2" (OuterVolumeSpecName: "kube-api-access-x6xm2") pod "f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6" (UID: "f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6"). InnerVolumeSpecName "kube-api-access-x6xm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.375700 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.384990 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.418583 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6xm2\" (UniqueName: \"kubernetes.io/projected/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6-kube-api-access-x6xm2\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.519750 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7f2d65-0a76-45f8-add3-9e040c04d500-operator-scripts\") pod \"0a7f2d65-0a76-45f8-add3-9e040c04d500\" (UID: \"0a7f2d65-0a76-45f8-add3-9e040c04d500\") " Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.520030 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm5rn\" (UniqueName: \"kubernetes.io/projected/0a7f2d65-0a76-45f8-add3-9e040c04d500-kube-api-access-dm5rn\") pod \"0a7f2d65-0a76-45f8-add3-9e040c04d500\" (UID: \"0a7f2d65-0a76-45f8-add3-9e040c04d500\") " Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.520088 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-operator-scripts\") pod \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\" (UID: \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\") " Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.520180 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-drprh\" (UniqueName: \"kubernetes.io/projected/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-kube-api-access-drprh\") pod \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\" (UID: \"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc\") " Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.520348 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a7f2d65-0a76-45f8-add3-9e040c04d500-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0a7f2d65-0a76-45f8-add3-9e040c04d500" (UID: "0a7f2d65-0a76-45f8-add3-9e040c04d500"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.520892 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0a7f2d65-0a76-45f8-add3-9e040c04d500-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.521039 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82669ed0-9ff7-4ebc-a09f-8e31ef7358bc" (UID: "82669ed0-9ff7-4ebc-a09f-8e31ef7358bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.523625 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-kube-api-access-drprh" (OuterVolumeSpecName: "kube-api-access-drprh") pod "82669ed0-9ff7-4ebc-a09f-8e31ef7358bc" (UID: "82669ed0-9ff7-4ebc-a09f-8e31ef7358bc"). InnerVolumeSpecName "kube-api-access-drprh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.524977 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7f2d65-0a76-45f8-add3-9e040c04d500-kube-api-access-dm5rn" (OuterVolumeSpecName: "kube-api-access-dm5rn") pod "0a7f2d65-0a76-45f8-add3-9e040c04d500" (UID: "0a7f2d65-0a76-45f8-add3-9e040c04d500"). InnerVolumeSpecName "kube-api-access-dm5rn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.622763 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm5rn\" (UniqueName: \"kubernetes.io/projected/0a7f2d65-0a76-45f8-add3-9e040c04d500-kube-api-access-dm5rn\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.622806 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.622818 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drprh\" (UniqueName: \"kubernetes.io/projected/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc-kube-api-access-drprh\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.894011 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" event={"ID":"82669ed0-9ff7-4ebc-a09f-8e31ef7358bc","Type":"ContainerDied","Data":"003e9b6e885452d692e6fa8b60c1ad9eb458229ca885e1464f039da59a01d8b3"} Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.894096 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003e9b6e885452d692e6fa8b60c1ad9eb458229ca885e1464f039da59a01d8b3" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.894029 4947 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3f1a-account-create-update-xt86d" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.896791 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" event={"ID":"0a7f2d65-0a76-45f8-add3-9e040c04d500","Type":"ContainerDied","Data":"0a4aa4edbadf263c0c1567d5a4e3b9b6a2529910f30bdbe78b40b22e0c49c496"} Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.896901 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a4aa4edbadf263c0c1567d5a4e3b9b6a2529910f30bdbe78b40b22e0c49c496" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.897034 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-87f1-account-create-update-lkw4s" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.902442 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-8ab9-account-create-update-222fx" event={"ID":"f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6","Type":"ContainerDied","Data":"0c5887a077387ee212a6951033a5c394d63bde829622e9ae8376029b35e94bf4"} Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.902482 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c5887a077387ee212a6951033a5c394d63bde829622e9ae8376029b35e94bf4" Dec 03 09:06:07 crc kubenswrapper[4947]: I1203 09:06:07.902518 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-8ab9-account-create-update-222fx" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.700678 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hx4fk"] Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701543 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701556 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701579 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ab80fcd-7771-41f9-a64c-08e0cacf63c5" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701586 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ab80fcd-7771-41f9-a64c-08e0cacf63c5" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701600 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3130440-e339-4ca1-9fc6-d4feeac9ae93" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701607 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3130440-e339-4ca1-9fc6-d4feeac9ae93" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701622 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701629 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701642 4947 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82669ed0-9ff7-4ebc-a09f-8e31ef7358bc" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701648 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="82669ed0-9ff7-4ebc-a09f-8e31ef7358bc" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701662 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701669 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701702 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7f2d65-0a76-45f8-add3-9e040c04d500" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701709 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7f2d65-0a76-45f8-add3-9e040c04d500" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701731 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a614093c-3c22-4f0d-916b-9719904fc295" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701755 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a614093c-3c22-4f0d-916b-9719904fc295" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701767 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7076b626-52f4-442b-9bc3-96d8a747ddaa" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701774 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7076b626-52f4-442b-9bc3-96d8a747ddaa" 
containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: E1203 09:06:12.701782 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2f0e73d-81cd-4359-bba3-8fe0147be087" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701788 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2f0e73d-81cd-4359-bba3-8fe0147be087" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701963 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="82669ed0-9ff7-4ebc-a09f-8e31ef7358bc" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701976 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2f0e73d-81cd-4359-bba3-8fe0147be087" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.701989 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a614093c-3c22-4f0d-916b-9719904fc295" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.702003 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.702015 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3130440-e339-4ca1-9fc6-d4feeac9ae93" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.702027 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7f2d65-0a76-45f8-add3-9e040c04d500" containerName="mariadb-account-create-update" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.702042 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1" containerName="mariadb-database-create" Dec 03 09:06:12 crc 
kubenswrapper[4947]: I1203 09:06:12.702053 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ab80fcd-7771-41f9-a64c-08e0cacf63c5" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.702067 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7076b626-52f4-442b-9bc3-96d8a747ddaa" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.702076 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0" containerName="mariadb-database-create" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.702688 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.710127 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.710696 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wxj85" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.715380 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.732234 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hx4fk"] Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.790606 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7pxlc"] Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.792915 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.800004 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pxlc"] Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.841769 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-www6f\" (UniqueName: \"kubernetes.io/projected/963b895d-9438-47d0-b4ca-3fd58d5e5bad-kube-api-access-www6f\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.841824 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-scripts\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.841843 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.841886 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-config-data\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.943936 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-config-data\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.944042 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wldp6\" (UniqueName: \"kubernetes.io/projected/104bf72c-b8e1-4daf-834e-42342cd02813-kube-api-access-wldp6\") pod \"redhat-operators-7pxlc\" (UID: \"104bf72c-b8e1-4daf-834e-42342cd02813\") " pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.944116 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/104bf72c-b8e1-4daf-834e-42342cd02813-catalog-content\") pod \"redhat-operators-7pxlc\" (UID: \"104bf72c-b8e1-4daf-834e-42342cd02813\") " pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.944216 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-www6f\" (UniqueName: \"kubernetes.io/projected/963b895d-9438-47d0-b4ca-3fd58d5e5bad-kube-api-access-www6f\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.944316 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/104bf72c-b8e1-4daf-834e-42342cd02813-utilities\") pod \"redhat-operators-7pxlc\" (UID: \"104bf72c-b8e1-4daf-834e-42342cd02813\") " pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:12 crc 
kubenswrapper[4947]: I1203 09:06:12.944364 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-scripts\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.944401 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.950747 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.952351 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-config-data\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.955178 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-scripts\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:12 crc kubenswrapper[4947]: I1203 09:06:12.960215 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-www6f\" (UniqueName: \"kubernetes.io/projected/963b895d-9438-47d0-b4ca-3fd58d5e5bad-kube-api-access-www6f\") pod \"nova-cell0-conductor-db-sync-hx4fk\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.039741 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.045836 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/104bf72c-b8e1-4daf-834e-42342cd02813-catalog-content\") pod \"redhat-operators-7pxlc\" (UID: \"104bf72c-b8e1-4daf-834e-42342cd02813\") " pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.046418 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/104bf72c-b8e1-4daf-834e-42342cd02813-catalog-content\") pod \"redhat-operators-7pxlc\" (UID: \"104bf72c-b8e1-4daf-834e-42342cd02813\") " pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.046626 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/104bf72c-b8e1-4daf-834e-42342cd02813-utilities\") pod \"redhat-operators-7pxlc\" (UID: \"104bf72c-b8e1-4daf-834e-42342cd02813\") " pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.046800 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wldp6\" (UniqueName: \"kubernetes.io/projected/104bf72c-b8e1-4daf-834e-42342cd02813-kube-api-access-wldp6\") pod \"redhat-operators-7pxlc\" (UID: 
\"104bf72c-b8e1-4daf-834e-42342cd02813\") " pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.047167 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/104bf72c-b8e1-4daf-834e-42342cd02813-utilities\") pod \"redhat-operators-7pxlc\" (UID: \"104bf72c-b8e1-4daf-834e-42342cd02813\") " pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.068087 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wldp6\" (UniqueName: \"kubernetes.io/projected/104bf72c-b8e1-4daf-834e-42342cd02813-kube-api-access-wldp6\") pod \"redhat-operators-7pxlc\" (UID: \"104bf72c-b8e1-4daf-834e-42342cd02813\") " pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.110115 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.555363 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hx4fk"] Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.627542 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pxlc"] Dec 03 09:06:13 crc kubenswrapper[4947]: W1203 09:06:13.629918 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod104bf72c_b8e1_4daf_834e_42342cd02813.slice/crio-7a79b452f26a35fc5bdfe41a2b3b8306494ded18f02deb181f078a03fef48aa0 WatchSource:0}: Error finding container 7a79b452f26a35fc5bdfe41a2b3b8306494ded18f02deb181f078a03fef48aa0: Status 404 returned error can't find the container with id 7a79b452f26a35fc5bdfe41a2b3b8306494ded18f02deb181f078a03fef48aa0 Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.966370 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" event={"ID":"963b895d-9438-47d0-b4ca-3fd58d5e5bad","Type":"ContainerStarted","Data":"e5b44bae0d22f6387c35ddd55f95c6517085bdad0e63eefa50493950f4ffe971"} Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.968562 4947 generic.go:334] "Generic (PLEG): container finished" podID="104bf72c-b8e1-4daf-834e-42342cd02813" containerID="486efaa5e7efa5d0d5a8fd486049a86f3009b3f7a69ef2f8e2942ee0a736e260" exitCode=0 Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.968618 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pxlc" event={"ID":"104bf72c-b8e1-4daf-834e-42342cd02813","Type":"ContainerDied","Data":"486efaa5e7efa5d0d5a8fd486049a86f3009b3f7a69ef2f8e2942ee0a736e260"} Dec 03 09:06:13 crc kubenswrapper[4947]: I1203 09:06:13.968636 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pxlc" event={"ID":"104bf72c-b8e1-4daf-834e-42342cd02813","Type":"ContainerStarted","Data":"7a79b452f26a35fc5bdfe41a2b3b8306494ded18f02deb181f078a03fef48aa0"} Dec 03 09:06:26 crc kubenswrapper[4947]: E1203 09:06:26.889206 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-nova-conductor:65066e8ca260a75886ae57f157049605" Dec 03 09:06:26 crc kubenswrapper[4947]: E1203 09:06:26.889801 4947 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-nova-conductor:65066e8ca260a75886ae57f157049605" Dec 03 09:06:26 crc kubenswrapper[4947]: E1203 09:06:26.890028 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:nova-cell0-conductor-db-sync,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-nova-conductor:65066e8ca260a75886ae57f157049605,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-www6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-hx4fk_openstack(963b895d-9438-47d0-b4ca-3fd58d5e5bad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 09:06:26 crc kubenswrapper[4947]: E1203 09:06:26.891691 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" podUID="963b895d-9438-47d0-b4ca-3fd58d5e5bad" Dec 03 09:06:27 crc kubenswrapper[4947]: E1203 09:06:27.092574 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-nova-conductor:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" podUID="963b895d-9438-47d0-b4ca-3fd58d5e5bad" Dec 03 09:06:28 crc kubenswrapper[4947]: I1203 09:06:28.106295 4947 generic.go:334] "Generic (PLEG): container finished" podID="104bf72c-b8e1-4daf-834e-42342cd02813" containerID="4b5c10dd777107704f9bd4be88a3284eb6310486c0a8578e45f7aff97f7b8445" exitCode=0 Dec 03 09:06:28 crc kubenswrapper[4947]: I1203 09:06:28.106362 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pxlc" event={"ID":"104bf72c-b8e1-4daf-834e-42342cd02813","Type":"ContainerDied","Data":"4b5c10dd777107704f9bd4be88a3284eb6310486c0a8578e45f7aff97f7b8445"} Dec 03 09:06:29 crc kubenswrapper[4947]: I1203 09:06:29.123003 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pxlc" event={"ID":"104bf72c-b8e1-4daf-834e-42342cd02813","Type":"ContainerStarted","Data":"3208d1cb303dd71355cf249df6a232efd5029c6b857e4056155146de3d098af1"} Dec 03 09:06:29 crc 
kubenswrapper[4947]: I1203 09:06:29.148970 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7pxlc" podStartSLOduration=2.500322578 podStartE2EDuration="17.14894859s" podCreationTimestamp="2025-12-03 09:06:12 +0000 UTC" firstStartedPulling="2025-12-03 09:06:13.970165098 +0000 UTC m=+8235.231119524" lastFinishedPulling="2025-12-03 09:06:28.61879111 +0000 UTC m=+8249.879745536" observedRunningTime="2025-12-03 09:06:29.146260978 +0000 UTC m=+8250.407215404" watchObservedRunningTime="2025-12-03 09:06:29.14894859 +0000 UTC m=+8250.409903016" Dec 03 09:06:30 crc kubenswrapper[4947]: I1203 09:06:30.086222 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:06:30 crc kubenswrapper[4947]: I1203 09:06:30.086688 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:06:30 crc kubenswrapper[4947]: I1203 09:06:30.086743 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 09:06:30 crc kubenswrapper[4947]: I1203 09:06:30.087295 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"90c58260a4ecaf4766384be2b94412a301c15894ed8a02e64bee5f8c77d3f403"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" 
Dec 03 09:06:30 crc kubenswrapper[4947]: I1203 09:06:30.087363 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://90c58260a4ecaf4766384be2b94412a301c15894ed8a02e64bee5f8c77d3f403" gracePeriod=600 Dec 03 09:06:31 crc kubenswrapper[4947]: I1203 09:06:31.144389 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="90c58260a4ecaf4766384be2b94412a301c15894ed8a02e64bee5f8c77d3f403" exitCode=0 Dec 03 09:06:31 crc kubenswrapper[4947]: I1203 09:06:31.144530 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"90c58260a4ecaf4766384be2b94412a301c15894ed8a02e64bee5f8c77d3f403"} Dec 03 09:06:31 crc kubenswrapper[4947]: I1203 09:06:31.144983 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca"} Dec 03 09:06:31 crc kubenswrapper[4947]: I1203 09:06:31.145006 4947 scope.go:117] "RemoveContainer" containerID="b4192eaf6f9476bca9c20c3cf83a45b0480e03ca78fdb951783becad389554ce" Dec 03 09:06:33 crc kubenswrapper[4947]: I1203 09:06:33.110862 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:33 crc kubenswrapper[4947]: I1203 09:06:33.111551 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:34 crc kubenswrapper[4947]: I1203 09:06:34.161074 4947 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-7pxlc" podUID="104bf72c-b8e1-4daf-834e-42342cd02813" containerName="registry-server" probeResult="failure" output=< Dec 03 09:06:34 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 09:06:34 crc kubenswrapper[4947]: > Dec 03 09:06:40 crc kubenswrapper[4947]: I1203 09:06:40.230584 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" event={"ID":"963b895d-9438-47d0-b4ca-3fd58d5e5bad","Type":"ContainerStarted","Data":"7e7d5f6bff7df8bcc0fc52da398887063f5c0c0f1d931604266d9fabe65a754b"} Dec 03 09:06:41 crc kubenswrapper[4947]: I1203 09:06:41.263118 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" podStartSLOduration=3.5504841430000003 podStartE2EDuration="29.26309803s" podCreationTimestamp="2025-12-03 09:06:12 +0000 UTC" firstStartedPulling="2025-12-03 09:06:13.55322039 +0000 UTC m=+8234.814174816" lastFinishedPulling="2025-12-03 09:06:39.265834267 +0000 UTC m=+8260.526788703" observedRunningTime="2025-12-03 09:06:41.252620837 +0000 UTC m=+8262.513575283" watchObservedRunningTime="2025-12-03 09:06:41.26309803 +0000 UTC m=+8262.524052456" Dec 03 09:06:43 crc kubenswrapper[4947]: I1203 09:06:43.185627 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:43 crc kubenswrapper[4947]: I1203 09:06:43.230735 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7pxlc" Dec 03 09:06:43 crc kubenswrapper[4947]: I1203 09:06:43.816155 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pxlc"] Dec 03 09:06:43 crc kubenswrapper[4947]: I1203 09:06:43.990265 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qwtfv"] Dec 03 09:06:43 crc 
kubenswrapper[4947]: I1203 09:06:43.990564 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qwtfv" podUID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerName="registry-server" containerID="cri-o://b191938a393f368c92afd0240563960a6a3b2f566fb1c20b3c791242636734cc" gracePeriod=2 Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.320434 4947 generic.go:334] "Generic (PLEG): container finished" podID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerID="b191938a393f368c92afd0240563960a6a3b2f566fb1c20b3c791242636734cc" exitCode=0 Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.320716 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwtfv" event={"ID":"7287497a-3ee8-48ea-b85f-4b1d8a9927cf","Type":"ContainerDied","Data":"b191938a393f368c92afd0240563960a6a3b2f566fb1c20b3c791242636734cc"} Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.455929 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.515259 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srnzs\" (UniqueName: \"kubernetes.io/projected/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-kube-api-access-srnzs\") pod \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.515764 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-utilities\") pod \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.515832 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-catalog-content\") pod \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\" (UID: \"7287497a-3ee8-48ea-b85f-4b1d8a9927cf\") " Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.516721 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-utilities" (OuterVolumeSpecName: "utilities") pod "7287497a-3ee8-48ea-b85f-4b1d8a9927cf" (UID: "7287497a-3ee8-48ea-b85f-4b1d8a9927cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.520738 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-kube-api-access-srnzs" (OuterVolumeSpecName: "kube-api-access-srnzs") pod "7287497a-3ee8-48ea-b85f-4b1d8a9927cf" (UID: "7287497a-3ee8-48ea-b85f-4b1d8a9927cf"). InnerVolumeSpecName "kube-api-access-srnzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.617993 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srnzs\" (UniqueName: \"kubernetes.io/projected/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-kube-api-access-srnzs\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.618026 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.627389 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7287497a-3ee8-48ea-b85f-4b1d8a9927cf" (UID: "7287497a-3ee8-48ea-b85f-4b1d8a9927cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:06:44 crc kubenswrapper[4947]: I1203 09:06:44.723769 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7287497a-3ee8-48ea-b85f-4b1d8a9927cf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:45 crc kubenswrapper[4947]: I1203 09:06:45.332356 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qwtfv" event={"ID":"7287497a-3ee8-48ea-b85f-4b1d8a9927cf","Type":"ContainerDied","Data":"f54f750d262b2e795aa81784c7f244515a91bfb43b41362017359200194cdcbd"} Dec 03 09:06:45 crc kubenswrapper[4947]: I1203 09:06:45.332420 4947 scope.go:117] "RemoveContainer" containerID="b191938a393f368c92afd0240563960a6a3b2f566fb1c20b3c791242636734cc" Dec 03 09:06:45 crc kubenswrapper[4947]: I1203 09:06:45.332605 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qwtfv" Dec 03 09:06:45 crc kubenswrapper[4947]: I1203 09:06:45.362036 4947 scope.go:117] "RemoveContainer" containerID="c6f9f5cff93bd88d3cb996ce2d7e99563ff5e9e9593d378b97760c81ac139696" Dec 03 09:06:45 crc kubenswrapper[4947]: I1203 09:06:45.363171 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qwtfv"] Dec 03 09:06:45 crc kubenswrapper[4947]: I1203 09:06:45.372131 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qwtfv"] Dec 03 09:06:45 crc kubenswrapper[4947]: I1203 09:06:45.396932 4947 scope.go:117] "RemoveContainer" containerID="84bda53ef880c281eee4012d871c3c6c370e995de333c0d88d4d863ee8e194e4" Dec 03 09:06:47 crc kubenswrapper[4947]: I1203 09:06:47.099128 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" path="/var/lib/kubelet/pods/7287497a-3ee8-48ea-b85f-4b1d8a9927cf/volumes" Dec 03 09:06:47 crc kubenswrapper[4947]: I1203 09:06:47.352831 4947 generic.go:334] "Generic (PLEG): container finished" podID="963b895d-9438-47d0-b4ca-3fd58d5e5bad" containerID="7e7d5f6bff7df8bcc0fc52da398887063f5c0c0f1d931604266d9fabe65a754b" exitCode=0 Dec 03 09:06:47 crc kubenswrapper[4947]: I1203 09:06:47.352876 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" event={"ID":"963b895d-9438-47d0-b4ca-3fd58d5e5bad","Type":"ContainerDied","Data":"7e7d5f6bff7df8bcc0fc52da398887063f5c0c0f1d931604266d9fabe65a754b"} Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.668057 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.801579 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-www6f\" (UniqueName: \"kubernetes.io/projected/963b895d-9438-47d0-b4ca-3fd58d5e5bad-kube-api-access-www6f\") pod \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.801650 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-combined-ca-bundle\") pod \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.801677 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-config-data\") pod \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.801719 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-scripts\") pod \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\" (UID: \"963b895d-9438-47d0-b4ca-3fd58d5e5bad\") " Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.806667 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-scripts" (OuterVolumeSpecName: "scripts") pod "963b895d-9438-47d0-b4ca-3fd58d5e5bad" (UID: "963b895d-9438-47d0-b4ca-3fd58d5e5bad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.807645 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/963b895d-9438-47d0-b4ca-3fd58d5e5bad-kube-api-access-www6f" (OuterVolumeSpecName: "kube-api-access-www6f") pod "963b895d-9438-47d0-b4ca-3fd58d5e5bad" (UID: "963b895d-9438-47d0-b4ca-3fd58d5e5bad"). InnerVolumeSpecName "kube-api-access-www6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.826698 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "963b895d-9438-47d0-b4ca-3fd58d5e5bad" (UID: "963b895d-9438-47d0-b4ca-3fd58d5e5bad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.829472 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-config-data" (OuterVolumeSpecName: "config-data") pod "963b895d-9438-47d0-b4ca-3fd58d5e5bad" (UID: "963b895d-9438-47d0-b4ca-3fd58d5e5bad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.903332 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-www6f\" (UniqueName: \"kubernetes.io/projected/963b895d-9438-47d0-b4ca-3fd58d5e5bad-kube-api-access-www6f\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.903367 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.903379 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:48 crc kubenswrapper[4947]: I1203 09:06:48.903391 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/963b895d-9438-47d0-b4ca-3fd58d5e5bad-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.375295 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" event={"ID":"963b895d-9438-47d0-b4ca-3fd58d5e5bad","Type":"ContainerDied","Data":"e5b44bae0d22f6387c35ddd55f95c6517085bdad0e63eefa50493950f4ffe971"} Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.376451 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5b44bae0d22f6387c35ddd55f95c6517085bdad0e63eefa50493950f4ffe971" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.375464 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hx4fk" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.477535 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 09:06:49 crc kubenswrapper[4947]: E1203 09:06:49.478279 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerName="registry-server" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.478382 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerName="registry-server" Dec 03 09:06:49 crc kubenswrapper[4947]: E1203 09:06:49.478470 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerName="extract-content" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.478555 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerName="extract-content" Dec 03 09:06:49 crc kubenswrapper[4947]: E1203 09:06:49.478643 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerName="extract-utilities" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.478715 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerName="extract-utilities" Dec 03 09:06:49 crc kubenswrapper[4947]: E1203 09:06:49.478807 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="963b895d-9438-47d0-b4ca-3fd58d5e5bad" containerName="nova-cell0-conductor-db-sync" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.478874 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="963b895d-9438-47d0-b4ca-3fd58d5e5bad" containerName="nova-cell0-conductor-db-sync" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.479136 4947 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7287497a-3ee8-48ea-b85f-4b1d8a9927cf" containerName="registry-server" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.479239 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="963b895d-9438-47d0-b4ca-3fd58d5e5bad" containerName="nova-cell0-conductor-db-sync" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.480201 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.482458 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-wxj85" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.483049 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.486612 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.618578 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm8vw\" (UniqueName: \"kubernetes.io/projected/10daa16c-8d05-423d-8e47-760b529a5925-kube-api-access-nm8vw\") pod \"nova-cell0-conductor-0\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.618646 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.618838 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.720809 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm8vw\" (UniqueName: \"kubernetes.io/projected/10daa16c-8d05-423d-8e47-760b529a5925-kube-api-access-nm8vw\") pod \"nova-cell0-conductor-0\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.720906 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.721039 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.731430 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.732672 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: 
\"10daa16c-8d05-423d-8e47-760b529a5925\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.744326 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm8vw\" (UniqueName: \"kubernetes.io/projected/10daa16c-8d05-423d-8e47-760b529a5925-kube-api-access-nm8vw\") pod \"nova-cell0-conductor-0\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:49 crc kubenswrapper[4947]: I1203 09:06:49.805356 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:50 crc kubenswrapper[4947]: I1203 09:06:50.243929 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 09:06:50 crc kubenswrapper[4947]: I1203 09:06:50.385689 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"10daa16c-8d05-423d-8e47-760b529a5925","Type":"ContainerStarted","Data":"6502592d951f50066be86639aea2c344bf75ff34bc87f4c9c33c6db3b5ad5704"} Dec 03 09:06:51 crc kubenswrapper[4947]: I1203 09:06:51.397852 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"10daa16c-8d05-423d-8e47-760b529a5925","Type":"ContainerStarted","Data":"8aaf46bab0cf2bb2435d235f20745f19140a0b89bcedd1807ca49fe7294a0b7d"} Dec 03 09:06:51 crc kubenswrapper[4947]: I1203 09:06:51.398274 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:59 crc kubenswrapper[4947]: I1203 09:06:59.837221 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 09:06:59 crc kubenswrapper[4947]: I1203 09:06:59.858063 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=10.858039054 
podStartE2EDuration="10.858039054s" podCreationTimestamp="2025-12-03 09:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:06:51.416621677 +0000 UTC m=+8272.677576123" watchObservedRunningTime="2025-12-03 09:06:59.858039054 +0000 UTC m=+8281.118993520" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.365985 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5zj7g"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.367941 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.377636 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.378166 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.383879 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5zj7g"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.427719 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-config-data\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.427793 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq2d4\" (UniqueName: \"kubernetes.io/projected/36ef7a0e-8716-49c8-8605-d4d09d31df62-kube-api-access-dq2d4\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " 
pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.427857 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.427965 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-scripts\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.527073 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.530210 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.530296 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-scripts\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.530362 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-config-data\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.530395 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq2d4\" (UniqueName: \"kubernetes.io/projected/36ef7a0e-8716-49c8-8605-d4d09d31df62-kube-api-access-dq2d4\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.531747 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.536314 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.547392 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.548764 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-config-data\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.558747 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-scripts\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: 
\"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.564068 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq2d4\" (UniqueName: \"kubernetes.io/projected/36ef7a0e-8716-49c8-8605-d4d09d31df62-kube-api-access-dq2d4\") pod \"nova-cell0-cell-mapping-5zj7g\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.570607 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.625566 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.627209 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.632905 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.634863 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn49p\" (UniqueName: \"kubernetes.io/projected/ac73988c-f87a-4d9e-a9b8-139d9f791090-kube-api-access-bn49p\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.634946 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-config-data\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.635054 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.635095 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac73988c-f87a-4d9e-a9b8-139d9f791090-logs\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.645068 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.694786 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.696044 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.702083 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.706372 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.715359 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.736633 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-logs\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.736711 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn49p\" (UniqueName: \"kubernetes.io/projected/ac73988c-f87a-4d9e-a9b8-139d9f791090-kube-api-access-bn49p\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.736756 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.736790 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-config-data\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.736877 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.736903 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.736929 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-config-data\") pod \"nova-scheduler-0\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.736955 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac73988c-f87a-4d9e-a9b8-139d9f791090-logs\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.737017 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvjv4\" (UniqueName: \"kubernetes.io/projected/06ce93d1-fa71-48ee-bf44-a0203c659dee-kube-api-access-bvjv4\") pod \"nova-scheduler-0\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.737067 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qbk4\" (UniqueName: \"kubernetes.io/projected/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-kube-api-access-2qbk4\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.737142 
4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-config-data\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.740887 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac73988c-f87a-4d9e-a9b8-139d9f791090-logs\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.755268 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.763517 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-config-data\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.771767 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c478dcc7c-ngls5"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.773331 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.798129 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn49p\" (UniqueName: \"kubernetes.io/projected/ac73988c-f87a-4d9e-a9b8-139d9f791090-kube-api-access-bn49p\") pod \"nova-api-0\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " pod="openstack/nova-api-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.798478 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c478dcc7c-ngls5"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.821732 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.823287 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.828918 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845216 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845264 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-config-data\") pod \"nova-scheduler-0\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845298 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bvjv4\" (UniqueName: \"kubernetes.io/projected/06ce93d1-fa71-48ee-bf44-a0203c659dee-kube-api-access-bvjv4\") pod \"nova-scheduler-0\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845327 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmwz8\" (UniqueName: \"kubernetes.io/projected/c5aad179-8099-4458-b053-600b58e2b759-kube-api-access-cmwz8\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845357 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qbk4\" (UniqueName: \"kubernetes.io/projected/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-kube-api-access-2qbk4\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845388 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-nb\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845409 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-config\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845438 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-sb\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845467 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-config-data\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845485 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-logs\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845543 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.845565 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-dns-svc\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.849285 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-logs\") pod \"nova-metadata-0\" (UID: 
\"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.854656 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.858251 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.865666 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-config-data\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.881340 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.884802 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-config-data\") pod \"nova-scheduler-0\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.893836 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell2-novncproxy-0"] Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.894840 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qbk4\" (UniqueName: 
\"kubernetes.io/projected/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-kube-api-access-2qbk4\") pod \"nova-metadata-0\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " pod="openstack/nova-metadata-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.896743 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvjv4\" (UniqueName: \"kubernetes.io/projected/06ce93d1-fa71-48ee-bf44-a0203c659dee-kube-api-access-bvjv4\") pod \"nova-scheduler-0\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.902551 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.907942 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-novncproxy-config-data" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.954403 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-nb\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.954468 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4948e0fe-dd3d-451b-9f21-92a047aef4a1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4948e0fe-dd3d-451b-9f21-92a047aef4a1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.954585 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-config\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: 
\"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.954639 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-sb\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.954718 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada3a26-6dc9-44dc-9662-e6e2111cbf72-combined-ca-bundle\") pod \"nova-cell2-novncproxy-0\" (UID: \"fada3a26-6dc9-44dc-9662-e6e2111cbf72\") " pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.954785 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4ncv\" (UniqueName: \"kubernetes.io/projected/4948e0fe-dd3d-451b-9f21-92a047aef4a1-kube-api-access-v4ncv\") pod \"nova-cell1-novncproxy-0\" (UID: \"4948e0fe-dd3d-451b-9f21-92a047aef4a1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.954873 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-dns-svc\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.954925 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fada3a26-6dc9-44dc-9662-e6e2111cbf72-config-data\") pod \"nova-cell2-novncproxy-0\" (UID: 
\"fada3a26-6dc9-44dc-9662-e6e2111cbf72\") " pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.954966 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4948e0fe-dd3d-451b-9f21-92a047aef4a1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4948e0fe-dd3d-451b-9f21-92a047aef4a1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.956085 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-nb\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.956181 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmwz8\" (UniqueName: \"kubernetes.io/projected/c5aad179-8099-4458-b053-600b58e2b759-kube-api-access-cmwz8\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.956280 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-878gt\" (UniqueName: \"kubernetes.io/projected/fada3a26-6dc9-44dc-9662-e6e2111cbf72-kube-api-access-878gt\") pod \"nova-cell2-novncproxy-0\" (UID: \"fada3a26-6dc9-44dc-9662-e6e2111cbf72\") " pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.971245 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-config\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: 
\"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.972535 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-dns-svc\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.973131 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-sb\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:00 crc kubenswrapper[4947]: I1203 09:07:00.980539 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmwz8\" (UniqueName: \"kubernetes.io/projected/c5aad179-8099-4458-b053-600b58e2b759-kube-api-access-cmwz8\") pod \"dnsmasq-dns-6c478dcc7c-ngls5\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.005327 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.005885 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-novncproxy-0"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.017322 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.025568 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.039386 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell3-novncproxy-0"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.040962 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.044767 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell3-novncproxy-config-data" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.057186 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-novncproxy-0"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.058526 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fada3a26-6dc9-44dc-9662-e6e2111cbf72-config-data\") pod \"nova-cell2-novncproxy-0\" (UID: \"fada3a26-6dc9-44dc-9662-e6e2111cbf72\") " pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.058653 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4948e0fe-dd3d-451b-9f21-92a047aef4a1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4948e0fe-dd3d-451b-9f21-92a047aef4a1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.058805 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-878gt\" (UniqueName: \"kubernetes.io/projected/fada3a26-6dc9-44dc-9662-e6e2111cbf72-kube-api-access-878gt\") pod \"nova-cell2-novncproxy-0\" (UID: \"fada3a26-6dc9-44dc-9662-e6e2111cbf72\") " pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.058892 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4948e0fe-dd3d-451b-9f21-92a047aef4a1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4948e0fe-dd3d-451b-9f21-92a047aef4a1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.058993 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada3a26-6dc9-44dc-9662-e6e2111cbf72-combined-ca-bundle\") pod \"nova-cell2-novncproxy-0\" (UID: \"fada3a26-6dc9-44dc-9662-e6e2111cbf72\") " pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.059083 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4ncv\" (UniqueName: \"kubernetes.io/projected/4948e0fe-dd3d-451b-9f21-92a047aef4a1-kube-api-access-v4ncv\") pod \"nova-cell1-novncproxy-0\" (UID: \"4948e0fe-dd3d-451b-9f21-92a047aef4a1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.065838 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4948e0fe-dd3d-451b-9f21-92a047aef4a1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4948e0fe-dd3d-451b-9f21-92a047aef4a1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.067161 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4948e0fe-dd3d-451b-9f21-92a047aef4a1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4948e0fe-dd3d-451b-9f21-92a047aef4a1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.067687 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fada3a26-6dc9-44dc-9662-e6e2111cbf72-config-data\") pod \"nova-cell2-novncproxy-0\" (UID: \"fada3a26-6dc9-44dc-9662-e6e2111cbf72\") " pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.078789 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-878gt\" (UniqueName: \"kubernetes.io/projected/fada3a26-6dc9-44dc-9662-e6e2111cbf72-kube-api-access-878gt\") pod \"nova-cell2-novncproxy-0\" (UID: \"fada3a26-6dc9-44dc-9662-e6e2111cbf72\") " pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.079481 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4ncv\" (UniqueName: \"kubernetes.io/projected/4948e0fe-dd3d-451b-9f21-92a047aef4a1-kube-api-access-v4ncv\") pod \"nova-cell1-novncproxy-0\" (UID: \"4948e0fe-dd3d-451b-9f21-92a047aef4a1\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.086989 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fada3a26-6dc9-44dc-9662-e6e2111cbf72-combined-ca-bundle\") pod \"nova-cell2-novncproxy-0\" (UID: \"fada3a26-6dc9-44dc-9662-e6e2111cbf72\") " pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.161621 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7957cf89-0ff1-4dcf-841f-735eb853dd8c-combined-ca-bundle\") pod \"nova-cell3-novncproxy-0\" (UID: \"7957cf89-0ff1-4dcf-841f-735eb853dd8c\") " pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.162149 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7957cf89-0ff1-4dcf-841f-735eb853dd8c-config-data\") pod \"nova-cell3-novncproxy-0\" (UID: \"7957cf89-0ff1-4dcf-841f-735eb853dd8c\") " pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.162319 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l46xg\" (UniqueName: \"kubernetes.io/projected/7957cf89-0ff1-4dcf-841f-735eb853dd8c-kube-api-access-l46xg\") pod \"nova-cell3-novncproxy-0\" (UID: \"7957cf89-0ff1-4dcf-841f-735eb853dd8c\") " pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.207449 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.208358 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.220202 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.268927 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l46xg\" (UniqueName: \"kubernetes.io/projected/7957cf89-0ff1-4dcf-841f-735eb853dd8c-kube-api-access-l46xg\") pod \"nova-cell3-novncproxy-0\" (UID: \"7957cf89-0ff1-4dcf-841f-735eb853dd8c\") " pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.269004 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7957cf89-0ff1-4dcf-841f-735eb853dd8c-combined-ca-bundle\") pod \"nova-cell3-novncproxy-0\" (UID: \"7957cf89-0ff1-4dcf-841f-735eb853dd8c\") " pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.269327 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7957cf89-0ff1-4dcf-841f-735eb853dd8c-config-data\") pod \"nova-cell3-novncproxy-0\" (UID: \"7957cf89-0ff1-4dcf-841f-735eb853dd8c\") " pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.274141 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7957cf89-0ff1-4dcf-841f-735eb853dd8c-combined-ca-bundle\") pod \"nova-cell3-novncproxy-0\" (UID: \"7957cf89-0ff1-4dcf-841f-735eb853dd8c\") " pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.278271 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7957cf89-0ff1-4dcf-841f-735eb853dd8c-config-data\") pod \"nova-cell3-novncproxy-0\" (UID: \"7957cf89-0ff1-4dcf-841f-735eb853dd8c\") " pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 
09:07:01.286591 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l46xg\" (UniqueName: \"kubernetes.io/projected/7957cf89-0ff1-4dcf-841f-735eb853dd8c-kube-api-access-l46xg\") pod \"nova-cell3-novncproxy-0\" (UID: \"7957cf89-0ff1-4dcf-841f-735eb853dd8c\") " pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.359638 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.391499 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5zj7g"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.528846 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5zj7g" event={"ID":"36ef7a0e-8716-49c8-8605-d4d09d31df62","Type":"ContainerStarted","Data":"04829b28d9ae81be233bf257ca02736883ea9cebcaba5a6e465ab4787feb1f48"} Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.596636 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dp5gk"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.598283 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.602747 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.603008 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.609225 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.621689 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.641296 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dp5gk"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.679613 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell2-conductor-db-sync-p99zm"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.693914 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-conductor-db-sync-p99zm"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.694019 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.697873 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-conductor-scripts" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.697914 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-scripts\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.697996 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/fb51151c-0951-48f3-a2cf-4f71d43b811e-kube-api-access-h9zdp\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.698049 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-config-data\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.698148 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.698521 4947 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-conductor-config-data" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.703443 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell3-conductor-db-sync-bn2f5"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.705063 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.707576 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell3-conductor-config-data" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.707786 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell3-conductor-scripts" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.713623 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-conductor-db-sync-bn2f5"] Dec 03 09:07:01 crc kubenswrapper[4947]: W1203 09:07:01.770461 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06ce93d1_fa71_48ee_bf44_a0203c659dee.slice/crio-a64382d9ccad42fd41240d65f7886367ce569047af764585060d52ffc6b1bc72 WatchSource:0}: Error finding container a64382d9ccad42fd41240d65f7886367ce569047af764585060d52ffc6b1bc72: Status 404 returned error can't find the container with id a64382d9ccad42fd41240d65f7886367ce569047af764585060d52ffc6b1bc72 Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.772156 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802471 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tdj7\" (UniqueName: \"kubernetes.io/projected/b1442d5f-950b-4776-933d-0c75857b7249-kube-api-access-5tdj7\") pod 
\"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802545 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-config-data\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802665 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-scripts\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802750 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-combined-ca-bundle\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802787 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802814 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-config-data\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802857 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58kgf\" (UniqueName: \"kubernetes.io/projected/40791046-82ba-4013-932c-2ef81bb1c309-kube-api-access-58kgf\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802919 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-config-data\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802935 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-combined-ca-bundle\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.802998 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-scripts\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.803048 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-scripts\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.803077 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/fb51151c-0951-48f3-a2cf-4f71d43b811e-kube-api-access-h9zdp\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.810318 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-config-data\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.811358 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.811752 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-scripts\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.827710 4947 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/fb51151c-0951-48f3-a2cf-4f71d43b811e-kube-api-access-h9zdp\") pod \"nova-cell1-conductor-db-sync-dp5gk\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.904940 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-combined-ca-bundle\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.905011 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-config-data\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.905046 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58kgf\" (UniqueName: \"kubernetes.io/projected/40791046-82ba-4013-932c-2ef81bb1c309-kube-api-access-58kgf\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.905090 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-config-data\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.905107 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-combined-ca-bundle\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.905165 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-scripts\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.908734 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tdj7\" (UniqueName: \"kubernetes.io/projected/b1442d5f-950b-4776-933d-0c75857b7249-kube-api-access-5tdj7\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.909004 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-scripts\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.909022 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-combined-ca-bundle\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.911533 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-scripts\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.914454 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-combined-ca-bundle\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.915077 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-config-data\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.916011 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-scripts\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.916316 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-config-data\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.922384 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58kgf\" (UniqueName: 
\"kubernetes.io/projected/40791046-82ba-4013-932c-2ef81bb1c309-kube-api-access-58kgf\") pod \"nova-cell2-conductor-db-sync-p99zm\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.923333 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tdj7\" (UniqueName: \"kubernetes.io/projected/b1442d5f-950b-4776-933d-0c75857b7249-kube-api-access-5tdj7\") pod \"nova-cell3-conductor-db-sync-bn2f5\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.986565 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 09:07:01 crc kubenswrapper[4947]: I1203 09:07:01.995005 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c478dcc7c-ngls5"] Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.010637 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-novncproxy-0"] Dec 03 09:07:02 crc kubenswrapper[4947]: W1203 09:07:02.016545 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfada3a26_6dc9_44dc_9662_e6e2111cbf72.slice/crio-512349ccf52ca9fe2c4070bf6e31ed3197916a283dfd9d2b14254b7fae2970cf WatchSource:0}: Error finding container 512349ccf52ca9fe2c4070bf6e31ed3197916a283dfd9d2b14254b7fae2970cf: Status 404 returned error can't find the container with id 512349ccf52ca9fe2c4070bf6e31ed3197916a283dfd9d2b14254b7fae2970cf Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.039369 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.055341 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.189877 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-novncproxy-0"] Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.201063 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:02 crc kubenswrapper[4947]: W1203 09:07:02.221335 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7957cf89_0ff1_4dcf_841f_735eb853dd8c.slice/crio-0fbbb5bcca812c776d90a97a62a2d6caa4661f0386871cca964f48e6a8b2ecbd WatchSource:0}: Error finding container 0fbbb5bcca812c776d90a97a62a2d6caa4661f0386871cca964f48e6a8b2ecbd: Status 404 returned error can't find the container with id 0fbbb5bcca812c776d90a97a62a2d6caa4661f0386871cca964f48e6a8b2ecbd Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.561840 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178","Type":"ContainerStarted","Data":"dda0a17f4edb9b6dbe752bec33ec9dbd26007be15b57f4e6b3a878fb06b759b2"} Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.571162 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5zj7g" event={"ID":"36ef7a0e-8716-49c8-8605-d4d09d31df62","Type":"ContainerStarted","Data":"c6f74581fa7df7cf0f8ffa53fcc435452df09d0f336ecae60a568f0afe891716"} Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.578379 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4948e0fe-dd3d-451b-9f21-92a047aef4a1","Type":"ContainerStarted","Data":"ea4804965d6e1e5fed0ac6295b7b80dc95e43d84091fd1d54f065d4b61345e19"} Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.589527 4947 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5zj7g" podStartSLOduration=2.589485326 podStartE2EDuration="2.589485326s" podCreationTimestamp="2025-12-03 09:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:02.589321082 +0000 UTC m=+8283.850275508" watchObservedRunningTime="2025-12-03 09:07:02.589485326 +0000 UTC m=+8283.850439752" Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.596183 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06ce93d1-fa71-48ee-bf44-a0203c659dee","Type":"ContainerStarted","Data":"a64382d9ccad42fd41240d65f7886367ce569047af764585060d52ffc6b1bc72"} Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.600540 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac73988c-f87a-4d9e-a9b8-139d9f791090","Type":"ContainerStarted","Data":"d674cd3cf4f468f135d5addf703ab04ecc2440ff1265d0e0829568683cded8d1"} Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.602400 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-novncproxy-0" event={"ID":"7957cf89-0ff1-4dcf-841f-735eb853dd8c","Type":"ContainerStarted","Data":"0fbbb5bcca812c776d90a97a62a2d6caa4661f0386871cca964f48e6a8b2ecbd"} Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.605805 4947 generic.go:334] "Generic (PLEG): container finished" podID="c5aad179-8099-4458-b053-600b58e2b759" containerID="4bbe0bfeb543de59ff49e77e9842eb2c08f80707a29d1356d8a2a3754cc883e2" exitCode=0 Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.605866 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" event={"ID":"c5aad179-8099-4458-b053-600b58e2b759","Type":"ContainerDied","Data":"4bbe0bfeb543de59ff49e77e9842eb2c08f80707a29d1356d8a2a3754cc883e2"} Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 
09:07:02.605890 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" event={"ID":"c5aad179-8099-4458-b053-600b58e2b759","Type":"ContainerStarted","Data":"48adc91a8c0e355d23627be44112e5e35da4426bf134bc07e59c66a61959d058"} Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.609681 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-novncproxy-0" event={"ID":"fada3a26-6dc9-44dc-9662-e6e2111cbf72","Type":"ContainerStarted","Data":"512349ccf52ca9fe2c4070bf6e31ed3197916a283dfd9d2b14254b7fae2970cf"} Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.618323 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dp5gk"] Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.743068 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-conductor-db-sync-p99zm"] Dec 03 09:07:02 crc kubenswrapper[4947]: I1203 09:07:02.838917 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-conductor-db-sync-bn2f5"] Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.639043 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-db-sync-bn2f5" event={"ID":"b1442d5f-950b-4776-933d-0c75857b7249","Type":"ContainerStarted","Data":"3e7969ecf221536360979a4f956b25b4a56ae7c58eb131fa2748a2210d94fa97"} Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.639480 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-db-sync-bn2f5" event={"ID":"b1442d5f-950b-4776-933d-0c75857b7249","Type":"ContainerStarted","Data":"b30efe9932cae56890f0684c96154c925c5cca68f6a7d9bf52f1a64008f2d9e9"} Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.645041 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dp5gk" 
event={"ID":"fb51151c-0951-48f3-a2cf-4f71d43b811e","Type":"ContainerStarted","Data":"93191f250b936d927d19cf8148da57f63627aa990a327cd79f5e2f2a82b60a09"} Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.645083 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dp5gk" event={"ID":"fb51151c-0951-48f3-a2cf-4f71d43b811e","Type":"ContainerStarted","Data":"8a77ef0b26e2b7b99374e75b3484728c2e030ef6c751d79e22d7d06f975d2143"} Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.647745 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" event={"ID":"c5aad179-8099-4458-b053-600b58e2b759","Type":"ContainerStarted","Data":"16a76d14c570ea94fd6772ca2cd0bf8e966411d98e6f4627e6a1d8990c5f5ac4"} Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.647815 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.653017 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-db-sync-p99zm" event={"ID":"40791046-82ba-4013-932c-2ef81bb1c309","Type":"ContainerStarted","Data":"4f06a320344f861a246db6b3ff759d3ec9a17028d3427b6b48b0015822d477aa"} Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.653056 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-db-sync-p99zm" event={"ID":"40791046-82ba-4013-932c-2ef81bb1c309","Type":"ContainerStarted","Data":"a2c031db7f357311b5a741ab3c49b2702837d9bbfb0315fb83bcc71a8b67cdf5"} Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.671086 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell3-conductor-db-sync-bn2f5" podStartSLOduration=2.671066141 podStartE2EDuration="2.671066141s" podCreationTimestamp="2025-12-03 09:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:03.660997818 +0000 UTC m=+8284.921952244" watchObservedRunningTime="2025-12-03 09:07:03.671066141 +0000 UTC m=+8284.932020567" Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.681699 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell2-conductor-db-sync-p99zm" podStartSLOduration=2.681682628 podStartE2EDuration="2.681682628s" podCreationTimestamp="2025-12-03 09:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:03.673460626 +0000 UTC m=+8284.934415052" watchObservedRunningTime="2025-12-03 09:07:03.681682628 +0000 UTC m=+8284.942637054" Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.693092 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dp5gk" podStartSLOduration=2.693072686 podStartE2EDuration="2.693072686s" podCreationTimestamp="2025-12-03 09:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:03.688536894 +0000 UTC m=+8284.949491320" watchObservedRunningTime="2025-12-03 09:07:03.693072686 +0000 UTC m=+8284.954027112" Dec 03 09:07:03 crc kubenswrapper[4947]: I1203 09:07:03.710945 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" podStartSLOduration=3.710924269 podStartE2EDuration="3.710924269s" podCreationTimestamp="2025-12-03 09:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:03.705592635 +0000 UTC m=+8284.966547071" watchObservedRunningTime="2025-12-03 09:07:03.710924269 +0000 UTC m=+8284.971878695" Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.678378 4947 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4948e0fe-dd3d-451b-9f21-92a047aef4a1","Type":"ContainerStarted","Data":"9461b7396435a3a5dc9c89b78e9fd5c6a5a91493117afb66fdaa287c47b992e4"} Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.682174 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06ce93d1-fa71-48ee-bf44-a0203c659dee","Type":"ContainerStarted","Data":"29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608"} Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.684056 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac73988c-f87a-4d9e-a9b8-139d9f791090","Type":"ContainerStarted","Data":"ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2"} Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.685511 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-novncproxy-0" event={"ID":"7957cf89-0ff1-4dcf-841f-735eb853dd8c","Type":"ContainerStarted","Data":"31386ec73e2925873e149fb4e00c07213b6f18c3f1f366f757510e3e4dcb5322"} Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.687426 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-novncproxy-0" event={"ID":"fada3a26-6dc9-44dc-9662-e6e2111cbf72","Type":"ContainerStarted","Data":"3fc4dfb54dd0bb4cd7475341cd00b624a6a4d1d37031c78e48e89cb6c1f7499c"} Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.693254 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178","Type":"ContainerStarted","Data":"0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc"} Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.698880 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.69467904 podStartE2EDuration="5.698865079s" 
podCreationTimestamp="2025-12-03 09:07:00 +0000 UTC" firstStartedPulling="2025-12-03 09:07:01.999396694 +0000 UTC m=+8283.260351120" lastFinishedPulling="2025-12-03 09:07:05.003582733 +0000 UTC m=+8286.264537159" observedRunningTime="2025-12-03 09:07:05.698851709 +0000 UTC m=+8286.959806135" watchObservedRunningTime="2025-12-03 09:07:05.698865079 +0000 UTC m=+8286.959819505" Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.723368 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell2-novncproxy-0" podStartSLOduration=2.766863624 podStartE2EDuration="5.723353102s" podCreationTimestamp="2025-12-03 09:07:00 +0000 UTC" firstStartedPulling="2025-12-03 09:07:02.019765526 +0000 UTC m=+8283.280719952" lastFinishedPulling="2025-12-03 09:07:04.976255004 +0000 UTC m=+8286.237209430" observedRunningTime="2025-12-03 09:07:05.71812912 +0000 UTC m=+8286.979083546" watchObservedRunningTime="2025-12-03 09:07:05.723353102 +0000 UTC m=+8286.984307528" Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.757893 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell3-novncproxy-0" podStartSLOduration=3.008542911 podStartE2EDuration="5.757875606s" podCreationTimestamp="2025-12-03 09:07:00 +0000 UTC" firstStartedPulling="2025-12-03 09:07:02.249649984 +0000 UTC m=+8283.510604410" lastFinishedPulling="2025-12-03 09:07:04.998982679 +0000 UTC m=+8286.259937105" observedRunningTime="2025-12-03 09:07:05.755029899 +0000 UTC m=+8287.015984325" watchObservedRunningTime="2025-12-03 09:07:05.757875606 +0000 UTC m=+8287.018830032" Dec 03 09:07:05 crc kubenswrapper[4947]: I1203 09:07:05.779820 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.593559205 podStartE2EDuration="5.779797738s" podCreationTimestamp="2025-12-03 09:07:00 +0000 UTC" firstStartedPulling="2025-12-03 09:07:01.789788355 +0000 UTC m=+8283.050742781" 
lastFinishedPulling="2025-12-03 09:07:04.976026888 +0000 UTC m=+8286.236981314" observedRunningTime="2025-12-03 09:07:05.770542698 +0000 UTC m=+8287.031497144" watchObservedRunningTime="2025-12-03 09:07:05.779797738 +0000 UTC m=+8287.040752164" Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.026833 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.209084 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.221534 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.360356 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.704037 4947 generic.go:334] "Generic (PLEG): container finished" podID="b1442d5f-950b-4776-933d-0c75857b7249" containerID="3e7969ecf221536360979a4f956b25b4a56ae7c58eb131fa2748a2210d94fa97" exitCode=0 Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.704134 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-db-sync-bn2f5" event={"ID":"b1442d5f-950b-4776-933d-0c75857b7249","Type":"ContainerDied","Data":"3e7969ecf221536360979a4f956b25b4a56ae7c58eb131fa2748a2210d94fa97"} Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.708643 4947 generic.go:334] "Generic (PLEG): container finished" podID="fb51151c-0951-48f3-a2cf-4f71d43b811e" containerID="93191f250b936d927d19cf8148da57f63627aa990a327cd79f5e2f2a82b60a09" exitCode=0 Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.708721 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dp5gk" 
event={"ID":"fb51151c-0951-48f3-a2cf-4f71d43b811e","Type":"ContainerDied","Data":"93191f250b936d927d19cf8148da57f63627aa990a327cd79f5e2f2a82b60a09"} Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.711939 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac73988c-f87a-4d9e-a9b8-139d9f791090","Type":"ContainerStarted","Data":"a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9"} Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.729074 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178","Type":"ContainerStarted","Data":"27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0"} Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.739859 4947 generic.go:334] "Generic (PLEG): container finished" podID="40791046-82ba-4013-932c-2ef81bb1c309" containerID="4f06a320344f861a246db6b3ff759d3ec9a17028d3427b6b48b0015822d477aa" exitCode=0 Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.740734 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-db-sync-p99zm" event={"ID":"40791046-82ba-4013-932c-2ef81bb1c309","Type":"ContainerDied","Data":"4f06a320344f861a246db6b3ff759d3ec9a17028d3427b6b48b0015822d477aa"} Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.760362 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.424119362 podStartE2EDuration="6.760339111s" podCreationTimestamp="2025-12-03 09:07:00 +0000 UTC" firstStartedPulling="2025-12-03 09:07:01.664041764 +0000 UTC m=+8282.924996190" lastFinishedPulling="2025-12-03 09:07:05.000261503 +0000 UTC m=+8286.261215939" observedRunningTime="2025-12-03 09:07:06.75109676 +0000 UTC m=+8288.012051186" watchObservedRunningTime="2025-12-03 09:07:06.760339111 +0000 UTC m=+8288.021293537" Dec 03 09:07:06 crc kubenswrapper[4947]: I1203 09:07:06.784717 4947 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.393434681 podStartE2EDuration="6.784699219s" podCreationTimestamp="2025-12-03 09:07:00 +0000 UTC" firstStartedPulling="2025-12-03 09:07:01.612348646 +0000 UTC m=+8282.873303072" lastFinishedPulling="2025-12-03 09:07:05.003613184 +0000 UTC m=+8286.264567610" observedRunningTime="2025-12-03 09:07:06.773214349 +0000 UTC m=+8288.034168785" watchObservedRunningTime="2025-12-03 09:07:06.784699219 +0000 UTC m=+8288.045653645" Dec 03 09:07:07 crc kubenswrapper[4947]: I1203 09:07:07.749931 4947 generic.go:334] "Generic (PLEG): container finished" podID="36ef7a0e-8716-49c8-8605-d4d09d31df62" containerID="c6f74581fa7df7cf0f8ffa53fcc435452df09d0f336ecae60a568f0afe891716" exitCode=0 Dec 03 09:07:07 crc kubenswrapper[4947]: I1203 09:07:07.750037 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5zj7g" event={"ID":"36ef7a0e-8716-49c8-8605-d4d09d31df62","Type":"ContainerDied","Data":"c6f74581fa7df7cf0f8ffa53fcc435452df09d0f336ecae60a568f0afe891716"} Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.274048 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.286290 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.298101 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371284 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-scripts\") pod \"fb51151c-0951-48f3-a2cf-4f71d43b811e\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371477 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/fb51151c-0951-48f3-a2cf-4f71d43b811e-kube-api-access-h9zdp\") pod \"fb51151c-0951-48f3-a2cf-4f71d43b811e\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371520 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-combined-ca-bundle\") pod \"b1442d5f-950b-4776-933d-0c75857b7249\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371608 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-combined-ca-bundle\") pod \"fb51151c-0951-48f3-a2cf-4f71d43b811e\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371652 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-config-data\") pod \"b1442d5f-950b-4776-933d-0c75857b7249\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371680 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-config-data\") pod \"fb51151c-0951-48f3-a2cf-4f71d43b811e\" (UID: \"fb51151c-0951-48f3-a2cf-4f71d43b811e\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371710 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-combined-ca-bundle\") pod \"40791046-82ba-4013-932c-2ef81bb1c309\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371737 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tdj7\" (UniqueName: \"kubernetes.io/projected/b1442d5f-950b-4776-933d-0c75857b7249-kube-api-access-5tdj7\") pod \"b1442d5f-950b-4776-933d-0c75857b7249\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371796 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-config-data\") pod \"40791046-82ba-4013-932c-2ef81bb1c309\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371831 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-scripts\") pod \"40791046-82ba-4013-932c-2ef81bb1c309\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.371848 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58kgf\" (UniqueName: \"kubernetes.io/projected/40791046-82ba-4013-932c-2ef81bb1c309-kube-api-access-58kgf\") pod \"40791046-82ba-4013-932c-2ef81bb1c309\" (UID: \"40791046-82ba-4013-932c-2ef81bb1c309\") " Dec 03 09:07:08 crc 
kubenswrapper[4947]: I1203 09:07:08.371866 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-scripts\") pod \"b1442d5f-950b-4776-933d-0c75857b7249\" (UID: \"b1442d5f-950b-4776-933d-0c75857b7249\") " Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.377221 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-scripts" (OuterVolumeSpecName: "scripts") pod "fb51151c-0951-48f3-a2cf-4f71d43b811e" (UID: "fb51151c-0951-48f3-a2cf-4f71d43b811e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.377554 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40791046-82ba-4013-932c-2ef81bb1c309-kube-api-access-58kgf" (OuterVolumeSpecName: "kube-api-access-58kgf") pod "40791046-82ba-4013-932c-2ef81bb1c309" (UID: "40791046-82ba-4013-932c-2ef81bb1c309"). InnerVolumeSpecName "kube-api-access-58kgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.389599 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-scripts" (OuterVolumeSpecName: "scripts") pod "b1442d5f-950b-4776-933d-0c75857b7249" (UID: "b1442d5f-950b-4776-933d-0c75857b7249"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.392699 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-scripts" (OuterVolumeSpecName: "scripts") pod "40791046-82ba-4013-932c-2ef81bb1c309" (UID: "40791046-82ba-4013-932c-2ef81bb1c309"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.392768 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb51151c-0951-48f3-a2cf-4f71d43b811e-kube-api-access-h9zdp" (OuterVolumeSpecName: "kube-api-access-h9zdp") pod "fb51151c-0951-48f3-a2cf-4f71d43b811e" (UID: "fb51151c-0951-48f3-a2cf-4f71d43b811e"). InnerVolumeSpecName "kube-api-access-h9zdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.392883 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1442d5f-950b-4776-933d-0c75857b7249-kube-api-access-5tdj7" (OuterVolumeSpecName: "kube-api-access-5tdj7") pod "b1442d5f-950b-4776-933d-0c75857b7249" (UID: "b1442d5f-950b-4776-933d-0c75857b7249"). InnerVolumeSpecName "kube-api-access-5tdj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.398608 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1442d5f-950b-4776-933d-0c75857b7249" (UID: "b1442d5f-950b-4776-933d-0c75857b7249"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.399175 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-config-data" (OuterVolumeSpecName: "config-data") pod "40791046-82ba-4013-932c-2ef81bb1c309" (UID: "40791046-82ba-4013-932c-2ef81bb1c309"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.400149 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb51151c-0951-48f3-a2cf-4f71d43b811e" (UID: "fb51151c-0951-48f3-a2cf-4f71d43b811e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.405083 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-config-data" (OuterVolumeSpecName: "config-data") pod "b1442d5f-950b-4776-933d-0c75857b7249" (UID: "b1442d5f-950b-4776-933d-0c75857b7249"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.407833 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-config-data" (OuterVolumeSpecName: "config-data") pod "fb51151c-0951-48f3-a2cf-4f71d43b811e" (UID: "fb51151c-0951-48f3-a2cf-4f71d43b811e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.409434 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40791046-82ba-4013-932c-2ef81bb1c309" (UID: "40791046-82ba-4013-932c-2ef81bb1c309"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474533 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474584 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474596 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474608 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tdj7\" (UniqueName: \"kubernetes.io/projected/b1442d5f-950b-4776-933d-0c75857b7249-kube-api-access-5tdj7\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474621 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474632 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474642 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58kgf\" (UniqueName: \"kubernetes.io/projected/40791046-82ba-4013-932c-2ef81bb1c309-kube-api-access-58kgf\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 
09:07:08.474652 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40791046-82ba-4013-932c-2ef81bb1c309-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474663 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474673 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb51151c-0951-48f3-a2cf-4f71d43b811e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474684 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9zdp\" (UniqueName: \"kubernetes.io/projected/fb51151c-0951-48f3-a2cf-4f71d43b811e-kube-api-access-h9zdp\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.474695 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1442d5f-950b-4776-933d-0c75857b7249-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.763476 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-db-sync-bn2f5" event={"ID":"b1442d5f-950b-4776-933d-0c75857b7249","Type":"ContainerDied","Data":"b30efe9932cae56890f0684c96154c925c5cca68f6a7d9bf52f1a64008f2d9e9"} Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.763528 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b30efe9932cae56890f0684c96154c925c5cca68f6a7d9bf52f1a64008f2d9e9" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.763608 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-conductor-db-sync-bn2f5" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.766088 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dp5gk" event={"ID":"fb51151c-0951-48f3-a2cf-4f71d43b811e","Type":"ContainerDied","Data":"8a77ef0b26e2b7b99374e75b3484728c2e030ef6c751d79e22d7d06f975d2143"} Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.766116 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a77ef0b26e2b7b99374e75b3484728c2e030ef6c751d79e22d7d06f975d2143" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.766139 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dp5gk" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.767710 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-db-sync-p99zm" event={"ID":"40791046-82ba-4013-932c-2ef81bb1c309","Type":"ContainerDied","Data":"a2c031db7f357311b5a741ab3c49b2702837d9bbfb0315fb83bcc71a8b67cdf5"} Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.767754 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-conductor-db-sync-p99zm" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.767757 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2c031db7f357311b5a741ab3c49b2702837d9bbfb0315fb83bcc71a8b67cdf5" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.810315 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell3-conductor-0"] Dec 03 09:07:08 crc kubenswrapper[4947]: E1203 09:07:08.810734 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb51151c-0951-48f3-a2cf-4f71d43b811e" containerName="nova-cell1-conductor-db-sync" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.810747 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb51151c-0951-48f3-a2cf-4f71d43b811e" containerName="nova-cell1-conductor-db-sync" Dec 03 09:07:08 crc kubenswrapper[4947]: E1203 09:07:08.810770 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1442d5f-950b-4776-933d-0c75857b7249" containerName="nova-cell3-conductor-db-sync" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.810776 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1442d5f-950b-4776-933d-0c75857b7249" containerName="nova-cell3-conductor-db-sync" Dec 03 09:07:08 crc kubenswrapper[4947]: E1203 09:07:08.810790 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40791046-82ba-4013-932c-2ef81bb1c309" containerName="nova-cell2-conductor-db-sync" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.810796 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="40791046-82ba-4013-932c-2ef81bb1c309" containerName="nova-cell2-conductor-db-sync" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.810975 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1442d5f-950b-4776-933d-0c75857b7249" containerName="nova-cell3-conductor-db-sync" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.810990 
4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb51151c-0951-48f3-a2cf-4f71d43b811e" containerName="nova-cell1-conductor-db-sync" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.811001 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="40791046-82ba-4013-932c-2ef81bb1c309" containerName="nova-cell2-conductor-db-sync" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.811670 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.816991 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell3-conductor-config-data" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.835422 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-conductor-0"] Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.882945 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-config-data\") pod \"nova-cell3-conductor-0\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.883056 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9gxh\" (UniqueName: \"kubernetes.io/projected/4f3cdf1c-630c-4944-ae58-c426e0d2161a-kube-api-access-w9gxh\") pod \"nova-cell3-conductor-0\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.883172 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-combined-ca-bundle\") pod \"nova-cell3-conductor-0\" (UID: 
\"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.892263 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.893629 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.895905 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.906766 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.945791 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell2-conductor-0"] Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.947566 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.952550 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-conductor-config-data" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.956211 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-conductor-0"] Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.985144 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-combined-ca-bundle\") pod \"nova-cell3-conductor-0\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.985193 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llgrr\" (UniqueName: \"kubernetes.io/projected/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-kube-api-access-llgrr\") pod \"nova-cell2-conductor-0\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.985261 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-config-data\") pod \"nova-cell2-conductor-0\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.985297 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-config-data\") pod \"nova-cell3-conductor-0\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 
09:07:08.985477 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-combined-ca-bundle\") pod \"nova-cell2-conductor-0\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.985531 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.985593 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9gxh\" (UniqueName: \"kubernetes.io/projected/4f3cdf1c-630c-4944-ae58-c426e0d2161a-kube-api-access-w9gxh\") pod \"nova-cell3-conductor-0\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.985842 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.985877 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vthv\" (UniqueName: \"kubernetes.io/projected/e57f73cf-4f9a-4c56-b146-af55d546e1b0-kube-api-access-9vthv\") pod \"nova-cell1-conductor-0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 
09:07:08.991418 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-config-data\") pod \"nova-cell3-conductor-0\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:08 crc kubenswrapper[4947]: I1203 09:07:08.992026 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-combined-ca-bundle\") pod \"nova-cell3-conductor-0\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.002221 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9gxh\" (UniqueName: \"kubernetes.io/projected/4f3cdf1c-630c-4944-ae58-c426e0d2161a-kube-api-access-w9gxh\") pod \"nova-cell3-conductor-0\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.087930 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llgrr\" (UniqueName: \"kubernetes.io/projected/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-kube-api-access-llgrr\") pod \"nova-cell2-conductor-0\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.087990 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-config-data\") pod \"nova-cell2-conductor-0\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.088054 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-combined-ca-bundle\") pod \"nova-cell2-conductor-0\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.088070 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.088122 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.088139 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vthv\" (UniqueName: \"kubernetes.io/projected/e57f73cf-4f9a-4c56-b146-af55d546e1b0-kube-api-access-9vthv\") pod \"nova-cell1-conductor-0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.093190 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.096256 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-config-data\") pod 
\"nova-cell2-conductor-0\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.102124 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.104164 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-combined-ca-bundle\") pod \"nova-cell2-conductor-0\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.105159 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llgrr\" (UniqueName: \"kubernetes.io/projected/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-kube-api-access-llgrr\") pod \"nova-cell2-conductor-0\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.105746 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vthv\" (UniqueName: \"kubernetes.io/projected/e57f73cf-4f9a-4c56-b146-af55d546e1b0-kube-api-access-9vthv\") pod \"nova-cell1-conductor-0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.157785 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.209081 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.264227 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.309133 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.397379 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-config-data\") pod \"36ef7a0e-8716-49c8-8605-d4d09d31df62\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.397516 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-combined-ca-bundle\") pod \"36ef7a0e-8716-49c8-8605-d4d09d31df62\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.397695 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-scripts\") pod \"36ef7a0e-8716-49c8-8605-d4d09d31df62\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.397720 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq2d4\" (UniqueName: \"kubernetes.io/projected/36ef7a0e-8716-49c8-8605-d4d09d31df62-kube-api-access-dq2d4\") pod \"36ef7a0e-8716-49c8-8605-d4d09d31df62\" (UID: \"36ef7a0e-8716-49c8-8605-d4d09d31df62\") " Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.401676 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-scripts" (OuterVolumeSpecName: "scripts") pod "36ef7a0e-8716-49c8-8605-d4d09d31df62" (UID: "36ef7a0e-8716-49c8-8605-d4d09d31df62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.405101 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36ef7a0e-8716-49c8-8605-d4d09d31df62-kube-api-access-dq2d4" (OuterVolumeSpecName: "kube-api-access-dq2d4") pod "36ef7a0e-8716-49c8-8605-d4d09d31df62" (UID: "36ef7a0e-8716-49c8-8605-d4d09d31df62"). InnerVolumeSpecName "kube-api-access-dq2d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.423111 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-config-data" (OuterVolumeSpecName: "config-data") pod "36ef7a0e-8716-49c8-8605-d4d09d31df62" (UID: "36ef7a0e-8716-49c8-8605-d4d09d31df62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.432205 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36ef7a0e-8716-49c8-8605-d4d09d31df62" (UID: "36ef7a0e-8716-49c8-8605-d4d09d31df62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.500202 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.500235 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.500245 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36ef7a0e-8716-49c8-8605-d4d09d31df62-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.500253 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq2d4\" (UniqueName: \"kubernetes.io/projected/36ef7a0e-8716-49c8-8605-d4d09d31df62-kube-api-access-dq2d4\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.606669 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-conductor-0"] Dec 03 09:07:09 crc kubenswrapper[4947]: W1203 09:07:09.714091 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode57f73cf_4f9a_4c56_b146_af55d546e1b0.slice/crio-3a402e77faaed21b896b61384dd12c5d4b20e714bd4e8be603286de7ce5e1212 WatchSource:0}: Error finding container 3a402e77faaed21b896b61384dd12c5d4b20e714bd4e8be603286de7ce5e1212: Status 404 returned error can't find the container with id 3a402e77faaed21b896b61384dd12c5d4b20e714bd4e8be603286de7ce5e1212 Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.718800 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 09:07:09 crc 
kubenswrapper[4947]: I1203 09:07:09.781553 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e57f73cf-4f9a-4c56-b146-af55d546e1b0","Type":"ContainerStarted","Data":"3a402e77faaed21b896b61384dd12c5d4b20e714bd4e8be603286de7ce5e1212"} Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.792187 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-0" event={"ID":"4f3cdf1c-630c-4944-ae58-c426e0d2161a","Type":"ContainerStarted","Data":"ef57f8bd5b76e4daaca6be28ec90ba64ff112e583418f7fa187d77f61de3b0ed"} Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.794002 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5zj7g" event={"ID":"36ef7a0e-8716-49c8-8605-d4d09d31df62","Type":"ContainerDied","Data":"04829b28d9ae81be233bf257ca02736883ea9cebcaba5a6e465ab4787feb1f48"} Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.794034 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04829b28d9ae81be233bf257ca02736883ea9cebcaba5a6e465ab4787feb1f48" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.794060 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5zj7g" Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.799331 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-conductor-0"] Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.961370 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.964813 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerName="nova-api-log" containerID="cri-o://ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2" gracePeriod=30 Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.964915 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerName="nova-api-api" containerID="cri-o://a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9" gracePeriod=30 Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.978418 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.979832 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="06ce93d1-fa71-48ee-bf44-a0203c659dee" containerName="nova-scheduler-scheduler" containerID="cri-o://29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608" gracePeriod=30 Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.995485 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.995727 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerName="nova-metadata-log" 
containerID="cri-o://0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc" gracePeriod=30 Dec 03 09:07:09 crc kubenswrapper[4947]: I1203 09:07:09.996216 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerName="nova-metadata-metadata" containerID="cri-o://27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0" gracePeriod=30 Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.778784 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.791008 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.827361 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-logs\") pod \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.827407 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac73988c-f87a-4d9e-a9b8-139d9f791090-logs\") pod \"ac73988c-f87a-4d9e-a9b8-139d9f791090\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.827515 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn49p\" (UniqueName: \"kubernetes.io/projected/ac73988c-f87a-4d9e-a9b8-139d9f791090-kube-api-access-bn49p\") pod \"ac73988c-f87a-4d9e-a9b8-139d9f791090\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.827542 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-combined-ca-bundle\") pod \"ac73988c-f87a-4d9e-a9b8-139d9f791090\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.827580 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qbk4\" (UniqueName: \"kubernetes.io/projected/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-kube-api-access-2qbk4\") pod \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.827626 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-config-data\") pod \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.827791 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-combined-ca-bundle\") pod \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\" (UID: \"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.827864 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-config-data\") pod \"ac73988c-f87a-4d9e-a9b8-139d9f791090\" (UID: \"ac73988c-f87a-4d9e-a9b8-139d9f791090\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.854941 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac73988c-f87a-4d9e-a9b8-139d9f791090-logs" (OuterVolumeSpecName: "logs") pod "ac73988c-f87a-4d9e-a9b8-139d9f791090" (UID: "ac73988c-f87a-4d9e-a9b8-139d9f791090"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.855170 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-logs" (OuterVolumeSpecName: "logs") pod "7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" (UID: "7d8c3f69-2a93-4e2b-92b1-5f467c4e8178"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.868038 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-kube-api-access-2qbk4" (OuterVolumeSpecName: "kube-api-access-2qbk4") pod "7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" (UID: "7d8c3f69-2a93-4e2b-92b1-5f467c4e8178"). InnerVolumeSpecName "kube-api-access-2qbk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.903089 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac73988c-f87a-4d9e-a9b8-139d9f791090-kube-api-access-bn49p" (OuterVolumeSpecName: "kube-api-access-bn49p") pod "ac73988c-f87a-4d9e-a9b8-139d9f791090" (UID: "ac73988c-f87a-4d9e-a9b8-139d9f791090"). InnerVolumeSpecName "kube-api-access-bn49p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.909603 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-config-data" (OuterVolumeSpecName: "config-data") pod "ac73988c-f87a-4d9e-a9b8-139d9f791090" (UID: "ac73988c-f87a-4d9e-a9b8-139d9f791090"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.912738 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e57f73cf-4f9a-4c56-b146-af55d546e1b0","Type":"ContainerStarted","Data":"2a9eb8d59aac80d11abeeb6f04e66b2aefa7cae7495c7ba476fbe71c58f7878c"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.913927 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.931694 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac73988c-f87a-4d9e-a9b8-139d9f791090" (UID: "ac73988c-f87a-4d9e-a9b8-139d9f791090"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.936858 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.937314 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.937326 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac73988c-f87a-4d9e-a9b8-139d9f791090-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.937335 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn49p\" (UniqueName: \"kubernetes.io/projected/ac73988c-f87a-4d9e-a9b8-139d9f791090-kube-api-access-bn49p\") on node 
\"crc\" DevicePath \"\"" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.937345 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac73988c-f87a-4d9e-a9b8-139d9f791090-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.937354 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qbk4\" (UniqueName: \"kubernetes.io/projected/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-kube-api-access-2qbk4\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.941706 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-0" event={"ID":"4f3cdf1c-630c-4944-ae58-c426e0d2161a","Type":"ContainerStarted","Data":"021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.942530 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.945036 4947 generic.go:334] "Generic (PLEG): container finished" podID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerID="a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9" exitCode=0 Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.945056 4947 generic.go:334] "Generic (PLEG): container finished" podID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerID="ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2" exitCode=143 Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.945089 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac73988c-f87a-4d9e-a9b8-139d9f791090","Type":"ContainerDied","Data":"a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.945104 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"ac73988c-f87a-4d9e-a9b8-139d9f791090","Type":"ContainerDied","Data":"ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.945113 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ac73988c-f87a-4d9e-a9b8-139d9f791090","Type":"ContainerDied","Data":"d674cd3cf4f468f135d5addf703ab04ecc2440ff1265d0e0829568683cded8d1"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.945127 4947 scope.go:117] "RemoveContainer" containerID="a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.945215 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.961039 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.961020492 podStartE2EDuration="2.961020492s" podCreationTimestamp="2025-12-03 09:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:10.947934458 +0000 UTC m=+8292.208888884" watchObservedRunningTime="2025-12-03 09:07:10.961020492 +0000 UTC m=+8292.221974918" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.969282 4947 generic.go:334] "Generic (PLEG): container finished" podID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerID="27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0" exitCode=0 Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.969305 4947 generic.go:334] "Generic (PLEG): container finished" podID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerID="0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc" exitCode=143 Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.969374 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178","Type":"ContainerDied","Data":"27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.969428 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178","Type":"ContainerDied","Data":"0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.969441 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7d8c3f69-2a93-4e2b-92b1-5f467c4e8178","Type":"ContainerDied","Data":"dda0a17f4edb9b6dbe752bec33ec9dbd26007be15b57f4e6b3a878fb06b759b2"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.969521 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.972758 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-config-data" (OuterVolumeSpecName: "config-data") pod "7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" (UID: "7d8c3f69-2a93-4e2b-92b1-5f467c4e8178"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.975159 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-0" event={"ID":"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed","Type":"ContainerStarted","Data":"001dfcd357ce07fb6bceb9ec56d451cdfd1de5ea4845b9c29d545b7d0afe7bfb"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.975396 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-0" event={"ID":"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed","Type":"ContainerStarted","Data":"1a1fa8c9acb43c48a801341b48b69c013c6d97545e6e207771a93d4dbfc335ee"} Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.976181 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.980865 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell3-conductor-0" podStartSLOduration=2.980849889 podStartE2EDuration="2.980849889s" podCreationTimestamp="2025-12-03 09:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:10.976589853 +0000 UTC m=+8292.237544289" watchObservedRunningTime="2025-12-03 09:07:10.980849889 +0000 UTC m=+8292.241804315" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:10.995576 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" (UID: "7d8c3f69-2a93-4e2b-92b1-5f467c4e8178"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.004482 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell2-conductor-0" podStartSLOduration=3.004467747 podStartE2EDuration="3.004467747s" podCreationTimestamp="2025-12-03 09:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:11.00235242 +0000 UTC m=+8292.263306846" watchObservedRunningTime="2025-12-03 09:07:11.004467747 +0000 UTC m=+8292.265422173" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.008963 4947 scope.go:117] "RemoveContainer" containerID="ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.030518 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.043778 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.043809 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.047940 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.057823 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:11 crc kubenswrapper[4947]: E1203 09:07:11.058251 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerName="nova-api-log" Dec 03 09:07:11 crc 
kubenswrapper[4947]: I1203 09:07:11.058264 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerName="nova-api-log" Dec 03 09:07:11 crc kubenswrapper[4947]: E1203 09:07:11.058276 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerName="nova-api-api" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.058281 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerName="nova-api-api" Dec 03 09:07:11 crc kubenswrapper[4947]: E1203 09:07:11.058302 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerName="nova-metadata-metadata" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.058307 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerName="nova-metadata-metadata" Dec 03 09:07:11 crc kubenswrapper[4947]: E1203 09:07:11.058318 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerName="nova-metadata-log" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.058323 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerName="nova-metadata-log" Dec 03 09:07:11 crc kubenswrapper[4947]: E1203 09:07:11.058339 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36ef7a0e-8716-49c8-8605-d4d09d31df62" containerName="nova-manage" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.058345 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="36ef7a0e-8716-49c8-8605-d4d09d31df62" containerName="nova-manage" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.058524 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerName="nova-api-log" Dec 03 09:07:11 crc 
kubenswrapper[4947]: I1203 09:07:11.058543 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerName="nova-metadata-log" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.058554 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" containerName="nova-metadata-metadata" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.058569 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac73988c-f87a-4d9e-a9b8-139d9f791090" containerName="nova-api-api" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.058578 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="36ef7a0e-8716-49c8-8605-d4d09d31df62" containerName="nova-manage" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.059640 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.068015 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.072743 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.097667 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac73988c-f87a-4d9e-a9b8-139d9f791090" path="/var/lib/kubelet/pods/ac73988c-f87a-4d9e-a9b8-139d9f791090/volumes" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.145446 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.145576 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgqrz\" (UniqueName: \"kubernetes.io/projected/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-kube-api-access-jgqrz\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.145629 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-logs\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.146957 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-config-data\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.159228 4947 scope.go:117] "RemoveContainer" containerID="a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9" Dec 03 09:07:11 crc kubenswrapper[4947]: E1203 09:07:11.161615 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9\": container with ID starting with a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9 not found: ID does not exist" containerID="a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.161683 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9"} err="failed to get container status \"a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9\": rpc error: 
code = NotFound desc = could not find container \"a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9\": container with ID starting with a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9 not found: ID does not exist" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.161740 4947 scope.go:117] "RemoveContainer" containerID="ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2" Dec 03 09:07:11 crc kubenswrapper[4947]: E1203 09:07:11.162225 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2\": container with ID starting with ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2 not found: ID does not exist" containerID="ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.162257 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2"} err="failed to get container status \"ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2\": rpc error: code = NotFound desc = could not find container \"ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2\": container with ID starting with ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2 not found: ID does not exist" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.162277 4947 scope.go:117] "RemoveContainer" containerID="a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.162525 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9"} err="failed to get container status \"a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9\": 
rpc error: code = NotFound desc = could not find container \"a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9\": container with ID starting with a64aefe07c67ce9470729e87599cea33ac4347a2e658daf4d1f6bcf83d4ec0c9 not found: ID does not exist" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.162542 4947 scope.go:117] "RemoveContainer" containerID="ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.162771 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2"} err="failed to get container status \"ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2\": rpc error: code = NotFound desc = could not find container \"ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2\": container with ID starting with ea24bcd78650cea8b8af6ce971dc07eb338873a53732e4d9f8ac732502f308a2 not found: ID does not exist" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.162783 4947 scope.go:117] "RemoveContainer" containerID="27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.204740 4947 scope.go:117] "RemoveContainer" containerID="0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.209134 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.209642 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.221202 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.230965 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.249888 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-logs\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.250084 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-config-data\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.250547 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.250644 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgqrz\" (UniqueName: \"kubernetes.io/projected/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-kube-api-access-jgqrz\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.251295 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-logs\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.252064 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.258336 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-config-data\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.273159 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.274534 4947 scope.go:117] "RemoveContainer" containerID="27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0" Dec 03 09:07:11 crc kubenswrapper[4947]: E1203 09:07:11.275293 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0\": container with ID starting with 27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0 not found: ID does not exist" containerID="27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.275325 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0"} err="failed to get container status \"27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0\": rpc error: code = NotFound desc = could not find container \"27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0\": container with ID starting with 27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0 not found: ID does not exist" Dec 03 09:07:11 crc 
kubenswrapper[4947]: I1203 09:07:11.276323 4947 scope.go:117] "RemoveContainer" containerID="0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc" Dec 03 09:07:11 crc kubenswrapper[4947]: E1203 09:07:11.276856 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc\": container with ID starting with 0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc not found: ID does not exist" containerID="0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.278552 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc"} err="failed to get container status \"0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc\": rpc error: code = NotFound desc = could not find container \"0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc\": container with ID starting with 0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc not found: ID does not exist" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.278581 4947 scope.go:117] "RemoveContainer" containerID="27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.281684 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgqrz\" (UniqueName: \"kubernetes.io/projected/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-kube-api-access-jgqrz\") pod \"nova-api-0\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.282822 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0"} err="failed 
to get container status \"27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0\": rpc error: code = NotFound desc = could not find container \"27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0\": container with ID starting with 27a963180bcfd36bc9b01e84260e09db28c4c7e001dfb27447959d30cb16e1a0 not found: ID does not exist" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.282869 4947 scope.go:117] "RemoveContainer" containerID="0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.283303 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc"} err="failed to get container status \"0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc\": rpc error: code = NotFound desc = could not find container \"0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc\": container with ID starting with 0e62f4ca02a3dd2d0de8ead00167e4fc86dccad4d2d5de7eccf787cba7a054dc not found: ID does not exist" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.296939 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86559d7c89-9459n"] Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.297205 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86559d7c89-9459n" podUID="22679f47-5751-4f6c-a404-93a57d494ebc" containerName="dnsmasq-dns" containerID="cri-o://2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813" gracePeriod=10 Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.360900 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.373662 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.476188 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.492521 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.504714 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.531847 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.533805 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.535521 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.548462 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.564743 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cac1187-37b2-46a4-a16e-e818460fb3ae-logs\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.564786 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.564830 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktkmr\" (UniqueName: \"kubernetes.io/projected/9cac1187-37b2-46a4-a16e-e818460fb3ae-kube-api-access-ktkmr\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.564851 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-config-data\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.667716 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cac1187-37b2-46a4-a16e-e818460fb3ae-logs\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.667771 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.667852 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktkmr\" (UniqueName: \"kubernetes.io/projected/9cac1187-37b2-46a4-a16e-e818460fb3ae-kube-api-access-ktkmr\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.667906 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-config-data\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.669735 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cac1187-37b2-46a4-a16e-e818460fb3ae-logs\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.675222 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.676879 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-config-data\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.688262 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktkmr\" (UniqueName: \"kubernetes.io/projected/9cac1187-37b2-46a4-a16e-e818460fb3ae-kube-api-access-ktkmr\") pod \"nova-metadata-0\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.894993 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.921915 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.973573 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-combined-ca-bundle\") pod \"06ce93d1-fa71-48ee-bf44-a0203c659dee\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.973758 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-config-data\") pod \"06ce93d1-fa71-48ee-bf44-a0203c659dee\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.973798 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvjv4\" (UniqueName: \"kubernetes.io/projected/06ce93d1-fa71-48ee-bf44-a0203c659dee-kube-api-access-bvjv4\") pod \"06ce93d1-fa71-48ee-bf44-a0203c659dee\" (UID: \"06ce93d1-fa71-48ee-bf44-a0203c659dee\") " Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.981119 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ce93d1-fa71-48ee-bf44-a0203c659dee-kube-api-access-bvjv4" (OuterVolumeSpecName: "kube-api-access-bvjv4") pod "06ce93d1-fa71-48ee-bf44-a0203c659dee" (UID: "06ce93d1-fa71-48ee-bf44-a0203c659dee"). InnerVolumeSpecName "kube-api-access-bvjv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:11 crc kubenswrapper[4947]: I1203 09:07:11.985957 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.008659 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-config-data" (OuterVolumeSpecName: "config-data") pod "06ce93d1-fa71-48ee-bf44-a0203c659dee" (UID: "06ce93d1-fa71-48ee-bf44-a0203c659dee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.011165 4947 generic.go:334] "Generic (PLEG): container finished" podID="06ce93d1-fa71-48ee-bf44-a0203c659dee" containerID="29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608" exitCode=0 Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.011220 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06ce93d1-fa71-48ee-bf44-a0203c659dee","Type":"ContainerDied","Data":"29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608"} Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.011266 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06ce93d1-fa71-48ee-bf44-a0203c659dee","Type":"ContainerDied","Data":"a64382d9ccad42fd41240d65f7886367ce569047af764585060d52ffc6b1bc72"} Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.011293 4947 scope.go:117] "RemoveContainer" containerID="29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.011394 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.018364 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06ce93d1-fa71-48ee-bf44-a0203c659dee" (UID: "06ce93d1-fa71-48ee-bf44-a0203c659dee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.021741 4947 generic.go:334] "Generic (PLEG): container finished" podID="22679f47-5751-4f6c-a404-93a57d494ebc" containerID="2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813" exitCode=0 Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.021802 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86559d7c89-9459n" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.021878 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86559d7c89-9459n" event={"ID":"22679f47-5751-4f6c-a404-93a57d494ebc","Type":"ContainerDied","Data":"2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813"} Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.021936 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86559d7c89-9459n" event={"ID":"22679f47-5751-4f6c-a404-93a57d494ebc","Type":"ContainerDied","Data":"fba60e7c2c585a65fdbbdeade94d713bdf4124f64ec57429283004672a30d631"} Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.032419 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell3-novncproxy-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.033283 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.036953 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell2-novncproxy-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.096439 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-nb\") pod \"22679f47-5751-4f6c-a404-93a57d494ebc\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.096686 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-sb\") pod \"22679f47-5751-4f6c-a404-93a57d494ebc\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.097109 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-dns-svc\") pod \"22679f47-5751-4f6c-a404-93a57d494ebc\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.097232 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4l5b\" (UniqueName: \"kubernetes.io/projected/22679f47-5751-4f6c-a404-93a57d494ebc-kube-api-access-s4l5b\") pod \"22679f47-5751-4f6c-a404-93a57d494ebc\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.097291 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-config\") pod \"22679f47-5751-4f6c-a404-93a57d494ebc\" (UID: \"22679f47-5751-4f6c-a404-93a57d494ebc\") " Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.101415 4947 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.101480 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvjv4\" (UniqueName: \"kubernetes.io/projected/06ce93d1-fa71-48ee-bf44-a0203c659dee-kube-api-access-bvjv4\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.101530 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ce93d1-fa71-48ee-bf44-a0203c659dee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.103090 4947 scope.go:117] "RemoveContainer" containerID="29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608" Dec 03 09:07:12 crc kubenswrapper[4947]: E1203 09:07:12.106655 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608\": container with ID starting with 29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608 not found: ID does not exist" containerID="29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.106741 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608"} err="failed to get container status \"29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608\": rpc error: code = NotFound desc = could not find container \"29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608\": container with ID starting with 29cd745d90bd284fce8f4c323fcdf100937e2f8730e0fdbf47815fa52ae81608 not found: ID does not exist" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.106769 4947 
scope.go:117] "RemoveContainer" containerID="2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.178331 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22679f47-5751-4f6c-a404-93a57d494ebc-kube-api-access-s4l5b" (OuterVolumeSpecName: "kube-api-access-s4l5b") pod "22679f47-5751-4f6c-a404-93a57d494ebc" (UID: "22679f47-5751-4f6c-a404-93a57d494ebc"). InnerVolumeSpecName "kube-api-access-s4l5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.203770 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4l5b\" (UniqueName: \"kubernetes.io/projected/22679f47-5751-4f6c-a404-93a57d494ebc-kube-api-access-s4l5b\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.218209 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.292012 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22679f47-5751-4f6c-a404-93a57d494ebc" (UID: "22679f47-5751-4f6c-a404-93a57d494ebc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.295851 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22679f47-5751-4f6c-a404-93a57d494ebc" (UID: "22679f47-5751-4f6c-a404-93a57d494ebc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.305357 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.305390 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.310279 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22679f47-5751-4f6c-a404-93a57d494ebc" (UID: "22679f47-5751-4f6c-a404-93a57d494ebc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.313330 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-config" (OuterVolumeSpecName: "config") pod "22679f47-5751-4f6c-a404-93a57d494ebc" (UID: "22679f47-5751-4f6c-a404-93a57d494ebc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.406972 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.407244 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22679f47-5751-4f6c-a404-93a57d494ebc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.440448 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86559d7c89-9459n"] Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.455874 4947 scope.go:117] "RemoveContainer" containerID="d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.458169 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86559d7c89-9459n"] Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.488746 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.493311 4947 scope.go:117] "RemoveContainer" containerID="2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813" Dec 03 09:07:12 crc kubenswrapper[4947]: E1203 09:07:12.494663 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813\": container with ID starting with 2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813 not found: ID does not exist" containerID="2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.494700 4947 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813"} err="failed to get container status \"2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813\": rpc error: code = NotFound desc = could not find container \"2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813\": container with ID starting with 2c5464a3c3c03006bf71528218685df4f127aa268315b4200d721725b3c40813 not found: ID does not exist" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.494724 4947 scope.go:117] "RemoveContainer" containerID="d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.498233 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:12 crc kubenswrapper[4947]: E1203 09:07:12.500601 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554\": container with ID starting with d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554 not found: ID does not exist" containerID="d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.500635 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554"} err="failed to get container status \"d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554\": rpc error: code = NotFound desc = could not find container \"d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554\": container with ID starting with d5d66c37b14ad7a5d7f345841dece0b005c404d66caa2ee3cf5a1edc1888b554 not found: ID does not exist" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.523409 4947 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-scheduler-0"] Dec 03 09:07:12 crc kubenswrapper[4947]: E1203 09:07:12.524025 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ce93d1-fa71-48ee-bf44-a0203c659dee" containerName="nova-scheduler-scheduler" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.524048 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ce93d1-fa71-48ee-bf44-a0203c659dee" containerName="nova-scheduler-scheduler" Dec 03 09:07:12 crc kubenswrapper[4947]: E1203 09:07:12.524067 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22679f47-5751-4f6c-a404-93a57d494ebc" containerName="init" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.524075 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="22679f47-5751-4f6c-a404-93a57d494ebc" containerName="init" Dec 03 09:07:12 crc kubenswrapper[4947]: E1203 09:07:12.524105 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22679f47-5751-4f6c-a404-93a57d494ebc" containerName="dnsmasq-dns" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.524114 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="22679f47-5751-4f6c-a404-93a57d494ebc" containerName="dnsmasq-dns" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.524364 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ce93d1-fa71-48ee-bf44-a0203c659dee" containerName="nova-scheduler-scheduler" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.524380 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="22679f47-5751-4f6c-a404-93a57d494ebc" containerName="dnsmasq-dns" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.525605 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.528368 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.547366 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.614660 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.614719 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx89p\" (UniqueName: \"kubernetes.io/projected/06b52112-1cfc-45e7-8835-55d0ded3d817-kube-api-access-qx89p\") pod \"nova-scheduler-0\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.614933 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-config-data\") pod \"nova-scheduler-0\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.622837 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.716210 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx89p\" (UniqueName: \"kubernetes.io/projected/06b52112-1cfc-45e7-8835-55d0ded3d817-kube-api-access-qx89p\") pod \"nova-scheduler-0\" (UID: 
\"06b52112-1cfc-45e7-8835-55d0ded3d817\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.716315 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-config-data\") pod \"nova-scheduler-0\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.716365 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.721991 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.729172 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-config-data\") pod \"nova-scheduler-0\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.735616 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx89p\" (UniqueName: \"kubernetes.io/projected/06b52112-1cfc-45e7-8835-55d0ded3d817-kube-api-access-qx89p\") pod \"nova-scheduler-0\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:12 crc kubenswrapper[4947]: I1203 09:07:12.848771 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.068716 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9cac1187-37b2-46a4-a16e-e818460fb3ae","Type":"ContainerStarted","Data":"81a5297dff82665a203c19b6551d92f166342ab6461901a3426e7e9750b74880"} Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.069018 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9cac1187-37b2-46a4-a16e-e818460fb3ae","Type":"ContainerStarted","Data":"9d9264d566df680dbc07de41848587f22567f9e8de740223d9346d779856e7d8"} Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.069028 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9cac1187-37b2-46a4-a16e-e818460fb3ae","Type":"ContainerStarted","Data":"474e57148250c76e42b1c216b84abd5d2db5214eec2baa410a044c16a654e221"} Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.121666 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.121646164 podStartE2EDuration="2.121646164s" podCreationTimestamp="2025-12-03 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:13.099853094 +0000 UTC m=+8294.360807520" watchObservedRunningTime="2025-12-03 09:07:13.121646164 +0000 UTC m=+8294.382600590" Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.128445 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.128414877 podStartE2EDuration="2.128414877s" podCreationTimestamp="2025-12-03 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:13.120928884 +0000 UTC m=+8294.381883330" 
watchObservedRunningTime="2025-12-03 09:07:13.128414877 +0000 UTC m=+8294.389369313" Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.141015 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06ce93d1-fa71-48ee-bf44-a0203c659dee" path="/var/lib/kubelet/pods/06ce93d1-fa71-48ee-bf44-a0203c659dee/volumes" Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.141710 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22679f47-5751-4f6c-a404-93a57d494ebc" path="/var/lib/kubelet/pods/22679f47-5751-4f6c-a404-93a57d494ebc/volumes" Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.142537 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d8c3f69-2a93-4e2b-92b1-5f467c4e8178" path="/var/lib/kubelet/pods/7d8c3f69-2a93-4e2b-92b1-5f467c4e8178/volumes" Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.146121 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d","Type":"ContainerStarted","Data":"48a11f82807e193e17c6a1a78f68e5b323d996d5a58d0e4d3dabd62d28d728bd"} Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.146165 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d","Type":"ContainerStarted","Data":"a89403aad01cbe1d34490abaa5c8b21e538c4c3ca9b84b88fcfa657f67318d06"} Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.146186 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d","Type":"ContainerStarted","Data":"a6fb21728284334d93839089d47e8fade09a71b276976da86a24c556f499aa47"} Dec 03 09:07:13 crc kubenswrapper[4947]: I1203 09:07:13.331032 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:13 crc kubenswrapper[4947]: W1203 09:07:13.331508 4947 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06b52112_1cfc_45e7_8835_55d0ded3d817.slice/crio-f8869e8e18c119e322a6ceda8eb03c3cb395bf66e422c73372cb8da7601d54b9 WatchSource:0}: Error finding container f8869e8e18c119e322a6ceda8eb03c3cb395bf66e422c73372cb8da7601d54b9: Status 404 returned error can't find the container with id f8869e8e18c119e322a6ceda8eb03c3cb395bf66e422c73372cb8da7601d54b9 Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.215210 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06b52112-1cfc-45e7-8835-55d0ded3d817","Type":"ContainerStarted","Data":"ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376"} Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.215787 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06b52112-1cfc-45e7-8835-55d0ded3d817","Type":"ContainerStarted","Data":"f8869e8e18c119e322a6ceda8eb03c3cb395bf66e422c73372cb8da7601d54b9"} Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.245157 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.245134212 podStartE2EDuration="2.245134212s" podCreationTimestamp="2025-12-03 09:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:14.238664068 +0000 UTC m=+8295.499618494" watchObservedRunningTime="2025-12-03 09:07:14.245134212 +0000 UTC m=+8295.506088658" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.261712 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell3-conductor-0" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.265131 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 
09:07:14.715774 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell3-cell-mapping-rl2pj"] Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.717977 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.719833 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell3-manage-config-data" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.720295 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell3-manage-scripts" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.731606 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-cell-mapping-rl2pj"] Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.802236 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r88c8\" (UniqueName: \"kubernetes.io/projected/f3d3e206-10b7-4bf2-bec3-b57694a2318f-kube-api-access-r88c8\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.802296 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-config-data\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.802316 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-scripts\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " 
pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.802419 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-combined-ca-bundle\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.904088 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r88c8\" (UniqueName: \"kubernetes.io/projected/f3d3e206-10b7-4bf2-bec3-b57694a2318f-kube-api-access-r88c8\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.904171 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-config-data\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.904195 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-scripts\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.904307 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-combined-ca-bundle\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " 
pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.909702 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-scripts\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.916848 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-config-data\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.918823 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-combined-ca-bundle\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:14 crc kubenswrapper[4947]: I1203 09:07:14.931247 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r88c8\" (UniqueName: \"kubernetes.io/projected/f3d3e206-10b7-4bf2-bec3-b57694a2318f-kube-api-access-r88c8\") pod \"nova-cell3-cell-mapping-rl2pj\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.036896 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.227045 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-ztsvz"] Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.230482 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.238613 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ztsvz"] Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.249566 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.249754 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.414120 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5x5l\" (UniqueName: \"kubernetes.io/projected/e1783d52-312d-4302-9a4b-6a15255d3518-kube-api-access-w5x5l\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.414319 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-config-data\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.414388 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-scripts\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.414422 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.516755 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-config-data\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.516818 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-scripts\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.516845 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.516913 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5x5l\" (UniqueName: 
\"kubernetes.io/projected/e1783d52-312d-4302-9a4b-6a15255d3518-kube-api-access-w5x5l\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.523293 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-scripts\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.523361 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.524882 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-config-data\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.541139 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5x5l\" (UniqueName: \"kubernetes.io/projected/e1783d52-312d-4302-9a4b-6a15255d3518-kube-api-access-w5x5l\") pod \"nova-cell1-cell-mapping-ztsvz\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:15 crc kubenswrapper[4947]: W1203 09:07:15.545945 4947 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3d3e206_10b7_4bf2_bec3_b57694a2318f.slice/crio-ae7834282b0e8b1265dbffa11ddc2b5aac77b867355536ee5ac405ce0366d53c WatchSource:0}: Error finding container ae7834282b0e8b1265dbffa11ddc2b5aac77b867355536ee5ac405ce0366d53c: Status 404 returned error can't find the container with id ae7834282b0e8b1265dbffa11ddc2b5aac77b867355536ee5ac405ce0366d53c Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.556327 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-cell-mapping-rl2pj"] Dec 03 09:07:15 crc kubenswrapper[4947]: I1203 09:07:15.574480 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:16 crc kubenswrapper[4947]: I1203 09:07:16.025520 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-ztsvz"] Dec 03 09:07:16 crc kubenswrapper[4947]: W1203 09:07:16.054740 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1783d52_312d_4302_9a4b_6a15255d3518.slice/crio-ccec70b288c925def545a8d68d50bbd3d2bcaabe73a035f995e96a747cbc4cd4 WatchSource:0}: Error finding container ccec70b288c925def545a8d68d50bbd3d2bcaabe73a035f995e96a747cbc4cd4: Status 404 returned error can't find the container with id ccec70b288c925def545a8d68d50bbd3d2bcaabe73a035f995e96a747cbc4cd4 Dec 03 09:07:16 crc kubenswrapper[4947]: I1203 09:07:16.268271 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-cell-mapping-rl2pj" event={"ID":"f3d3e206-10b7-4bf2-bec3-b57694a2318f","Type":"ContainerStarted","Data":"dcc01c53bf83e1af85f67d8b7218dcf11c92256ccefd97fcead07bbed46da0b2"} Dec 03 09:07:16 crc kubenswrapper[4947]: I1203 09:07:16.268689 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-cell-mapping-rl2pj" 
event={"ID":"f3d3e206-10b7-4bf2-bec3-b57694a2318f","Type":"ContainerStarted","Data":"ae7834282b0e8b1265dbffa11ddc2b5aac77b867355536ee5ac405ce0366d53c"} Dec 03 09:07:16 crc kubenswrapper[4947]: I1203 09:07:16.275319 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ztsvz" event={"ID":"e1783d52-312d-4302-9a4b-6a15255d3518","Type":"ContainerStarted","Data":"ccec70b288c925def545a8d68d50bbd3d2bcaabe73a035f995e96a747cbc4cd4"} Dec 03 09:07:16 crc kubenswrapper[4947]: I1203 09:07:16.299137 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell3-cell-mapping-rl2pj" podStartSLOduration=2.299115769 podStartE2EDuration="2.299115769s" podCreationTimestamp="2025-12-03 09:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:16.293117937 +0000 UTC m=+8297.554072383" watchObservedRunningTime="2025-12-03 09:07:16.299115769 +0000 UTC m=+8297.560070195" Dec 03 09:07:16 crc kubenswrapper[4947]: I1203 09:07:16.922797 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 09:07:16 crc kubenswrapper[4947]: I1203 09:07:16.922895 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 09:07:17 crc kubenswrapper[4947]: I1203 09:07:17.294597 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ztsvz" event={"ID":"e1783d52-312d-4302-9a4b-6a15255d3518","Type":"ContainerStarted","Data":"00f7cf22b160c936da8edff9be6af758151465eb729339529defcbf75c4e16f6"} Dec 03 09:07:17 crc kubenswrapper[4947]: I1203 09:07:17.319685 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-ztsvz" podStartSLOduration=2.319664973 podStartE2EDuration="2.319664973s" podCreationTimestamp="2025-12-03 09:07:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:17.316295393 +0000 UTC m=+8298.577249829" watchObservedRunningTime="2025-12-03 09:07:17.319664973 +0000 UTC m=+8298.580619399" Dec 03 09:07:17 crc kubenswrapper[4947]: I1203 09:07:17.850379 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.301436 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell2-conductor-0" Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.761021 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell2-cell-mapping-8gkkb"] Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.762392 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.764772 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-manage-config-data" Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.765706 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-manage-scripts" Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.784079 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-cell-mapping-8gkkb"] Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.912972 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-combined-ca-bundle\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.913029 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-scripts\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.913120 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-config-data\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:19 crc kubenswrapper[4947]: I1203 09:07:19.913245 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t4qm\" (UniqueName: \"kubernetes.io/projected/9a03c107-424a-45ce-94fa-f65a738d62a1-kube-api-access-4t4qm\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.014630 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-scripts\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.014704 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-config-data\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.014840 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4t4qm\" (UniqueName: \"kubernetes.io/projected/9a03c107-424a-45ce-94fa-f65a738d62a1-kube-api-access-4t4qm\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.014887 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-combined-ca-bundle\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.021970 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-scripts\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.025179 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-config-data\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.025684 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-combined-ca-bundle\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.036296 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t4qm\" 
(UniqueName: \"kubernetes.io/projected/9a03c107-424a-45ce-94fa-f65a738d62a1-kube-api-access-4t4qm\") pod \"nova-cell2-cell-mapping-8gkkb\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.097084 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:20 crc kubenswrapper[4947]: I1203 09:07:20.580121 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-cell-mapping-8gkkb"] Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.336812 4947 generic.go:334] "Generic (PLEG): container finished" podID="e1783d52-312d-4302-9a4b-6a15255d3518" containerID="00f7cf22b160c936da8edff9be6af758151465eb729339529defcbf75c4e16f6" exitCode=0 Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.336863 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ztsvz" event={"ID":"e1783d52-312d-4302-9a4b-6a15255d3518","Type":"ContainerDied","Data":"00f7cf22b160c936da8edff9be6af758151465eb729339529defcbf75c4e16f6"} Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.341959 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-cell-mapping-8gkkb" event={"ID":"9a03c107-424a-45ce-94fa-f65a738d62a1","Type":"ContainerStarted","Data":"836e00d1e9b086a63305af21373c00b5622a5a4c3bb5e48477c55deec17acece"} Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.341992 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-cell-mapping-8gkkb" event={"ID":"9a03c107-424a-45ce-94fa-f65a738d62a1","Type":"ContainerStarted","Data":"db3ff2fd62ee7b8a78323cdb80705cf952d5e6d15b75f57d2d56aef2725217a8"} Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.344844 4947 generic.go:334] "Generic (PLEG): container finished" podID="f3d3e206-10b7-4bf2-bec3-b57694a2318f" 
containerID="dcc01c53bf83e1af85f67d8b7218dcf11c92256ccefd97fcead07bbed46da0b2" exitCode=0 Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.344872 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-cell-mapping-rl2pj" event={"ID":"f3d3e206-10b7-4bf2-bec3-b57694a2318f","Type":"ContainerDied","Data":"dcc01c53bf83e1af85f67d8b7218dcf11c92256ccefd97fcead07bbed46da0b2"} Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.386371 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell2-cell-mapping-8gkkb" podStartSLOduration=2.3863450410000002 podStartE2EDuration="2.386345041s" podCreationTimestamp="2025-12-03 09:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:21.385078587 +0000 UTC m=+8302.646033013" watchObservedRunningTime="2025-12-03 09:07:21.386345041 +0000 UTC m=+8302.647299467" Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.478894 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.478955 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.922086 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 09:07:21 crc kubenswrapper[4947]: I1203 09:07:21.922259 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.518747 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.130:8774/\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.559751 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.130:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.843805 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.849669 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.851445 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.881247 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.962759 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.131:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.972297 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-config-data\") pod \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.972405 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-combined-ca-bundle\") pod \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.972444 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5x5l\" (UniqueName: \"kubernetes.io/projected/e1783d52-312d-4302-9a4b-6a15255d3518-kube-api-access-w5x5l\") pod \"e1783d52-312d-4302-9a4b-6a15255d3518\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.972479 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r88c8\" (UniqueName: \"kubernetes.io/projected/f3d3e206-10b7-4bf2-bec3-b57694a2318f-kube-api-access-r88c8\") pod \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.973183 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-config-data\") pod \"e1783d52-312d-4302-9a4b-6a15255d3518\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.973207 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-scripts\") pod \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\" (UID: \"f3d3e206-10b7-4bf2-bec3-b57694a2318f\") " Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.973759 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-combined-ca-bundle\") pod \"e1783d52-312d-4302-9a4b-6a15255d3518\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " Dec 03 09:07:22 crc 
kubenswrapper[4947]: I1203 09:07:22.974720 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-scripts\") pod \"e1783d52-312d-4302-9a4b-6a15255d3518\" (UID: \"e1783d52-312d-4302-9a4b-6a15255d3518\") " Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.978184 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3d3e206-10b7-4bf2-bec3-b57694a2318f-kube-api-access-r88c8" (OuterVolumeSpecName: "kube-api-access-r88c8") pod "f3d3e206-10b7-4bf2-bec3-b57694a2318f" (UID: "f3d3e206-10b7-4bf2-bec3-b57694a2318f"). InnerVolumeSpecName "kube-api-access-r88c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.978343 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-scripts" (OuterVolumeSpecName: "scripts") pod "f3d3e206-10b7-4bf2-bec3-b57694a2318f" (UID: "f3d3e206-10b7-4bf2-bec3-b57694a2318f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.987628 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1783d52-312d-4302-9a4b-6a15255d3518-kube-api-access-w5x5l" (OuterVolumeSpecName: "kube-api-access-w5x5l") pod "e1783d52-312d-4302-9a4b-6a15255d3518" (UID: "e1783d52-312d-4302-9a4b-6a15255d3518"). InnerVolumeSpecName "kube-api-access-w5x5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:22 crc kubenswrapper[4947]: I1203 09:07:22.991784 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-scripts" (OuterVolumeSpecName: "scripts") pod "e1783d52-312d-4302-9a4b-6a15255d3518" (UID: "e1783d52-312d-4302-9a4b-6a15255d3518"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.004172 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.131:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.006083 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-config-data" (OuterVolumeSpecName: "config-data") pod "e1783d52-312d-4302-9a4b-6a15255d3518" (UID: "e1783d52-312d-4302-9a4b-6a15255d3518"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.008948 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1783d52-312d-4302-9a4b-6a15255d3518" (UID: "e1783d52-312d-4302-9a4b-6a15255d3518"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.019937 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-config-data" (OuterVolumeSpecName: "config-data") pod "f3d3e206-10b7-4bf2-bec3-b57694a2318f" (UID: "f3d3e206-10b7-4bf2-bec3-b57694a2318f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.023209 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3d3e206-10b7-4bf2-bec3-b57694a2318f" (UID: "f3d3e206-10b7-4bf2-bec3-b57694a2318f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.078412 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.078466 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.078476 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.078485 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.078521 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5x5l\" (UniqueName: \"kubernetes.io/projected/e1783d52-312d-4302-9a4b-6a15255d3518-kube-api-access-w5x5l\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.078529 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r88c8\" (UniqueName: 
\"kubernetes.io/projected/f3d3e206-10b7-4bf2-bec3-b57694a2318f-kube-api-access-r88c8\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.078538 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1783d52-312d-4302-9a4b-6a15255d3518-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.078547 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3d3e206-10b7-4bf2-bec3-b57694a2318f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.370031 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-cell-mapping-rl2pj" event={"ID":"f3d3e206-10b7-4bf2-bec3-b57694a2318f","Type":"ContainerDied","Data":"ae7834282b0e8b1265dbffa11ddc2b5aac77b867355536ee5ac405ce0366d53c"} Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.370436 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7834282b0e8b1265dbffa11ddc2b5aac77b867355536ee5ac405ce0366d53c" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.370125 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-cell-mapping-rl2pj" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.372341 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-ztsvz" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.372333 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-ztsvz" event={"ID":"e1783d52-312d-4302-9a4b-6a15255d3518","Type":"ContainerDied","Data":"ccec70b288c925def545a8d68d50bbd3d2bcaabe73a035f995e96a747cbc4cd4"} Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.372649 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccec70b288c925def545a8d68d50bbd3d2bcaabe73a035f995e96a747cbc4cd4" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.408060 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.577470 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.577730 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerName="nova-api-log" containerID="cri-o://a89403aad01cbe1d34490abaa5c8b21e538c4c3ca9b84b88fcfa657f67318d06" gracePeriod=30 Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.578138 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerName="nova-api-api" containerID="cri-o://48a11f82807e193e17c6a1a78f68e5b323d996d5a58d0e4d3dabd62d28d728bd" gracePeriod=30 Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.606188 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.606630 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" 
containerName="nova-metadata-log" containerID="cri-o://9d9264d566df680dbc07de41848587f22567f9e8de740223d9346d779856e7d8" gracePeriod=30 Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.606700 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerName="nova-metadata-metadata" containerID="cri-o://81a5297dff82665a203c19b6551d92f166342ab6461901a3426e7e9750b74880" gracePeriod=30 Dec 03 09:07:23 crc kubenswrapper[4947]: E1203 09:07:23.687683 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0436fb4a_a4a0_4671_8e8c_4212fb6f2a4d.slice/crio-a89403aad01cbe1d34490abaa5c8b21e538c4c3ca9b84b88fcfa657f67318d06.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cac1187_37b2_46a4_a16e_e818460fb3ae.slice/crio-conmon-9d9264d566df680dbc07de41848587f22567f9e8de740223d9346d779856e7d8.scope\": RecentStats: unable to find data in memory cache]" Dec 03 09:07:23 crc kubenswrapper[4947]: I1203 09:07:23.971502 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:24 crc kubenswrapper[4947]: I1203 09:07:24.384449 4947 generic.go:334] "Generic (PLEG): container finished" podID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerID="9d9264d566df680dbc07de41848587f22567f9e8de740223d9346d779856e7d8" exitCode=143 Dec 03 09:07:24 crc kubenswrapper[4947]: I1203 09:07:24.384651 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9cac1187-37b2-46a4-a16e-e818460fb3ae","Type":"ContainerDied","Data":"9d9264d566df680dbc07de41848587f22567f9e8de740223d9346d779856e7d8"} Dec 03 09:07:24 crc kubenswrapper[4947]: I1203 09:07:24.389842 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerID="a89403aad01cbe1d34490abaa5c8b21e538c4c3ca9b84b88fcfa657f67318d06" exitCode=143 Dec 03 09:07:24 crc kubenswrapper[4947]: I1203 09:07:24.389896 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d","Type":"ContainerDied","Data":"a89403aad01cbe1d34490abaa5c8b21e538c4c3ca9b84b88fcfa657f67318d06"} Dec 03 09:07:25 crc kubenswrapper[4947]: I1203 09:07:25.400364 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="06b52112-1cfc-45e7-8835-55d0ded3d817" containerName="nova-scheduler-scheduler" containerID="cri-o://ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376" gracePeriod=30 Dec 03 09:07:26 crc kubenswrapper[4947]: I1203 09:07:26.412055 4947 generic.go:334] "Generic (PLEG): container finished" podID="9a03c107-424a-45ce-94fa-f65a738d62a1" containerID="836e00d1e9b086a63305af21373c00b5622a5a4c3bb5e48477c55deec17acece" exitCode=0 Dec 03 09:07:26 crc kubenswrapper[4947]: I1203 09:07:26.412137 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-cell-mapping-8gkkb" event={"ID":"9a03c107-424a-45ce-94fa-f65a738d62a1","Type":"ContainerDied","Data":"836e00d1e9b086a63305af21373c00b5622a5a4c3bb5e48477c55deec17acece"} Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.769758 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:27 crc kubenswrapper[4947]: E1203 09:07:27.852539 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 09:07:27 crc kubenswrapper[4947]: E1203 09:07:27.854582 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 09:07:27 crc kubenswrapper[4947]: E1203 09:07:27.859895 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 09:07:27 crc kubenswrapper[4947]: E1203 09:07:27.859921 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="06b52112-1cfc-45e7-8835-55d0ded3d817" containerName="nova-scheduler-scheduler" Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.878780 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-scripts\") pod \"9a03c107-424a-45ce-94fa-f65a738d62a1\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " Dec 03 09:07:27 crc 
kubenswrapper[4947]: I1203 09:07:27.878890 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-combined-ca-bundle\") pod \"9a03c107-424a-45ce-94fa-f65a738d62a1\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.879075 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t4qm\" (UniqueName: \"kubernetes.io/projected/9a03c107-424a-45ce-94fa-f65a738d62a1-kube-api-access-4t4qm\") pod \"9a03c107-424a-45ce-94fa-f65a738d62a1\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.879121 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-config-data\") pod \"9a03c107-424a-45ce-94fa-f65a738d62a1\" (UID: \"9a03c107-424a-45ce-94fa-f65a738d62a1\") " Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.884920 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-scripts" (OuterVolumeSpecName: "scripts") pod "9a03c107-424a-45ce-94fa-f65a738d62a1" (UID: "9a03c107-424a-45ce-94fa-f65a738d62a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.890723 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a03c107-424a-45ce-94fa-f65a738d62a1-kube-api-access-4t4qm" (OuterVolumeSpecName: "kube-api-access-4t4qm") pod "9a03c107-424a-45ce-94fa-f65a738d62a1" (UID: "9a03c107-424a-45ce-94fa-f65a738d62a1"). InnerVolumeSpecName "kube-api-access-4t4qm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.914627 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a03c107-424a-45ce-94fa-f65a738d62a1" (UID: "9a03c107-424a-45ce-94fa-f65a738d62a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.931671 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-config-data" (OuterVolumeSpecName: "config-data") pod "9a03c107-424a-45ce-94fa-f65a738d62a1" (UID: "9a03c107-424a-45ce-94fa-f65a738d62a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.982395 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.982449 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t4qm\" (UniqueName: \"kubernetes.io/projected/9a03c107-424a-45ce-94fa-f65a738d62a1-kube-api-access-4t4qm\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.982473 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:27 crc kubenswrapper[4947]: I1203 09:07:27.982528 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a03c107-424a-45ce-94fa-f65a738d62a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 
03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.435837 4947 generic.go:334] "Generic (PLEG): container finished" podID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerID="48a11f82807e193e17c6a1a78f68e5b323d996d5a58d0e4d3dabd62d28d728bd" exitCode=0 Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.435903 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d","Type":"ContainerDied","Data":"48a11f82807e193e17c6a1a78f68e5b323d996d5a58d0e4d3dabd62d28d728bd"} Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.437519 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-cell-mapping-8gkkb" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.437518 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-cell-mapping-8gkkb" event={"ID":"9a03c107-424a-45ce-94fa-f65a738d62a1","Type":"ContainerDied","Data":"db3ff2fd62ee7b8a78323cdb80705cf952d5e6d15b75f57d2d56aef2725217a8"} Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.437625 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db3ff2fd62ee7b8a78323cdb80705cf952d5e6d15b75f57d2d56aef2725217a8" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.446726 4947 generic.go:334] "Generic (PLEG): container finished" podID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerID="81a5297dff82665a203c19b6551d92f166342ab6461901a3426e7e9750b74880" exitCode=0 Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.446780 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9cac1187-37b2-46a4-a16e-e818460fb3ae","Type":"ContainerDied","Data":"81a5297dff82665a203c19b6551d92f166342ab6461901a3426e7e9750b74880"} Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.451985 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.537625 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.593810 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-combined-ca-bundle\") pod \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.593989 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-config-data\") pod \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.594099 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-logs\") pod \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.594135 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgqrz\" (UniqueName: \"kubernetes.io/projected/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-kube-api-access-jgqrz\") pod \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\" (UID: \"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d\") " Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.594693 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-logs" (OuterVolumeSpecName: "logs") pod "0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" (UID: "0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.598046 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-kube-api-access-jgqrz" (OuterVolumeSpecName: "kube-api-access-jgqrz") pod "0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" (UID: "0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d"). InnerVolumeSpecName "kube-api-access-jgqrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.621702 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-config-data" (OuterVolumeSpecName: "config-data") pod "0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" (UID: "0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.622337 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" (UID: "0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.695310 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-config-data\") pod \"9cac1187-37b2-46a4-a16e-e818460fb3ae\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.695453 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktkmr\" (UniqueName: \"kubernetes.io/projected/9cac1187-37b2-46a4-a16e-e818460fb3ae-kube-api-access-ktkmr\") pod \"9cac1187-37b2-46a4-a16e-e818460fb3ae\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.695484 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cac1187-37b2-46a4-a16e-e818460fb3ae-logs\") pod \"9cac1187-37b2-46a4-a16e-e818460fb3ae\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.695639 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-combined-ca-bundle\") pod \"9cac1187-37b2-46a4-a16e-e818460fb3ae\" (UID: \"9cac1187-37b2-46a4-a16e-e818460fb3ae\") " Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.695985 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgqrz\" (UniqueName: \"kubernetes.io/projected/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-kube-api-access-jgqrz\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.696003 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.696015 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.696027 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.697432 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cac1187-37b2-46a4-a16e-e818460fb3ae-logs" (OuterVolumeSpecName: "logs") pod "9cac1187-37b2-46a4-a16e-e818460fb3ae" (UID: "9cac1187-37b2-46a4-a16e-e818460fb3ae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.699735 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cac1187-37b2-46a4-a16e-e818460fb3ae-kube-api-access-ktkmr" (OuterVolumeSpecName: "kube-api-access-ktkmr") pod "9cac1187-37b2-46a4-a16e-e818460fb3ae" (UID: "9cac1187-37b2-46a4-a16e-e818460fb3ae"). InnerVolumeSpecName "kube-api-access-ktkmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.728320 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-config-data" (OuterVolumeSpecName: "config-data") pod "9cac1187-37b2-46a4-a16e-e818460fb3ae" (UID: "9cac1187-37b2-46a4-a16e-e818460fb3ae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.729691 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cac1187-37b2-46a4-a16e-e818460fb3ae" (UID: "9cac1187-37b2-46a4-a16e-e818460fb3ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.797913 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.798618 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cac1187-37b2-46a4-a16e-e818460fb3ae-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.798681 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktkmr\" (UniqueName: \"kubernetes.io/projected/9cac1187-37b2-46a4-a16e-e818460fb3ae-kube-api-access-ktkmr\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:28 crc kubenswrapper[4947]: I1203 09:07:28.798737 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9cac1187-37b2-46a4-a16e-e818460fb3ae-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.467379 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9cac1187-37b2-46a4-a16e-e818460fb3ae","Type":"ContainerDied","Data":"474e57148250c76e42b1c216b84abd5d2db5214eec2baa410a044c16a654e221"} Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.467398 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.467768 4947 scope.go:117] "RemoveContainer" containerID="81a5297dff82665a203c19b6551d92f166342ab6461901a3426e7e9750b74880" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.470623 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d","Type":"ContainerDied","Data":"a6fb21728284334d93839089d47e8fade09a71b276976da86a24c556f499aa47"} Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.470725 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.508616 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.511697 4947 scope.go:117] "RemoveContainer" containerID="9d9264d566df680dbc07de41848587f22567f9e8de740223d9346d779856e7d8" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.534721 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.547790 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.563069 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.570055 4947 scope.go:117] "RemoveContainer" containerID="48a11f82807e193e17c6a1a78f68e5b323d996d5a58d0e4d3dabd62d28d728bd" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.573915 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:29 crc kubenswrapper[4947]: E1203 09:07:29.574387 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" 
containerName="nova-api-log" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574410 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerName="nova-api-log" Dec 03 09:07:29 crc kubenswrapper[4947]: E1203 09:07:29.574436 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1783d52-312d-4302-9a4b-6a15255d3518" containerName="nova-manage" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574444 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1783d52-312d-4302-9a4b-6a15255d3518" containerName="nova-manage" Dec 03 09:07:29 crc kubenswrapper[4947]: E1203 09:07:29.574465 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerName="nova-metadata-metadata" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574472 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerName="nova-metadata-metadata" Dec 03 09:07:29 crc kubenswrapper[4947]: E1203 09:07:29.574525 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerName="nova-metadata-log" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574534 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerName="nova-metadata-log" Dec 03 09:07:29 crc kubenswrapper[4947]: E1203 09:07:29.574549 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerName="nova-api-api" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574557 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerName="nova-api-api" Dec 03 09:07:29 crc kubenswrapper[4947]: E1203 09:07:29.574579 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3d3e206-10b7-4bf2-bec3-b57694a2318f" 
containerName="nova-manage" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574586 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3d3e206-10b7-4bf2-bec3-b57694a2318f" containerName="nova-manage" Dec 03 09:07:29 crc kubenswrapper[4947]: E1203 09:07:29.574598 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a03c107-424a-45ce-94fa-f65a738d62a1" containerName="nova-manage" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574604 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a03c107-424a-45ce-94fa-f65a738d62a1" containerName="nova-manage" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574831 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1783d52-312d-4302-9a4b-6a15255d3518" containerName="nova-manage" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574852 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerName="nova-metadata-log" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574862 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" containerName="nova-metadata-metadata" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574880 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3d3e206-10b7-4bf2-bec3-b57694a2318f" containerName="nova-manage" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574889 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" containerName="nova-api-api" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574899 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a03c107-424a-45ce-94fa-f65a738d62a1" containerName="nova-manage" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.574912 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" 
containerName="nova-api-log" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.576987 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.583332 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.585759 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.598741 4947 scope.go:117] "RemoveContainer" containerID="a89403aad01cbe1d34490abaa5c8b21e538c4c3ca9b84b88fcfa657f67318d06" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.598892 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.600876 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.604141 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.613696 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.716343 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9sr8\" (UniqueName: \"kubernetes.io/projected/212bbca4-059a-419a-acf3-984c087a0f5d-kube-api-access-v9sr8\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.716394 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.716436 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec54e67-38f1-44ed-a365-932f4103b99a-logs\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.716467 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-config-data\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.716559 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212bbca4-059a-419a-acf3-984c087a0f5d-logs\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.716611 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-config-data\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.716626 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x495x\" (UniqueName: \"kubernetes.io/projected/5ec54e67-38f1-44ed-a365-932f4103b99a-kube-api-access-x495x\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 
09:07:29.716647 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.818317 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212bbca4-059a-419a-acf3-984c087a0f5d-logs\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.818736 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212bbca4-059a-419a-acf3-984c087a0f5d-logs\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.818385 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-config-data\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.818842 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x495x\" (UniqueName: \"kubernetes.io/projected/5ec54e67-38f1-44ed-a365-932f4103b99a-kube-api-access-x495x\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.818898 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.819420 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9sr8\" (UniqueName: \"kubernetes.io/projected/212bbca4-059a-419a-acf3-984c087a0f5d-kube-api-access-v9sr8\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.820527 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.820641 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec54e67-38f1-44ed-a365-932f4103b99a-logs\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.820759 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-config-data\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.821307 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec54e67-38f1-44ed-a365-932f4103b99a-logs\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.824186 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.824186 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.824343 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-config-data\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.828235 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-config-data\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.838897 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9sr8\" (UniqueName: \"kubernetes.io/projected/212bbca4-059a-419a-acf3-984c087a0f5d-kube-api-access-v9sr8\") pod \"nova-api-0\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " pod="openstack/nova-api-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.848908 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x495x\" (UniqueName: \"kubernetes.io/projected/5ec54e67-38f1-44ed-a365-932f4103b99a-kube-api-access-x495x\") pod \"nova-metadata-0\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " pod="openstack/nova-metadata-0" Dec 03 09:07:29 
crc kubenswrapper[4947]: I1203 09:07:29.907183 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:07:29 crc kubenswrapper[4947]: I1203 09:07:29.938290 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.424567 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.438059 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.440111 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.482667 4947 generic.go:334] "Generic (PLEG): container finished" podID="06b52112-1cfc-45e7-8835-55d0ded3d817" containerID="ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376" exitCode=0 Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.482730 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06b52112-1cfc-45e7-8835-55d0ded3d817","Type":"ContainerDied","Data":"ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376"} Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.482757 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"06b52112-1cfc-45e7-8835-55d0ded3d817","Type":"ContainerDied","Data":"f8869e8e18c119e322a6ceda8eb03c3cb395bf66e422c73372cb8da7601d54b9"} Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.482773 4947 scope.go:117] "RemoveContainer" containerID="ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.482871 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.485173 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ec54e67-38f1-44ed-a365-932f4103b99a","Type":"ContainerStarted","Data":"5d2b02edd8e66701da350262b8f2091071676a08962c089087c1dbc0787cbdc4"} Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.487153 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"212bbca4-059a-419a-acf3-984c087a0f5d","Type":"ContainerStarted","Data":"d2ba4f8abd0004131b977317ba7ba0c141d38ba80b456bf010f2b78d5e34568b"} Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.525003 4947 scope.go:117] "RemoveContainer" containerID="ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376" Dec 03 09:07:30 crc kubenswrapper[4947]: E1203 09:07:30.525463 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376\": container with ID starting with ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376 not found: ID does not exist" containerID="ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.525530 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376"} err="failed to get container status \"ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376\": rpc error: code = NotFound desc = could not find container \"ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376\": container with ID starting with ef4fd6c2704b8a33c01f635a5047514f9ca8388dfe0400cc6acfce73409da376 not found: ID does not exist" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.538742 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-combined-ca-bundle\") pod \"06b52112-1cfc-45e7-8835-55d0ded3d817\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.538966 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx89p\" (UniqueName: \"kubernetes.io/projected/06b52112-1cfc-45e7-8835-55d0ded3d817-kube-api-access-qx89p\") pod \"06b52112-1cfc-45e7-8835-55d0ded3d817\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.539083 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-config-data\") pod \"06b52112-1cfc-45e7-8835-55d0ded3d817\" (UID: \"06b52112-1cfc-45e7-8835-55d0ded3d817\") " Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.543197 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b52112-1cfc-45e7-8835-55d0ded3d817-kube-api-access-qx89p" (OuterVolumeSpecName: "kube-api-access-qx89p") pod "06b52112-1cfc-45e7-8835-55d0ded3d817" (UID: "06b52112-1cfc-45e7-8835-55d0ded3d817"). InnerVolumeSpecName "kube-api-access-qx89p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.567419 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-config-data" (OuterVolumeSpecName: "config-data") pod "06b52112-1cfc-45e7-8835-55d0ded3d817" (UID: "06b52112-1cfc-45e7-8835-55d0ded3d817"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.582610 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06b52112-1cfc-45e7-8835-55d0ded3d817" (UID: "06b52112-1cfc-45e7-8835-55d0ded3d817"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.646981 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.647039 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx89p\" (UniqueName: \"kubernetes.io/projected/06b52112-1cfc-45e7-8835-55d0ded3d817-kube-api-access-qx89p\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.647052 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b52112-1cfc-45e7-8835-55d0ded3d817-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.816933 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.828178 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.842514 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:30 crc kubenswrapper[4947]: E1203 09:07:30.842966 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b52112-1cfc-45e7-8835-55d0ded3d817" containerName="nova-scheduler-scheduler" Dec 03 09:07:30 
crc kubenswrapper[4947]: I1203 09:07:30.842987 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b52112-1cfc-45e7-8835-55d0ded3d817" containerName="nova-scheduler-scheduler" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.843230 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b52112-1cfc-45e7-8835-55d0ded3d817" containerName="nova-scheduler-scheduler" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.843986 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.852701 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.859457 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.952321 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.952420 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpcr6\" (UniqueName: \"kubernetes.io/projected/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-kube-api-access-fpcr6\") pod \"nova-scheduler-0\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:30 crc kubenswrapper[4947]: I1203 09:07:30.952798 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-config-data\") pod \"nova-scheduler-0\" (UID: 
\"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.054772 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpcr6\" (UniqueName: \"kubernetes.io/projected/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-kube-api-access-fpcr6\") pod \"nova-scheduler-0\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.054862 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-config-data\") pod \"nova-scheduler-0\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.054963 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.059197 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-config-data\") pod \"nova-scheduler-0\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.068063 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.070413 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fpcr6\" (UniqueName: \"kubernetes.io/projected/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-kube-api-access-fpcr6\") pod \"nova-scheduler-0\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " pod="openstack/nova-scheduler-0" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.092641 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d" path="/var/lib/kubelet/pods/0436fb4a-a4a0-4671-8e8c-4212fb6f2a4d/volumes" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.093404 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b52112-1cfc-45e7-8835-55d0ded3d817" path="/var/lib/kubelet/pods/06b52112-1cfc-45e7-8835-55d0ded3d817/volumes" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.094040 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cac1187-37b2-46a4-a16e-e818460fb3ae" path="/var/lib/kubelet/pods/9cac1187-37b2-46a4-a16e-e818460fb3ae/volumes" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.228316 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.499614 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ec54e67-38f1-44ed-a365-932f4103b99a","Type":"ContainerStarted","Data":"a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c"} Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.499977 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ec54e67-38f1-44ed-a365-932f4103b99a","Type":"ContainerStarted","Data":"a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d"} Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.502136 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"212bbca4-059a-419a-acf3-984c087a0f5d","Type":"ContainerStarted","Data":"f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73"} Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.502171 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"212bbca4-059a-419a-acf3-984c087a0f5d","Type":"ContainerStarted","Data":"55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573"} Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.526229 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5262104389999998 podStartE2EDuration="2.526210439s" podCreationTimestamp="2025-12-03 09:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:31.51741557 +0000 UTC m=+8312.778369996" watchObservedRunningTime="2025-12-03 09:07:31.526210439 +0000 UTC m=+8312.787164865" Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.540931 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.540915436 podStartE2EDuration="2.540915436s" podCreationTimestamp="2025-12-03 09:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:31.532420787 +0000 UTC m=+8312.793375213" watchObservedRunningTime="2025-12-03 09:07:31.540915436 +0000 UTC m=+8312.801869862" Dec 03 09:07:31 crc kubenswrapper[4947]: W1203 09:07:31.673614 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf74cba1c_b50e_4d6c_9d24_6ff44ce942e6.slice/crio-6fa3d3cbb76901dfc81b3c9ebc3ff5705ceb56fd3dab23b06eb163d9fd58cfc7 WatchSource:0}: Error finding container 6fa3d3cbb76901dfc81b3c9ebc3ff5705ceb56fd3dab23b06eb163d9fd58cfc7: Status 404 returned error can't find the container with id 6fa3d3cbb76901dfc81b3c9ebc3ff5705ceb56fd3dab23b06eb163d9fd58cfc7 Dec 03 09:07:31 crc kubenswrapper[4947]: I1203 09:07:31.675380 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:07:32 crc kubenswrapper[4947]: I1203 09:07:32.514852 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6","Type":"ContainerStarted","Data":"1aced893e95341bb3205979e0183c7c041790dd5f2a3ef9804bb24f8aef24fcf"} Dec 03 09:07:32 crc kubenswrapper[4947]: I1203 09:07:32.515232 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6","Type":"ContainerStarted","Data":"6fa3d3cbb76901dfc81b3c9ebc3ff5705ceb56fd3dab23b06eb163d9fd58cfc7"} Dec 03 09:07:32 crc kubenswrapper[4947]: I1203 09:07:32.541321 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.541301776 podStartE2EDuration="2.541301776s" podCreationTimestamp="2025-12-03 09:07:30 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:32.531676425 +0000 UTC m=+8313.792630841" watchObservedRunningTime="2025-12-03 09:07:32.541301776 +0000 UTC m=+8313.802256212" Dec 03 09:07:34 crc kubenswrapper[4947]: I1203 09:07:34.907553 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 09:07:34 crc kubenswrapper[4947]: I1203 09:07:34.908173 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 09:07:36 crc kubenswrapper[4947]: I1203 09:07:36.229393 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 09:07:39 crc kubenswrapper[4947]: I1203 09:07:39.908106 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 09:07:39 crc kubenswrapper[4947]: I1203 09:07:39.908866 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 09:07:39 crc kubenswrapper[4947]: I1203 09:07:39.939045 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 09:07:39 crc kubenswrapper[4947]: I1203 09:07:39.939110 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 09:07:40 crc kubenswrapper[4947]: I1203 09:07:40.991765 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.136:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:07:41 crc kubenswrapper[4947]: I1203 09:07:41.074820 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.136:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:07:41 crc kubenswrapper[4947]: I1203 09:07:41.075124 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.137:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:07:41 crc kubenswrapper[4947]: I1203 09:07:41.075282 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.137:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:07:41 crc kubenswrapper[4947]: I1203 09:07:41.228997 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 09:07:41 crc kubenswrapper[4947]: I1203 09:07:41.263382 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 09:07:41 crc kubenswrapper[4947]: I1203 09:07:41.666805 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 09:07:49 crc kubenswrapper[4947]: I1203 09:07:49.910850 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 09:07:49 crc kubenswrapper[4947]: I1203 09:07:49.911471 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 09:07:49 crc kubenswrapper[4947]: I1203 09:07:49.913575 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 09:07:49 crc 
kubenswrapper[4947]: I1203 09:07:49.913718 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 09:07:49 crc kubenswrapper[4947]: I1203 09:07:49.944941 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 09:07:49 crc kubenswrapper[4947]: I1203 09:07:49.945457 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 09:07:49 crc kubenswrapper[4947]: I1203 09:07:49.945756 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 09:07:49 crc kubenswrapper[4947]: I1203 09:07:49.950329 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.715751 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.720718 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.909264 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d696ffc89-zrvbs"] Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.911067 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.945706 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-dns-svc\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.945761 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-config\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.945813 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-sb\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.945888 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvl2h\" (UniqueName: \"kubernetes.io/projected/fa5219b3-502b-445f-9b19-66acfef1f54f-kube-api-access-jvl2h\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.945924 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-nb\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" 
(UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:50 crc kubenswrapper[4947]: I1203 09:07:50.947602 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d696ffc89-zrvbs"] Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.047742 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvl2h\" (UniqueName: \"kubernetes.io/projected/fa5219b3-502b-445f-9b19-66acfef1f54f-kube-api-access-jvl2h\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.047817 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-nb\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.047904 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-dns-svc\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.047943 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-config\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.048011 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-sb\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.049079 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-nb\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.049207 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-dns-svc\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.049298 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-config\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.049378 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-sb\") pod \"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.070520 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvl2h\" (UniqueName: \"kubernetes.io/projected/fa5219b3-502b-445f-9b19-66acfef1f54f-kube-api-access-jvl2h\") pod 
\"dnsmasq-dns-5d696ffc89-zrvbs\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.246470 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:51 crc kubenswrapper[4947]: I1203 09:07:51.737687 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d696ffc89-zrvbs"] Dec 03 09:07:51 crc kubenswrapper[4947]: W1203 09:07:51.748404 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa5219b3_502b_445f_9b19_66acfef1f54f.slice/crio-031121cfd47d053897e34a24db6f2b7b9f421b3bf3a203983d7288a8fe5a3cd4 WatchSource:0}: Error finding container 031121cfd47d053897e34a24db6f2b7b9f421b3bf3a203983d7288a8fe5a3cd4: Status 404 returned error can't find the container with id 031121cfd47d053897e34a24db6f2b7b9f421b3bf3a203983d7288a8fe5a3cd4 Dec 03 09:07:52 crc kubenswrapper[4947]: I1203 09:07:52.741190 4947 generic.go:334] "Generic (PLEG): container finished" podID="fa5219b3-502b-445f-9b19-66acfef1f54f" containerID="1abf4f790890780fb413bacc27444951ba9fcab5393acbf407742712eb215b0a" exitCode=0 Dec 03 09:07:52 crc kubenswrapper[4947]: I1203 09:07:52.741259 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" event={"ID":"fa5219b3-502b-445f-9b19-66acfef1f54f","Type":"ContainerDied","Data":"1abf4f790890780fb413bacc27444951ba9fcab5393acbf407742712eb215b0a"} Dec 03 09:07:52 crc kubenswrapper[4947]: I1203 09:07:52.741591 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" event={"ID":"fa5219b3-502b-445f-9b19-66acfef1f54f","Type":"ContainerStarted","Data":"031121cfd47d053897e34a24db6f2b7b9f421b3bf3a203983d7288a8fe5a3cd4"} Dec 03 09:07:53 crc kubenswrapper[4947]: I1203 09:07:53.750884 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" event={"ID":"fa5219b3-502b-445f-9b19-66acfef1f54f","Type":"ContainerStarted","Data":"47e621da3b4dcd9289ff86e6ee9df7f7a8498228880c1aabd3c5360c42c6c304"} Dec 03 09:07:53 crc kubenswrapper[4947]: I1203 09:07:53.751098 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:07:53 crc kubenswrapper[4947]: I1203 09:07:53.771046 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" podStartSLOduration=3.7710207159999998 podStartE2EDuration="3.771020716s" podCreationTimestamp="2025-12-03 09:07:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:07:53.765387924 +0000 UTC m=+8335.026342350" watchObservedRunningTime="2025-12-03 09:07:53.771020716 +0000 UTC m=+8335.031975152" Dec 03 09:07:59 crc kubenswrapper[4947]: I1203 09:07:59.075038 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2608-account-create-update-478sx"] Dec 03 09:07:59 crc kubenswrapper[4947]: I1203 09:07:59.096595 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-6qs67"] Dec 03 09:07:59 crc kubenswrapper[4947]: I1203 09:07:59.104614 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-6qs67"] Dec 03 09:07:59 crc kubenswrapper[4947]: I1203 09:07:59.112787 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2608-account-create-update-478sx"] Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.095790 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6982b993-d2e1-4cba-bf74-0f3ca0366c62" path="/var/lib/kubelet/pods/6982b993-d2e1-4cba-bf74-0f3ca0366c62/volumes" Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.096871 4947 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="cb78927d-aa75-4748-810a-d7866f36be52" path="/var/lib/kubelet/pods/cb78927d-aa75-4748-810a-d7866f36be52/volumes" Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.250017 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.322121 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c478dcc7c-ngls5"] Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.322385 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" podUID="c5aad179-8099-4458-b053-600b58e2b759" containerName="dnsmasq-dns" containerID="cri-o://16a76d14c570ea94fd6772ca2cd0bf8e966411d98e6f4627e6a1d8990c5f5ac4" gracePeriod=10 Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.852934 4947 generic.go:334] "Generic (PLEG): container finished" podID="c5aad179-8099-4458-b053-600b58e2b759" containerID="16a76d14c570ea94fd6772ca2cd0bf8e966411d98e6f4627e6a1d8990c5f5ac4" exitCode=0 Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.852970 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" event={"ID":"c5aad179-8099-4458-b053-600b58e2b759","Type":"ContainerDied","Data":"16a76d14c570ea94fd6772ca2cd0bf8e966411d98e6f4627e6a1d8990c5f5ac4"} Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.853288 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" event={"ID":"c5aad179-8099-4458-b053-600b58e2b759","Type":"ContainerDied","Data":"48adc91a8c0e355d23627be44112e5e35da4426bf134bc07e59c66a61959d058"} Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.853312 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48adc91a8c0e355d23627be44112e5e35da4426bf134bc07e59c66a61959d058" Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 
09:08:01.860561 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.901786 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmwz8\" (UniqueName: \"kubernetes.io/projected/c5aad179-8099-4458-b053-600b58e2b759-kube-api-access-cmwz8\") pod \"c5aad179-8099-4458-b053-600b58e2b759\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.901852 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-config\") pod \"c5aad179-8099-4458-b053-600b58e2b759\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.901932 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-sb\") pod \"c5aad179-8099-4458-b053-600b58e2b759\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.901977 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-dns-svc\") pod \"c5aad179-8099-4458-b053-600b58e2b759\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.902048 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-nb\") pod \"c5aad179-8099-4458-b053-600b58e2b759\" (UID: \"c5aad179-8099-4458-b053-600b58e2b759\") " Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.912750 4947 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5aad179-8099-4458-b053-600b58e2b759-kube-api-access-cmwz8" (OuterVolumeSpecName: "kube-api-access-cmwz8") pod "c5aad179-8099-4458-b053-600b58e2b759" (UID: "c5aad179-8099-4458-b053-600b58e2b759"). InnerVolumeSpecName "kube-api-access-cmwz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.967550 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c5aad179-8099-4458-b053-600b58e2b759" (UID: "c5aad179-8099-4458-b053-600b58e2b759"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.972962 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c5aad179-8099-4458-b053-600b58e2b759" (UID: "c5aad179-8099-4458-b053-600b58e2b759"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:08:01 crc kubenswrapper[4947]: I1203 09:08:01.986912 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-config" (OuterVolumeSpecName: "config") pod "c5aad179-8099-4458-b053-600b58e2b759" (UID: "c5aad179-8099-4458-b053-600b58e2b759"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:08:02 crc kubenswrapper[4947]: I1203 09:08:02.001364 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c5aad179-8099-4458-b053-600b58e2b759" (UID: "c5aad179-8099-4458-b053-600b58e2b759"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:08:02 crc kubenswrapper[4947]: I1203 09:08:02.003575 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmwz8\" (UniqueName: \"kubernetes.io/projected/c5aad179-8099-4458-b053-600b58e2b759-kube-api-access-cmwz8\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:02 crc kubenswrapper[4947]: I1203 09:08:02.003617 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:02 crc kubenswrapper[4947]: I1203 09:08:02.003629 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:02 crc kubenswrapper[4947]: I1203 09:08:02.003638 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:02 crc kubenswrapper[4947]: I1203 09:08:02.003648 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c5aad179-8099-4458-b053-600b58e2b759-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:02 crc kubenswrapper[4947]: I1203 09:08:02.863145 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c478dcc7c-ngls5" Dec 03 09:08:02 crc kubenswrapper[4947]: I1203 09:08:02.901758 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c478dcc7c-ngls5"] Dec 03 09:08:02 crc kubenswrapper[4947]: I1203 09:08:02.913128 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c478dcc7c-ngls5"] Dec 03 09:08:03 crc kubenswrapper[4947]: I1203 09:08:03.102764 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5aad179-8099-4458-b053-600b58e2b759" path="/var/lib/kubelet/pods/c5aad179-8099-4458-b053-600b58e2b759/volumes" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.795634 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64f876b76f-75gvq"] Dec 03 09:08:08 crc kubenswrapper[4947]: E1203 09:08:08.796792 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5aad179-8099-4458-b053-600b58e2b759" containerName="init" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.796805 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5aad179-8099-4458-b053-600b58e2b759" containerName="init" Dec 03 09:08:08 crc kubenswrapper[4947]: E1203 09:08:08.796823 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5aad179-8099-4458-b053-600b58e2b759" containerName="dnsmasq-dns" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.796829 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5aad179-8099-4458-b053-600b58e2b759" containerName="dnsmasq-dns" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.797006 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5aad179-8099-4458-b053-600b58e2b759" containerName="dnsmasq-dns" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.797994 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.801365 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.801553 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-l8fmn" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.801683 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.803215 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.824464 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64f876b76f-75gvq"] Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.866620 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.866887 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7b5cf332-361f-4e92-97e2-575f850e1782" containerName="glance-httpd" containerID="cri-o://b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663" gracePeriod=30 Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.872388 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="7b5cf332-361f-4e92-97e2-575f850e1782" containerName="glance-log" containerID="cri-o://228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07" gracePeriod=30 Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.892231 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b8986b5b9-qhbt7"] Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.893774 4947 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.951210 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-config-data\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.951264 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-scripts\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.951291 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rxpc\" (UniqueName: \"kubernetes.io/projected/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-kube-api-access-4rxpc\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.951433 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-logs\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.951470 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-horizon-secret-key\") pod \"horizon-64f876b76f-75gvq\" 
(UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:08 crc kubenswrapper[4947]: I1203 09:08:08.955076 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b8986b5b9-qhbt7"] Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.002739 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.003419 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f62866b1-13eb-4f40-9be3-168fae89746a" containerName="glance-httpd" containerID="cri-o://a25e5fa470e9b2747046dc6fb3121e3d33d66549e440913c334a3f4509f0a763" gracePeriod=30 Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.003005 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f62866b1-13eb-4f40-9be3-168fae89746a" containerName="glance-log" containerID="cri-o://3f02f4ca39c082bd13b86c307f0171b08ad3d32035f76189a46d15637932dc87" gracePeriod=30 Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.052896 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cfe7be4-097c-4ec4-86f6-13f53431f09d-horizon-secret-key\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.053013 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-config-data\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.053064 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-scripts\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.053087 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rxpc\" (UniqueName: \"kubernetes.io/projected/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-kube-api-access-4rxpc\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.054408 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-scripts\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.054750 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-config-data\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.058776 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cfe7be4-097c-4ec4-86f6-13f53431f09d-logs\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.058833 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-scripts\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.059163 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl47l\" (UniqueName: \"kubernetes.io/projected/1cfe7be4-097c-4ec4-86f6-13f53431f09d-kube-api-access-pl47l\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.059221 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-logs\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.059319 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-horizon-secret-key\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.059393 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-config-data\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.059942 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-logs\") pod 
\"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.070052 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-horizon-secret-key\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.075622 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rxpc\" (UniqueName: \"kubernetes.io/projected/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-kube-api-access-4rxpc\") pod \"horizon-64f876b76f-75gvq\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.124674 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.160612 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-config-data\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.160665 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cfe7be4-097c-4ec4-86f6-13f53431f09d-horizon-secret-key\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.160728 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1cfe7be4-097c-4ec4-86f6-13f53431f09d-logs\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.160746 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-scripts\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.160817 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl47l\" (UniqueName: \"kubernetes.io/projected/1cfe7be4-097c-4ec4-86f6-13f53431f09d-kube-api-access-pl47l\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.161293 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cfe7be4-097c-4ec4-86f6-13f53431f09d-logs\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.161547 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-scripts\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.162010 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-config-data\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " 
pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.165217 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cfe7be4-097c-4ec4-86f6-13f53431f09d-horizon-secret-key\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.181590 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl47l\" (UniqueName: \"kubernetes.io/projected/1cfe7be4-097c-4ec4-86f6-13f53431f09d-kube-api-access-pl47l\") pod \"horizon-b8986b5b9-qhbt7\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.222153 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.540387 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64f876b76f-75gvq"] Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.566784 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c5859c9f9-lxs8f"] Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.568513 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.599544 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c5859c9f9-lxs8f"] Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.653804 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64f876b76f-75gvq"] Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.687689 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz9vm\" (UniqueName: \"kubernetes.io/projected/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-kube-api-access-gz9vm\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.687768 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-logs\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.687911 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-config-data\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.687969 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-scripts\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc 
kubenswrapper[4947]: I1203 09:08:09.688202 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-horizon-secret-key\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: W1203 09:08:09.765408 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cfe7be4_097c_4ec4_86f6_13f53431f09d.slice/crio-5f58192a65cb1110bf852b6adea49ac9f5234cabee2e58a18a3c71036787e6d0 WatchSource:0}: Error finding container 5f58192a65cb1110bf852b6adea49ac9f5234cabee2e58a18a3c71036787e6d0: Status 404 returned error can't find the container with id 5f58192a65cb1110bf852b6adea49ac9f5234cabee2e58a18a3c71036787e6d0 Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.767472 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b8986b5b9-qhbt7"] Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.789626 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-scripts\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.789753 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-horizon-secret-key\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.789848 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gz9vm\" (UniqueName: \"kubernetes.io/projected/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-kube-api-access-gz9vm\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.789887 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-logs\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.789929 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-config-data\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.790595 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-scripts\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.791022 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-logs\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.791288 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-config-data\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: 
\"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.794822 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-horizon-secret-key\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.806191 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz9vm\" (UniqueName: \"kubernetes.io/projected/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-kube-api-access-gz9vm\") pod \"horizon-5c5859c9f9-lxs8f\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.927225 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.969264 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8986b5b9-qhbt7" event={"ID":"1cfe7be4-097c-4ec4-86f6-13f53431f09d","Type":"ContainerStarted","Data":"5f58192a65cb1110bf852b6adea49ac9f5234cabee2e58a18a3c71036787e6d0"} Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.972415 4947 generic.go:334] "Generic (PLEG): container finished" podID="f62866b1-13eb-4f40-9be3-168fae89746a" containerID="3f02f4ca39c082bd13b86c307f0171b08ad3d32035f76189a46d15637932dc87" exitCode=143 Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.972477 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62866b1-13eb-4f40-9be3-168fae89746a","Type":"ContainerDied","Data":"3f02f4ca39c082bd13b86c307f0171b08ad3d32035f76189a46d15637932dc87"} Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.977120 4947 generic.go:334] 
"Generic (PLEG): container finished" podID="7b5cf332-361f-4e92-97e2-575f850e1782" containerID="228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07" exitCode=143 Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.977214 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5cf332-361f-4e92-97e2-575f850e1782","Type":"ContainerDied","Data":"228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07"} Dec 03 09:08:09 crc kubenswrapper[4947]: I1203 09:08:09.982712 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f876b76f-75gvq" event={"ID":"a42ec7f7-5870-4d92-8d5f-dc5350cb281c","Type":"ContainerStarted","Data":"4084119244b97fb06f1285b870e7040f21d8e56737a0098f1ad11fbcb855cc09"} Dec 03 09:08:10 crc kubenswrapper[4947]: I1203 09:08:10.428268 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c5859c9f9-lxs8f"] Dec 03 09:08:10 crc kubenswrapper[4947]: I1203 09:08:10.999389 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c5859c9f9-lxs8f" event={"ID":"824c9c5f-7c1e-4f88-9a92-70b10b1945ab","Type":"ContainerStarted","Data":"aff3db99e71cc1ac4d4482bead09121fbbb8c7b608436c89a02a8ea67bd235cb"} Dec 03 09:08:11 crc kubenswrapper[4947]: I1203 09:08:11.049447 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-5mn24"] Dec 03 09:08:11 crc kubenswrapper[4947]: I1203 09:08:11.062868 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-5mn24"] Dec 03 09:08:11 crc kubenswrapper[4947]: I1203 09:08:11.102997 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d528b7ff-54f5-4220-ad84-035b6dfc19ce" path="/var/lib/kubelet/pods/d528b7ff-54f5-4220-ad84-035b6dfc19ce/volumes" Dec 03 09:08:12 crc kubenswrapper[4947]: I1203 09:08:12.921686 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.023910 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-logs\") pod \"7b5cf332-361f-4e92-97e2-575f850e1782\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.024036 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-scripts\") pod \"7b5cf332-361f-4e92-97e2-575f850e1782\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.024174 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-httpd-run\") pod \"7b5cf332-361f-4e92-97e2-575f850e1782\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.024359 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxdfd\" (UniqueName: \"kubernetes.io/projected/7b5cf332-361f-4e92-97e2-575f850e1782-kube-api-access-rxdfd\") pod \"7b5cf332-361f-4e92-97e2-575f850e1782\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.024596 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-config-data\") pod \"7b5cf332-361f-4e92-97e2-575f850e1782\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.024641 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-combined-ca-bundle\") pod \"7b5cf332-361f-4e92-97e2-575f850e1782\" (UID: \"7b5cf332-361f-4e92-97e2-575f850e1782\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.030326 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7b5cf332-361f-4e92-97e2-575f850e1782" (UID: "7b5cf332-361f-4e92-97e2-575f850e1782"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.032639 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-logs" (OuterVolumeSpecName: "logs") pod "7b5cf332-361f-4e92-97e2-575f850e1782" (UID: "7b5cf332-361f-4e92-97e2-575f850e1782"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.041836 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5cf332-361f-4e92-97e2-575f850e1782-kube-api-access-rxdfd" (OuterVolumeSpecName: "kube-api-access-rxdfd") pod "7b5cf332-361f-4e92-97e2-575f850e1782" (UID: "7b5cf332-361f-4e92-97e2-575f850e1782"). InnerVolumeSpecName "kube-api-access-rxdfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.041997 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-scripts" (OuterVolumeSpecName: "scripts") pod "7b5cf332-361f-4e92-97e2-575f850e1782" (UID: "7b5cf332-361f-4e92-97e2-575f850e1782"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.058084 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b5cf332-361f-4e92-97e2-575f850e1782" (UID: "7b5cf332-361f-4e92-97e2-575f850e1782"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.061314 4947 generic.go:334] "Generic (PLEG): container finished" podID="f62866b1-13eb-4f40-9be3-168fae89746a" containerID="a25e5fa470e9b2747046dc6fb3121e3d33d66549e440913c334a3f4509f0a763" exitCode=0 Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.061537 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62866b1-13eb-4f40-9be3-168fae89746a","Type":"ContainerDied","Data":"a25e5fa470e9b2747046dc6fb3121e3d33d66549e440913c334a3f4509f0a763"} Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.066631 4947 generic.go:334] "Generic (PLEG): container finished" podID="7b5cf332-361f-4e92-97e2-575f850e1782" containerID="b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663" exitCode=0 Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.066661 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.066696 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5cf332-361f-4e92-97e2-575f850e1782","Type":"ContainerDied","Data":"b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663"} Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.066732 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7b5cf332-361f-4e92-97e2-575f850e1782","Type":"ContainerDied","Data":"4e429bcd3a8eb0b1f61156fb7cbe45a1ae2d7dbf2073c36cfa336f5e4b0acea1"} Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.066779 4947 scope.go:117] "RemoveContainer" containerID="b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.126820 4947 scope.go:117] "RemoveContainer" containerID="228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.128174 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxdfd\" (UniqueName: \"kubernetes.io/projected/7b5cf332-361f-4e92-97e2-575f850e1782-kube-api-access-rxdfd\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.128215 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.128228 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.128239 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.128250 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b5cf332-361f-4e92-97e2-575f850e1782-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.133364 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-config-data" (OuterVolumeSpecName: "config-data") pod "7b5cf332-361f-4e92-97e2-575f850e1782" (UID: "7b5cf332-361f-4e92-97e2-575f850e1782"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.186744 4947 scope.go:117] "RemoveContainer" containerID="b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663" Dec 03 09:08:13 crc kubenswrapper[4947]: E1203 09:08:13.187247 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663\": container with ID starting with b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663 not found: ID does not exist" containerID="b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.187297 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663"} err="failed to get container status \"b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663\": rpc error: code = NotFound desc = could not find container \"b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663\": container with ID starting with 
b3e52658147c693a35a77619f358007f0cf09e4514fd8b7e864349283ddf8663 not found: ID does not exist" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.187317 4947 scope.go:117] "RemoveContainer" containerID="228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07" Dec 03 09:08:13 crc kubenswrapper[4947]: E1203 09:08:13.187588 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07\": container with ID starting with 228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07 not found: ID does not exist" containerID="228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.187658 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07"} err="failed to get container status \"228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07\": rpc error: code = NotFound desc = could not find container \"228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07\": container with ID starting with 228f0368e624c00692128d283e2dbe53f23cc91ba07b76bd6c7e957d9c237b07 not found: ID does not exist" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.239225 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5cf332-361f-4e92-97e2-575f850e1782-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.378026 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.442163 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvrww\" (UniqueName: \"kubernetes.io/projected/f62866b1-13eb-4f40-9be3-168fae89746a-kube-api-access-mvrww\") pod \"f62866b1-13eb-4f40-9be3-168fae89746a\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.443299 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-logs\") pod \"f62866b1-13eb-4f40-9be3-168fae89746a\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.443346 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-config-data\") pod \"f62866b1-13eb-4f40-9be3-168fae89746a\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.443628 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-combined-ca-bundle\") pod \"f62866b1-13eb-4f40-9be3-168fae89746a\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.443657 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-httpd-run\") pod \"f62866b1-13eb-4f40-9be3-168fae89746a\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.443714 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-scripts\") pod \"f62866b1-13eb-4f40-9be3-168fae89746a\" (UID: \"f62866b1-13eb-4f40-9be3-168fae89746a\") " Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.444145 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-logs" (OuterVolumeSpecName: "logs") pod "f62866b1-13eb-4f40-9be3-168fae89746a" (UID: "f62866b1-13eb-4f40-9be3-168fae89746a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.445145 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.445193 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.447795 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f62866b1-13eb-4f40-9be3-168fae89746a" (UID: "f62866b1-13eb-4f40-9be3-168fae89746a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.488470 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.494039 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f62866b1-13eb-4f40-9be3-168fae89746a-kube-api-access-mvrww" (OuterVolumeSpecName: "kube-api-access-mvrww") pod "f62866b1-13eb-4f40-9be3-168fae89746a" (UID: "f62866b1-13eb-4f40-9be3-168fae89746a"). InnerVolumeSpecName "kube-api-access-mvrww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.495594 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 09:08:13 crc kubenswrapper[4947]: E1203 09:08:13.496325 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5cf332-361f-4e92-97e2-575f850e1782" containerName="glance-log" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.496344 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5cf332-361f-4e92-97e2-575f850e1782" containerName="glance-log" Dec 03 09:08:13 crc kubenswrapper[4947]: E1203 09:08:13.496368 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62866b1-13eb-4f40-9be3-168fae89746a" containerName="glance-log" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.496376 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62866b1-13eb-4f40-9be3-168fae89746a" containerName="glance-log" Dec 03 09:08:13 crc kubenswrapper[4947]: E1203 09:08:13.496409 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f62866b1-13eb-4f40-9be3-168fae89746a" containerName="glance-httpd" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.496417 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f62866b1-13eb-4f40-9be3-168fae89746a" containerName="glance-httpd" Dec 03 09:08:13 crc kubenswrapper[4947]: E1203 09:08:13.496438 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5cf332-361f-4e92-97e2-575f850e1782" containerName="glance-httpd" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.496444 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5cf332-361f-4e92-97e2-575f850e1782" containerName="glance-httpd" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.496716 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5cf332-361f-4e92-97e2-575f850e1782" containerName="glance-log" Dec 03 09:08:13 crc 
kubenswrapper[4947]: I1203 09:08:13.496730 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62866b1-13eb-4f40-9be3-168fae89746a" containerName="glance-log" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.496742 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f62866b1-13eb-4f40-9be3-168fae89746a" containerName="glance-httpd" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.496764 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5cf332-361f-4e92-97e2-575f850e1782" containerName="glance-httpd" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.500761 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-scripts" (OuterVolumeSpecName: "scripts") pod "f62866b1-13eb-4f40-9be3-168fae89746a" (UID: "f62866b1-13eb-4f40-9be3-168fae89746a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.506842 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f62866b1-13eb-4f40-9be3-168fae89746a" (UID: "f62866b1-13eb-4f40-9be3-168fae89746a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.512553 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.514746 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.532701 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.547300 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.547330 4947 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f62866b1-13eb-4f40-9be3-168fae89746a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.547339 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.547348 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvrww\" (UniqueName: \"kubernetes.io/projected/f62866b1-13eb-4f40-9be3-168fae89746a-kube-api-access-mvrww\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.614229 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-config-data" (OuterVolumeSpecName: "config-data") pod "f62866b1-13eb-4f40-9be3-168fae89746a" (UID: "f62866b1-13eb-4f40-9be3-168fae89746a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.649669 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c34849-5902-436a-965e-5de5d52d6853-config-data\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.649766 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbm82\" (UniqueName: \"kubernetes.io/projected/50c34849-5902-436a-965e-5de5d52d6853-kube-api-access-vbm82\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.649829 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50c34849-5902-436a-965e-5de5d52d6853-logs\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.649877 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c34849-5902-436a-965e-5de5d52d6853-scripts\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.649905 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50c34849-5902-436a-965e-5de5d52d6853-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.649932 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c34849-5902-436a-965e-5de5d52d6853-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.650054 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f62866b1-13eb-4f40-9be3-168fae89746a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.752514 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c34849-5902-436a-965e-5de5d52d6853-config-data\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.752585 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbm82\" (UniqueName: \"kubernetes.io/projected/50c34849-5902-436a-965e-5de5d52d6853-kube-api-access-vbm82\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.752841 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50c34849-5902-436a-965e-5de5d52d6853-logs\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.752904 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c34849-5902-436a-965e-5de5d52d6853-scripts\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.752948 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50c34849-5902-436a-965e-5de5d52d6853-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.752983 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50c34849-5902-436a-965e-5de5d52d6853-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.754081 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50c34849-5902-436a-965e-5de5d52d6853-logs\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.755923 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50c34849-5902-436a-965e-5de5d52d6853-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.760568 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/50c34849-5902-436a-965e-5de5d52d6853-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.761925 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50c34849-5902-436a-965e-5de5d52d6853-scripts\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.762723 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50c34849-5902-436a-965e-5de5d52d6853-config-data\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.778319 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbm82\" (UniqueName: \"kubernetes.io/projected/50c34849-5902-436a-965e-5de5d52d6853-kube-api-access-vbm82\") pod \"glance-default-external-api-0\" (UID: \"50c34849-5902-436a-965e-5de5d52d6853\") " pod="openstack/glance-default-external-api-0" Dec 03 09:08:13 crc kubenswrapper[4947]: I1203 09:08:13.905828 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.090831 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f62866b1-13eb-4f40-9be3-168fae89746a","Type":"ContainerDied","Data":"2f142053a94faa0d43149efd12c97f338acec5df471901bab25e45845b32ec57"} Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.090877 4947 scope.go:117] "RemoveContainer" containerID="a25e5fa470e9b2747046dc6fb3121e3d33d66549e440913c334a3f4509f0a763" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.090963 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.162924 4947 scope.go:117] "RemoveContainer" containerID="3f02f4ca39c082bd13b86c307f0171b08ad3d32035f76189a46d15637932dc87" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.166157 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.183555 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.195894 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.197840 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.202250 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.247467 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.269162 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05aacc20-a81e-4e25-88f4-518cf128bcab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.269253 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05aacc20-a81e-4e25-88f4-518cf128bcab-logs\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.269309 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05aacc20-a81e-4e25-88f4-518cf128bcab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.269338 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05aacc20-a81e-4e25-88f4-518cf128bcab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.269416 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05aacc20-a81e-4e25-88f4-518cf128bcab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.269502 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnp8t\" (UniqueName: \"kubernetes.io/projected/05aacc20-a81e-4e25-88f4-518cf128bcab-kube-api-access-gnp8t\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.376924 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnp8t\" (UniqueName: \"kubernetes.io/projected/05aacc20-a81e-4e25-88f4-518cf128bcab-kube-api-access-gnp8t\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.377286 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05aacc20-a81e-4e25-88f4-518cf128bcab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.377840 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05aacc20-a81e-4e25-88f4-518cf128bcab-logs\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.377898 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05aacc20-a81e-4e25-88f4-518cf128bcab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.377925 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05aacc20-a81e-4e25-88f4-518cf128bcab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.378261 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05aacc20-a81e-4e25-88f4-518cf128bcab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.378335 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05aacc20-a81e-4e25-88f4-518cf128bcab-logs\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.378550 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05aacc20-a81e-4e25-88f4-518cf128bcab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 
09:08:14.388673 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05aacc20-a81e-4e25-88f4-518cf128bcab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.394810 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05aacc20-a81e-4e25-88f4-518cf128bcab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.395336 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05aacc20-a81e-4e25-88f4-518cf128bcab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.415089 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnp8t\" (UniqueName: \"kubernetes.io/projected/05aacc20-a81e-4e25-88f4-518cf128bcab-kube-api-access-gnp8t\") pod \"glance-default-internal-api-0\" (UID: \"05aacc20-a81e-4e25-88f4-518cf128bcab\") " pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.586270 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:14 crc kubenswrapper[4947]: I1203 09:08:14.612525 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 09:08:15 crc kubenswrapper[4947]: I1203 09:08:15.143718 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5cf332-361f-4e92-97e2-575f850e1782" path="/var/lib/kubelet/pods/7b5cf332-361f-4e92-97e2-575f850e1782/volumes" Dec 03 09:08:15 crc kubenswrapper[4947]: I1203 09:08:15.145325 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f62866b1-13eb-4f40-9be3-168fae89746a" path="/var/lib/kubelet/pods/f62866b1-13eb-4f40-9be3-168fae89746a/volumes" Dec 03 09:08:16 crc kubenswrapper[4947]: I1203 09:08:16.514160 4947 scope.go:117] "RemoveContainer" containerID="7a43acc78b9ca7b20f0c5f0fffe44f9d9b22480e88012dc6a043a6c344ca162c" Dec 03 09:08:19 crc kubenswrapper[4947]: I1203 09:08:19.951612 4947 scope.go:117] "RemoveContainer" containerID="8cdb3a42d8e864a9e5fe82396e8a4bd7af0fa5455af8950d1dcc63efc5d302bd" Dec 03 09:08:20 crc kubenswrapper[4947]: I1203 09:08:20.085103 4947 scope.go:117] "RemoveContainer" containerID="97b742b625c5f67d43a9de58f2b495fe557433fa21e4d4003c8c1a89c20f24da" Dec 03 09:08:20 crc kubenswrapper[4947]: I1203 09:08:20.220561 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50c34849-5902-436a-965e-5de5d52d6853","Type":"ContainerStarted","Data":"0262a13fa8928197119c305dfba70af4a47efb1423481ef3e4a49802d420d323"} Dec 03 09:08:20 crc kubenswrapper[4947]: I1203 09:08:20.609180 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.262893 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f876b76f-75gvq" 
event={"ID":"a42ec7f7-5870-4d92-8d5f-dc5350cb281c","Type":"ContainerStarted","Data":"0ce794639d5e990a72f8ffebf429004823b413040d15118b9d81920f28f10704"} Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.263223 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f876b76f-75gvq" event={"ID":"a42ec7f7-5870-4d92-8d5f-dc5350cb281c","Type":"ContainerStarted","Data":"3e0f99cf5d76014a4b250d0b2d0992085060204ff2350124a2c18cfedde0ecee"} Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.263293 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64f876b76f-75gvq" podUID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerName="horizon-log" containerID="cri-o://3e0f99cf5d76014a4b250d0b2d0992085060204ff2350124a2c18cfedde0ecee" gracePeriod=30 Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.263461 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64f876b76f-75gvq" podUID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerName="horizon" containerID="cri-o://0ce794639d5e990a72f8ffebf429004823b413040d15118b9d81920f28f10704" gracePeriod=30 Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.269890 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"05aacc20-a81e-4e25-88f4-518cf128bcab","Type":"ContainerStarted","Data":"08ffe8865c9da1ac740ff5df902f000e51f10f6260e35f5d40eeb7d91c3b9f79"} Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.278794 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8986b5b9-qhbt7" event={"ID":"1cfe7be4-097c-4ec4-86f6-13f53431f09d","Type":"ContainerStarted","Data":"9fb32a3c79132b0ef0344e7f97774b458f0665680409c28228b8bbf1d2f20e98"} Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.278849 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8986b5b9-qhbt7" 
event={"ID":"1cfe7be4-097c-4ec4-86f6-13f53431f09d","Type":"ContainerStarted","Data":"5b91fbae315a36893d9f232fc5068a2d1beb120078190e2a1dbd05cb12668420"} Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.281256 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c5859c9f9-lxs8f" event={"ID":"824c9c5f-7c1e-4f88-9a92-70b10b1945ab","Type":"ContainerStarted","Data":"351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3"} Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.281299 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c5859c9f9-lxs8f" event={"ID":"824c9c5f-7c1e-4f88-9a92-70b10b1945ab","Type":"ContainerStarted","Data":"88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc"} Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.290717 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50c34849-5902-436a-965e-5de5d52d6853","Type":"ContainerStarted","Data":"55e948041c6384b11d8096579a5ff5419c1138f08f289d0e33ec78efc1ad5de4"} Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.292475 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64f876b76f-75gvq" podStartSLOduration=2.7765492480000002 podStartE2EDuration="13.292454878s" podCreationTimestamp="2025-12-03 09:08:08 +0000 UTC" firstStartedPulling="2025-12-03 09:08:09.661015265 +0000 UTC m=+8350.921969701" lastFinishedPulling="2025-12-03 09:08:20.176920915 +0000 UTC m=+8361.437875331" observedRunningTime="2025-12-03 09:08:21.284270686 +0000 UTC m=+8362.545225122" watchObservedRunningTime="2025-12-03 09:08:21.292454878 +0000 UTC m=+8362.553409314" Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.325955 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c5859c9f9-lxs8f" podStartSLOduration=2.588247803 podStartE2EDuration="12.325934663s" podCreationTimestamp="2025-12-03 
09:08:09 +0000 UTC" firstStartedPulling="2025-12-03 09:08:10.438819253 +0000 UTC m=+8351.699773679" lastFinishedPulling="2025-12-03 09:08:20.176506113 +0000 UTC m=+8361.437460539" observedRunningTime="2025-12-03 09:08:21.314958066 +0000 UTC m=+8362.575912522" watchObservedRunningTime="2025-12-03 09:08:21.325934663 +0000 UTC m=+8362.586889099" Dec 03 09:08:21 crc kubenswrapper[4947]: I1203 09:08:21.357804 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b8986b5b9-qhbt7" podStartSLOduration=2.949897739 podStartE2EDuration="13.357472117s" podCreationTimestamp="2025-12-03 09:08:08 +0000 UTC" firstStartedPulling="2025-12-03 09:08:09.768944575 +0000 UTC m=+8351.029899001" lastFinishedPulling="2025-12-03 09:08:20.176518963 +0000 UTC m=+8361.437473379" observedRunningTime="2025-12-03 09:08:21.344897207 +0000 UTC m=+8362.605851633" watchObservedRunningTime="2025-12-03 09:08:21.357472117 +0000 UTC m=+8362.618426543" Dec 03 09:08:22 crc kubenswrapper[4947]: I1203 09:08:22.306384 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"05aacc20-a81e-4e25-88f4-518cf128bcab","Type":"ContainerStarted","Data":"af4134d3397bc1cf9d22fd36c01bbc82ccb938a6ec7ca533f294ddc2780fd59e"} Dec 03 09:08:22 crc kubenswrapper[4947]: I1203 09:08:22.307046 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"05aacc20-a81e-4e25-88f4-518cf128bcab","Type":"ContainerStarted","Data":"289509453cb54da4244d33220ae5736433dd7b43e33569e5c71e0e271c179403"} Dec 03 09:08:22 crc kubenswrapper[4947]: I1203 09:08:22.310239 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50c34849-5902-436a-965e-5de5d52d6853","Type":"ContainerStarted","Data":"776a239a0a8a0c9bad76ce26998ebe13f47af2c6944a5101d6b316734524841a"} Dec 03 09:08:22 crc kubenswrapper[4947]: I1203 09:08:22.338289 4947 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.338271905 podStartE2EDuration="8.338271905s" podCreationTimestamp="2025-12-03 09:08:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:08:22.327319509 +0000 UTC m=+8363.588273945" watchObservedRunningTime="2025-12-03 09:08:22.338271905 +0000 UTC m=+8363.599226331" Dec 03 09:08:22 crc kubenswrapper[4947]: I1203 09:08:22.356566 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.35654926 podStartE2EDuration="9.35654926s" podCreationTimestamp="2025-12-03 09:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:08:22.349968602 +0000 UTC m=+8363.610923038" watchObservedRunningTime="2025-12-03 09:08:22.35654926 +0000 UTC m=+8363.617503686" Dec 03 09:08:23 crc kubenswrapper[4947]: I1203 09:08:23.905972 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 09:08:23 crc kubenswrapper[4947]: I1203 09:08:23.906331 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 09:08:23 crc kubenswrapper[4947]: I1203 09:08:23.965235 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 09:08:23 crc kubenswrapper[4947]: I1203 09:08:23.985871 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 09:08:24 crc kubenswrapper[4947]: I1203 09:08:24.042361 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-swh7v"] Dec 03 09:08:24 crc 
kubenswrapper[4947]: I1203 09:08:24.055462 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-swh7v"] Dec 03 09:08:24 crc kubenswrapper[4947]: I1203 09:08:24.330079 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 09:08:24 crc kubenswrapper[4947]: I1203 09:08:24.330214 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 09:08:24 crc kubenswrapper[4947]: I1203 09:08:24.587296 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:24 crc kubenswrapper[4947]: I1203 09:08:24.587371 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:24 crc kubenswrapper[4947]: I1203 09:08:24.630678 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:24 crc kubenswrapper[4947]: I1203 09:08:24.644597 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:25 crc kubenswrapper[4947]: I1203 09:08:25.099387 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42bb363a-5b68-4d1d-8ca6-dbaa2592da72" path="/var/lib/kubelet/pods/42bb363a-5b68-4d1d-8ca6-dbaa2592da72/volumes" Dec 03 09:08:25 crc kubenswrapper[4947]: I1203 09:08:25.367201 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:25 crc kubenswrapper[4947]: I1203 09:08:25.367916 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:28 crc kubenswrapper[4947]: I1203 09:08:28.298227 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Dec 03 09:08:28 crc kubenswrapper[4947]: I1203 09:08:28.513760 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:28 crc kubenswrapper[4947]: I1203 09:08:28.676283 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 09:08:29 crc kubenswrapper[4947]: I1203 09:08:29.125178 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:29 crc kubenswrapper[4947]: I1203 09:08:29.223320 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:29 crc kubenswrapper[4947]: I1203 09:08:29.224731 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:29 crc kubenswrapper[4947]: I1203 09:08:29.928024 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:29 crc kubenswrapper[4947]: I1203 09:08:29.928379 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:30 crc kubenswrapper[4947]: I1203 09:08:30.086641 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:08:30 crc kubenswrapper[4947]: I1203 09:08:30.086694 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 03 09:08:30 crc kubenswrapper[4947]: I1203 09:08:30.841234 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 09:08:39 crc kubenswrapper[4947]: I1203 09:08:39.225309 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b8986b5b9-qhbt7" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.141:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.141:8080: connect: connection refused" Dec 03 09:08:39 crc kubenswrapper[4947]: I1203 09:08:39.929161 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5c5859c9f9-lxs8f" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.142:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.142:8080: connect: connection refused" Dec 03 09:08:52 crc kubenswrapper[4947]: I1203 09:08:52.484240 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:52 crc kubenswrapper[4947]: I1203 09:08:52.498964 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:52 crc kubenswrapper[4947]: I1203 09:08:52.771990 4947 generic.go:334] "Generic (PLEG): container finished" podID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerID="0ce794639d5e990a72f8ffebf429004823b413040d15118b9d81920f28f10704" exitCode=137 Dec 03 09:08:52 crc kubenswrapper[4947]: I1203 09:08:52.772022 4947 generic.go:334] "Generic (PLEG): container finished" podID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerID="3e0f99cf5d76014a4b250d0b2d0992085060204ff2350124a2c18cfedde0ecee" exitCode=137 Dec 03 09:08:52 crc kubenswrapper[4947]: I1203 09:08:52.772039 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-64f876b76f-75gvq" event={"ID":"a42ec7f7-5870-4d92-8d5f-dc5350cb281c","Type":"ContainerDied","Data":"0ce794639d5e990a72f8ffebf429004823b413040d15118b9d81920f28f10704"} Dec 03 09:08:52 crc kubenswrapper[4947]: I1203 09:08:52.772067 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f876b76f-75gvq" event={"ID":"a42ec7f7-5870-4d92-8d5f-dc5350cb281c","Type":"ContainerDied","Data":"3e0f99cf5d76014a4b250d0b2d0992085060204ff2350124a2c18cfedde0ecee"} Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.784612 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f876b76f-75gvq" event={"ID":"a42ec7f7-5870-4d92-8d5f-dc5350cb281c","Type":"ContainerDied","Data":"4084119244b97fb06f1285b870e7040f21d8e56737a0098f1ad11fbcb855cc09"} Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.785212 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4084119244b97fb06f1285b870e7040f21d8e56737a0098f1ad11fbcb855cc09" Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.816909 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.953143 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rxpc\" (UniqueName: \"kubernetes.io/projected/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-kube-api-access-4rxpc\") pod \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.953262 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-horizon-secret-key\") pod \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.953301 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-logs\") pod \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.953457 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-scripts\") pod \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.953611 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-config-data\") pod \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\" (UID: \"a42ec7f7-5870-4d92-8d5f-dc5350cb281c\") " Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.954342 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-logs" (OuterVolumeSpecName: "logs") pod "a42ec7f7-5870-4d92-8d5f-dc5350cb281c" (UID: "a42ec7f7-5870-4d92-8d5f-dc5350cb281c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.965925 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a42ec7f7-5870-4d92-8d5f-dc5350cb281c" (UID: "a42ec7f7-5870-4d92-8d5f-dc5350cb281c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.966069 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-kube-api-access-4rxpc" (OuterVolumeSpecName: "kube-api-access-4rxpc") pod "a42ec7f7-5870-4d92-8d5f-dc5350cb281c" (UID: "a42ec7f7-5870-4d92-8d5f-dc5350cb281c"). InnerVolumeSpecName "kube-api-access-4rxpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.978944 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-scripts" (OuterVolumeSpecName: "scripts") pod "a42ec7f7-5870-4d92-8d5f-dc5350cb281c" (UID: "a42ec7f7-5870-4d92-8d5f-dc5350cb281c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:08:53 crc kubenswrapper[4947]: I1203 09:08:53.980323 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-config-data" (OuterVolumeSpecName: "config-data") pod "a42ec7f7-5870-4d92-8d5f-dc5350cb281c" (UID: "a42ec7f7-5870-4d92-8d5f-dc5350cb281c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.055813 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.055864 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rxpc\" (UniqueName: \"kubernetes.io/projected/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-kube-api-access-4rxpc\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.055874 4947 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.055883 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.055892 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42ec7f7-5870-4d92-8d5f-dc5350cb281c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.543093 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.552885 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.661228 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b8986b5b9-qhbt7"] Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.815061 4947 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/horizon-b8986b5b9-qhbt7" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon-log" containerID="cri-o://5b91fbae315a36893d9f232fc5068a2d1beb120078190e2a1dbd05cb12668420" gracePeriod=30 Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.815247 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64f876b76f-75gvq" Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.816124 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b8986b5b9-qhbt7" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon" containerID="cri-o://9fb32a3c79132b0ef0344e7f97774b458f0665680409c28228b8bbf1d2f20e98" gracePeriod=30 Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.896557 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64f876b76f-75gvq"] Dec 03 09:08:54 crc kubenswrapper[4947]: I1203 09:08:54.907127 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64f876b76f-75gvq"] Dec 03 09:08:55 crc kubenswrapper[4947]: I1203 09:08:55.099628 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" path="/var/lib/kubelet/pods/a42ec7f7-5870-4d92-8d5f-dc5350cb281c/volumes" Dec 03 09:08:58 crc kubenswrapper[4947]: I1203 09:08:58.858852 4947 generic.go:334] "Generic (PLEG): container finished" podID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerID="9fb32a3c79132b0ef0344e7f97774b458f0665680409c28228b8bbf1d2f20e98" exitCode=0 Dec 03 09:08:58 crc kubenswrapper[4947]: I1203 09:08:58.858949 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8986b5b9-qhbt7" event={"ID":"1cfe7be4-097c-4ec4-86f6-13f53431f09d","Type":"ContainerDied","Data":"9fb32a3c79132b0ef0344e7f97774b458f0665680409c28228b8bbf1d2f20e98"} Dec 03 09:08:59 crc kubenswrapper[4947]: I1203 09:08:59.224131 4947 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openstack/horizon-b8986b5b9-qhbt7" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.141:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.141:8080: connect: connection refused" Dec 03 09:09:00 crc kubenswrapper[4947]: I1203 09:09:00.086152 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:09:00 crc kubenswrapper[4947]: I1203 09:09:00.086226 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:09:09 crc kubenswrapper[4947]: I1203 09:09:09.223652 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b8986b5b9-qhbt7" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.141:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.141:8080: connect: connection refused" Dec 03 09:09:19 crc kubenswrapper[4947]: I1203 09:09:19.224013 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-b8986b5b9-qhbt7" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.141:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.141:8080: connect: connection refused" Dec 03 09:09:19 crc kubenswrapper[4947]: I1203 09:09:19.228243 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:09:20 crc kubenswrapper[4947]: I1203 09:09:20.567271 4947 scope.go:117] "RemoveContainer" containerID="8c5f7b20e44ebba8cce5c75b43e002c4af847e70ca8be82b2f7461cbf264ff46" Dec 03 09:09:20 crc kubenswrapper[4947]: I1203 09:09:20.600831 4947 scope.go:117] "RemoveContainer" containerID="fe60c4bc59b5523f41907aebb468432dd68a5b3ca714711f3fe9b20206b7095f" Dec 03 09:09:20 crc kubenswrapper[4947]: I1203 09:09:20.649714 4947 scope.go:117] "RemoveContainer" containerID="5b4a32d406d166f1285874a8bc32eb5fa91a3c2e3fb0a4cb08914279a423f539" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.175235 4947 generic.go:334] "Generic (PLEG): container finished" podID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerID="5b91fbae315a36893d9f232fc5068a2d1beb120078190e2a1dbd05cb12668420" exitCode=137 Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.175344 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8986b5b9-qhbt7" event={"ID":"1cfe7be4-097c-4ec4-86f6-13f53431f09d","Type":"ContainerDied","Data":"5b91fbae315a36893d9f232fc5068a2d1beb120078190e2a1dbd05cb12668420"} Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.176416 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b8986b5b9-qhbt7" event={"ID":"1cfe7be4-097c-4ec4-86f6-13f53431f09d","Type":"ContainerDied","Data":"5f58192a65cb1110bf852b6adea49ac9f5234cabee2e58a18a3c71036787e6d0"} Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.176434 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f58192a65cb1110bf852b6adea49ac9f5234cabee2e58a18a3c71036787e6d0" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.244849 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.379222 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cfe7be4-097c-4ec4-86f6-13f53431f09d-logs\") pod \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.379312 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cfe7be4-097c-4ec4-86f6-13f53431f09d-horizon-secret-key\") pod \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.379355 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-config-data\") pod \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.379430 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-scripts\") pod \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.379477 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pl47l\" (UniqueName: \"kubernetes.io/projected/1cfe7be4-097c-4ec4-86f6-13f53431f09d-kube-api-access-pl47l\") pod \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\" (UID: \"1cfe7be4-097c-4ec4-86f6-13f53431f09d\") " Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.383038 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1cfe7be4-097c-4ec4-86f6-13f53431f09d-logs" (OuterVolumeSpecName: "logs") pod "1cfe7be4-097c-4ec4-86f6-13f53431f09d" (UID: "1cfe7be4-097c-4ec4-86f6-13f53431f09d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.387701 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cfe7be4-097c-4ec4-86f6-13f53431f09d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1cfe7be4-097c-4ec4-86f6-13f53431f09d" (UID: "1cfe7be4-097c-4ec4-86f6-13f53431f09d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.394781 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cfe7be4-097c-4ec4-86f6-13f53431f09d-kube-api-access-pl47l" (OuterVolumeSpecName: "kube-api-access-pl47l") pod "1cfe7be4-097c-4ec4-86f6-13f53431f09d" (UID: "1cfe7be4-097c-4ec4-86f6-13f53431f09d"). InnerVolumeSpecName "kube-api-access-pl47l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.410946 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-scripts" (OuterVolumeSpecName: "scripts") pod "1cfe7be4-097c-4ec4-86f6-13f53431f09d" (UID: "1cfe7be4-097c-4ec4-86f6-13f53431f09d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.412613 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-config-data" (OuterVolumeSpecName: "config-data") pod "1cfe7be4-097c-4ec4-86f6-13f53431f09d" (UID: "1cfe7be4-097c-4ec4-86f6-13f53431f09d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.482087 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.482341 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cfe7be4-097c-4ec4-86f6-13f53431f09d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.482414 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pl47l\" (UniqueName: \"kubernetes.io/projected/1cfe7be4-097c-4ec4-86f6-13f53431f09d-kube-api-access-pl47l\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.482472 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cfe7be4-097c-4ec4-86f6-13f53431f09d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:25 crc kubenswrapper[4947]: I1203 09:09:25.482540 4947 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1cfe7be4-097c-4ec4-86f6-13f53431f09d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:26 crc kubenswrapper[4947]: I1203 09:09:26.187661 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b8986b5b9-qhbt7" Dec 03 09:09:26 crc kubenswrapper[4947]: I1203 09:09:26.257883 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b8986b5b9-qhbt7"] Dec 03 09:09:26 crc kubenswrapper[4947]: I1203 09:09:26.296088 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b8986b5b9-qhbt7"] Dec 03 09:09:27 crc kubenswrapper[4947]: I1203 09:09:27.097881 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" path="/var/lib/kubelet/pods/1cfe7be4-097c-4ec4-86f6-13f53431f09d/volumes" Dec 03 09:09:30 crc kubenswrapper[4947]: I1203 09:09:30.086432 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:09:30 crc kubenswrapper[4947]: I1203 09:09:30.086808 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:09:30 crc kubenswrapper[4947]: I1203 09:09:30.086857 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 09:09:30 crc kubenswrapper[4947]: I1203 09:09:30.087654 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Dec 03 09:09:30 crc kubenswrapper[4947]: I1203 09:09:30.087704 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" gracePeriod=600 Dec 03 09:09:30 crc kubenswrapper[4947]: E1203 09:09:30.212424 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:09:30 crc kubenswrapper[4947]: I1203 09:09:30.237172 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" exitCode=0 Dec 03 09:09:30 crc kubenswrapper[4947]: I1203 09:09:30.237226 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca"} Dec 03 09:09:30 crc kubenswrapper[4947]: I1203 09:09:30.237261 4947 scope.go:117] "RemoveContainer" containerID="90c58260a4ecaf4766384be2b94412a301c15894ed8a02e64bee5f8c77d3f403" Dec 03 09:09:30 crc kubenswrapper[4947]: I1203 09:09:30.238754 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:09:30 crc kubenswrapper[4947]: E1203 09:09:30.242124 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:09:45 crc kubenswrapper[4947]: I1203 09:09:45.083296 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:09:45 crc kubenswrapper[4947]: E1203 09:09:45.084123 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:09:59 crc kubenswrapper[4947]: I1203 09:09:59.092938 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:09:59 crc kubenswrapper[4947]: E1203 09:09:59.093983 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.380638 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-55945f8875-6c2qk"] Dec 03 09:10:01 crc kubenswrapper[4947]: E1203 09:10:01.382794 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerName="horizon" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.382828 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerName="horizon" Dec 03 09:10:01 crc kubenswrapper[4947]: E1203 09:10:01.382857 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon-log" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.382868 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon-log" Dec 03 09:10:01 crc kubenswrapper[4947]: E1203 09:10:01.382884 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.382893 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon" Dec 03 09:10:01 crc kubenswrapper[4947]: E1203 09:10:01.382999 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerName="horizon-log" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.383010 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerName="horizon-log" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.383522 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerName="horizon-log" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.383556 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42ec7f7-5870-4d92-8d5f-dc5350cb281c" containerName="horizon" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.383630 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" 
containerName="horizon-log" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.383657 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cfe7be4-097c-4ec4-86f6-13f53431f09d" containerName="horizon" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.386017 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.405745 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55945f8875-6c2qk"] Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.447597 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ed26802-4951-4e81-bf89-7fec1e488b7b-logs\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.447654 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ed26802-4951-4e81-bf89-7fec1e488b7b-horizon-secret-key\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.447723 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx5qf\" (UniqueName: \"kubernetes.io/projected/3ed26802-4951-4e81-bf89-7fec1e488b7b-kube-api-access-zx5qf\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.447792 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/3ed26802-4951-4e81-bf89-7fec1e488b7b-config-data\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.447818 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed26802-4951-4e81-bf89-7fec1e488b7b-scripts\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.549080 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx5qf\" (UniqueName: \"kubernetes.io/projected/3ed26802-4951-4e81-bf89-7fec1e488b7b-kube-api-access-zx5qf\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.549483 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ed26802-4951-4e81-bf89-7fec1e488b7b-config-data\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.549546 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed26802-4951-4e81-bf89-7fec1e488b7b-scripts\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.549623 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ed26802-4951-4e81-bf89-7fec1e488b7b-logs\") pod \"horizon-55945f8875-6c2qk\" 
(UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.549672 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ed26802-4951-4e81-bf89-7fec1e488b7b-horizon-secret-key\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.550302 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3ed26802-4951-4e81-bf89-7fec1e488b7b-scripts\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.550641 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ed26802-4951-4e81-bf89-7fec1e488b7b-logs\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.550822 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3ed26802-4951-4e81-bf89-7fec1e488b7b-config-data\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.555404 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3ed26802-4951-4e81-bf89-7fec1e488b7b-horizon-secret-key\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.565193 
4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx5qf\" (UniqueName: \"kubernetes.io/projected/3ed26802-4951-4e81-bf89-7fec1e488b7b-kube-api-access-zx5qf\") pod \"horizon-55945f8875-6c2qk\" (UID: \"3ed26802-4951-4e81-bf89-7fec1e488b7b\") " pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:01 crc kubenswrapper[4947]: I1203 09:10:01.723245 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.286899 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-55945f8875-6c2qk"] Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.568712 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-dptfn"] Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.570627 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-dptfn" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.583523 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-dptfn"] Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.635775 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55945f8875-6c2qk" event={"ID":"3ed26802-4951-4e81-bf89-7fec1e488b7b","Type":"ContainerStarted","Data":"dbd7e18c9aad4eb13db2e1e7dd647ecdee77c40ad0ed1b54ec4ec4cf64aade60"} Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.635838 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55945f8875-6c2qk" event={"ID":"3ed26802-4951-4e81-bf89-7fec1e488b7b","Type":"ContainerStarted","Data":"1ef8237019e0a5cb6917dad154e59bafb11cc8b148b52ec495610950110b39db"} Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.670523 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfd6b\" (UniqueName: 
\"kubernetes.io/projected/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-kube-api-access-tfd6b\") pod \"heat-db-create-dptfn\" (UID: \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\") " pod="openstack/heat-db-create-dptfn" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.670657 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-operator-scripts\") pod \"heat-db-create-dptfn\" (UID: \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\") " pod="openstack/heat-db-create-dptfn" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.680267 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-ac5a-account-create-update-8qldg"] Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.681678 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.686625 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.689250 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-ac5a-account-create-update-8qldg"] Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.772999 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfd6b\" (UniqueName: \"kubernetes.io/projected/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-kube-api-access-tfd6b\") pod \"heat-db-create-dptfn\" (UID: \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\") " pod="openstack/heat-db-create-dptfn" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.773043 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-operator-scripts\") pod 
\"heat-ac5a-account-create-update-8qldg\" (UID: \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\") " pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.773080 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-operator-scripts\") pod \"heat-db-create-dptfn\" (UID: \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\") " pod="openstack/heat-db-create-dptfn" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.773103 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbvdn\" (UniqueName: \"kubernetes.io/projected/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-kube-api-access-zbvdn\") pod \"heat-ac5a-account-create-update-8qldg\" (UID: \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\") " pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.774229 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-operator-scripts\") pod \"heat-db-create-dptfn\" (UID: \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\") " pod="openstack/heat-db-create-dptfn" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.813704 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfd6b\" (UniqueName: \"kubernetes.io/projected/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-kube-api-access-tfd6b\") pod \"heat-db-create-dptfn\" (UID: \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\") " pod="openstack/heat-db-create-dptfn" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.875559 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbvdn\" (UniqueName: \"kubernetes.io/projected/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-kube-api-access-zbvdn\") pod 
\"heat-ac5a-account-create-update-8qldg\" (UID: \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\") " pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.876078 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-operator-scripts\") pod \"heat-ac5a-account-create-update-8qldg\" (UID: \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\") " pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.877357 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-operator-scripts\") pod \"heat-ac5a-account-create-update-8qldg\" (UID: \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\") " pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.898107 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbvdn\" (UniqueName: \"kubernetes.io/projected/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-kube-api-access-zbvdn\") pod \"heat-ac5a-account-create-update-8qldg\" (UID: \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\") " pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:02 crc kubenswrapper[4947]: I1203 09:10:02.992100 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-dptfn" Dec 03 09:10:03 crc kubenswrapper[4947]: I1203 09:10:03.011839 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:03 crc kubenswrapper[4947]: I1203 09:10:03.470513 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-dptfn"] Dec 03 09:10:03 crc kubenswrapper[4947]: I1203 09:10:03.621321 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-ac5a-account-create-update-8qldg"] Dec 03 09:10:03 crc kubenswrapper[4947]: I1203 09:10:03.653274 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-55945f8875-6c2qk" event={"ID":"3ed26802-4951-4e81-bf89-7fec1e488b7b","Type":"ContainerStarted","Data":"5c83daf7ec7e66880b2cbd11820746592fc02f2435315ad506fd11f33c8f4fbe"} Dec 03 09:10:03 crc kubenswrapper[4947]: I1203 09:10:03.655670 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-dptfn" event={"ID":"326dcb2c-2dda-4388-86f0-0fe7d911bd0a","Type":"ContainerStarted","Data":"a44cd3515fb28eabbc99436ae161487f9baa80a7a316fd7eaf83ee0383a0ca69"} Dec 03 09:10:03 crc kubenswrapper[4947]: I1203 09:10:03.682781 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-55945f8875-6c2qk" podStartSLOduration=2.682758345 podStartE2EDuration="2.682758345s" podCreationTimestamp="2025-12-03 09:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:10:03.674368398 +0000 UTC m=+8464.935322824" watchObservedRunningTime="2025-12-03 09:10:03.682758345 +0000 UTC m=+8464.943712771" Dec 03 09:10:04 crc kubenswrapper[4947]: I1203 09:10:04.666361 4947 generic.go:334] "Generic (PLEG): container finished" podID="326dcb2c-2dda-4388-86f0-0fe7d911bd0a" containerID="31e99cb539d0d3666a31248644abf19f284cbb6a4bd64bb009245d18bf548498" exitCode=0 Dec 03 09:10:04 crc kubenswrapper[4947]: I1203 09:10:04.666647 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-db-create-dptfn" event={"ID":"326dcb2c-2dda-4388-86f0-0fe7d911bd0a","Type":"ContainerDied","Data":"31e99cb539d0d3666a31248644abf19f284cbb6a4bd64bb009245d18bf548498"} Dec 03 09:10:04 crc kubenswrapper[4947]: I1203 09:10:04.671394 4947 generic.go:334] "Generic (PLEG): container finished" podID="3e7c96fb-1caf-42ed-adb6-5b0a72276b2a" containerID="ad5de7e3db1cd83608a8e1fe6280478a26f10bc2c17a0f4d8c2cda071f917695" exitCode=0 Dec 03 09:10:04 crc kubenswrapper[4947]: I1203 09:10:04.671732 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ac5a-account-create-update-8qldg" event={"ID":"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a","Type":"ContainerDied","Data":"ad5de7e3db1cd83608a8e1fe6280478a26f10bc2c17a0f4d8c2cda071f917695"} Dec 03 09:10:04 crc kubenswrapper[4947]: I1203 09:10:04.671767 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ac5a-account-create-update-8qldg" event={"ID":"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a","Type":"ContainerStarted","Data":"0f71e42ed1af85db64fd72b075c187b4e2818b9036e5300ce44854fb116cdacd"} Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.160761 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.165223 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-dptfn" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.255127 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-operator-scripts\") pod \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\" (UID: \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\") " Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.255400 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfd6b\" (UniqueName: \"kubernetes.io/projected/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-kube-api-access-tfd6b\") pod \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\" (UID: \"326dcb2c-2dda-4388-86f0-0fe7d911bd0a\") " Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.255599 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-operator-scripts\") pod \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\" (UID: \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\") " Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.255652 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbvdn\" (UniqueName: \"kubernetes.io/projected/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-kube-api-access-zbvdn\") pod \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\" (UID: \"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a\") " Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.255908 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "326dcb2c-2dda-4388-86f0-0fe7d911bd0a" (UID: "326dcb2c-2dda-4388-86f0-0fe7d911bd0a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.256414 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3e7c96fb-1caf-42ed-adb6-5b0a72276b2a" (UID: "3e7c96fb-1caf-42ed-adb6-5b0a72276b2a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.256543 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.256562 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.262666 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-kube-api-access-zbvdn" (OuterVolumeSpecName: "kube-api-access-zbvdn") pod "3e7c96fb-1caf-42ed-adb6-5b0a72276b2a" (UID: "3e7c96fb-1caf-42ed-adb6-5b0a72276b2a"). InnerVolumeSpecName "kube-api-access-zbvdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.285731 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-kube-api-access-tfd6b" (OuterVolumeSpecName: "kube-api-access-tfd6b") pod "326dcb2c-2dda-4388-86f0-0fe7d911bd0a" (UID: "326dcb2c-2dda-4388-86f0-0fe7d911bd0a"). InnerVolumeSpecName "kube-api-access-tfd6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.358647 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbvdn\" (UniqueName: \"kubernetes.io/projected/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a-kube-api-access-zbvdn\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.358686 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfd6b\" (UniqueName: \"kubernetes.io/projected/326dcb2c-2dda-4388-86f0-0fe7d911bd0a-kube-api-access-tfd6b\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.692762 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-ac5a-account-create-update-8qldg" event={"ID":"3e7c96fb-1caf-42ed-adb6-5b0a72276b2a","Type":"ContainerDied","Data":"0f71e42ed1af85db64fd72b075c187b4e2818b9036e5300ce44854fb116cdacd"} Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.693134 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f71e42ed1af85db64fd72b075c187b4e2818b9036e5300ce44854fb116cdacd" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.693157 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-ac5a-account-create-update-8qldg" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.704851 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-dptfn" event={"ID":"326dcb2c-2dda-4388-86f0-0fe7d911bd0a","Type":"ContainerDied","Data":"a44cd3515fb28eabbc99436ae161487f9baa80a7a316fd7eaf83ee0383a0ca69"} Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.704900 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a44cd3515fb28eabbc99436ae161487f9baa80a7a316fd7eaf83ee0383a0ca69" Dec 03 09:10:06 crc kubenswrapper[4947]: I1203 09:10:06.704914 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-dptfn" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.787636 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-blw2g"] Dec 03 09:10:07 crc kubenswrapper[4947]: E1203 09:10:07.788312 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7c96fb-1caf-42ed-adb6-5b0a72276b2a" containerName="mariadb-account-create-update" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.788325 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7c96fb-1caf-42ed-adb6-5b0a72276b2a" containerName="mariadb-account-create-update" Dec 03 09:10:07 crc kubenswrapper[4947]: E1203 09:10:07.788342 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326dcb2c-2dda-4388-86f0-0fe7d911bd0a" containerName="mariadb-database-create" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.788349 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="326dcb2c-2dda-4388-86f0-0fe7d911bd0a" containerName="mariadb-database-create" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.788528 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="326dcb2c-2dda-4388-86f0-0fe7d911bd0a" containerName="mariadb-database-create" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.788556 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7c96fb-1caf-42ed-adb6-5b0a72276b2a" containerName="mariadb-account-create-update" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.789176 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.791561 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8gwsv" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.793203 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.798372 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-blw2g"] Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.892565 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-config-data\") pod \"heat-db-sync-blw2g\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.892616 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-combined-ca-bundle\") pod \"heat-db-sync-blw2g\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.892715 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9482\" (UniqueName: \"kubernetes.io/projected/941fd7b8-ace3-42f6-ac09-8784a0473417-kube-api-access-q9482\") pod \"heat-db-sync-blw2g\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.994803 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-config-data\") pod \"heat-db-sync-blw2g\" (UID: 
\"941fd7b8-ace3-42f6-ac09-8784a0473417\") " pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.994879 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-combined-ca-bundle\") pod \"heat-db-sync-blw2g\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:07 crc kubenswrapper[4947]: I1203 09:10:07.995034 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9482\" (UniqueName: \"kubernetes.io/projected/941fd7b8-ace3-42f6-ac09-8784a0473417-kube-api-access-q9482\") pod \"heat-db-sync-blw2g\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:08 crc kubenswrapper[4947]: I1203 09:10:08.001126 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-config-data\") pod \"heat-db-sync-blw2g\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:08 crc kubenswrapper[4947]: I1203 09:10:08.008970 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-combined-ca-bundle\") pod \"heat-db-sync-blw2g\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:08 crc kubenswrapper[4947]: I1203 09:10:08.016552 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9482\" (UniqueName: \"kubernetes.io/projected/941fd7b8-ace3-42f6-ac09-8784a0473417-kube-api-access-q9482\") pod \"heat-db-sync-blw2g\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:08 crc kubenswrapper[4947]: I1203 09:10:08.170276 4947 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:08 crc kubenswrapper[4947]: I1203 09:10:08.746165 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-blw2g"] Dec 03 09:10:09 crc kubenswrapper[4947]: I1203 09:10:09.732867 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-blw2g" event={"ID":"941fd7b8-ace3-42f6-ac09-8784a0473417","Type":"ContainerStarted","Data":"a8b34d25791859e76d6d118410a56f265a5aa42a725ea6e9861fb538e345022d"} Dec 03 09:10:10 crc kubenswrapper[4947]: I1203 09:10:10.083307 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:10:10 crc kubenswrapper[4947]: E1203 09:10:10.083700 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:10:11 crc kubenswrapper[4947]: I1203 09:10:11.724193 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:11 crc kubenswrapper[4947]: I1203 09:10:11.724545 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:18 crc kubenswrapper[4947]: I1203 09:10:18.839525 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-blw2g" event={"ID":"941fd7b8-ace3-42f6-ac09-8784a0473417","Type":"ContainerStarted","Data":"0465f2f614221438c45e658faad0b2b3a37e9a98d406c4ce9450939ebe48018b"} Dec 03 09:10:18 crc kubenswrapper[4947]: I1203 09:10:18.869039 4947 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/heat-db-sync-blw2g" podStartSLOduration=2.894879823 podStartE2EDuration="11.869022049s" podCreationTimestamp="2025-12-03 09:10:07 +0000 UTC" firstStartedPulling="2025-12-03 09:10:08.796157844 +0000 UTC m=+8470.057112270" lastFinishedPulling="2025-12-03 09:10:17.77030008 +0000 UTC m=+8479.031254496" observedRunningTime="2025-12-03 09:10:18.861475835 +0000 UTC m=+8480.122430281" watchObservedRunningTime="2025-12-03 09:10:18.869022049 +0000 UTC m=+8480.129976465" Dec 03 09:10:20 crc kubenswrapper[4947]: I1203 09:10:20.863684 4947 generic.go:334] "Generic (PLEG): container finished" podID="941fd7b8-ace3-42f6-ac09-8784a0473417" containerID="0465f2f614221438c45e658faad0b2b3a37e9a98d406c4ce9450939ebe48018b" exitCode=0 Dec 03 09:10:20 crc kubenswrapper[4947]: I1203 09:10:20.863852 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-blw2g" event={"ID":"941fd7b8-ace3-42f6-ac09-8784a0473417","Type":"ContainerDied","Data":"0465f2f614221438c45e658faad0b2b3a37e9a98d406c4ce9450939ebe48018b"} Dec 03 09:10:21 crc kubenswrapper[4947]: I1203 09:10:21.729765 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-55945f8875-6c2qk" podUID="3ed26802-4951-4e81-bf89-7fec1e488b7b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.145:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.145:8080: connect: connection refused" Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.240889 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.322076 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9482\" (UniqueName: \"kubernetes.io/projected/941fd7b8-ace3-42f6-ac09-8784a0473417-kube-api-access-q9482\") pod \"941fd7b8-ace3-42f6-ac09-8784a0473417\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.322325 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-combined-ca-bundle\") pod \"941fd7b8-ace3-42f6-ac09-8784a0473417\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.322483 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-config-data\") pod \"941fd7b8-ace3-42f6-ac09-8784a0473417\" (UID: \"941fd7b8-ace3-42f6-ac09-8784a0473417\") " Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.334049 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/941fd7b8-ace3-42f6-ac09-8784a0473417-kube-api-access-q9482" (OuterVolumeSpecName: "kube-api-access-q9482") pod "941fd7b8-ace3-42f6-ac09-8784a0473417" (UID: "941fd7b8-ace3-42f6-ac09-8784a0473417"). InnerVolumeSpecName "kube-api-access-q9482". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.358273 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "941fd7b8-ace3-42f6-ac09-8784a0473417" (UID: "941fd7b8-ace3-42f6-ac09-8784a0473417"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.426032 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9482\" (UniqueName: \"kubernetes.io/projected/941fd7b8-ace3-42f6-ac09-8784a0473417-kube-api-access-q9482\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.426084 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.448246 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-config-data" (OuterVolumeSpecName: "config-data") pod "941fd7b8-ace3-42f6-ac09-8784a0473417" (UID: "941fd7b8-ace3-42f6-ac09-8784a0473417"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.528725 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941fd7b8-ace3-42f6-ac09-8784a0473417-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.883299 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-blw2g" event={"ID":"941fd7b8-ace3-42f6-ac09-8784a0473417","Type":"ContainerDied","Data":"a8b34d25791859e76d6d118410a56f265a5aa42a725ea6e9861fb538e345022d"} Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.883610 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8b34d25791859e76d6d118410a56f265a5aa42a725ea6e9861fb538e345022d" Dec 03 09:10:22 crc kubenswrapper[4947]: I1203 09:10:22.883350 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-blw2g" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.927257 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5dbdc8fbbd-cxvpc"] Dec 03 09:10:23 crc kubenswrapper[4947]: E1203 09:10:23.928222 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941fd7b8-ace3-42f6-ac09-8784a0473417" containerName="heat-db-sync" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.928242 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="941fd7b8-ace3-42f6-ac09-8784a0473417" containerName="heat-db-sync" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.928478 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="941fd7b8-ace3-42f6-ac09-8784a0473417" containerName="heat-db-sync" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.929368 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.931919 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-8gwsv" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.932513 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.933388 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.960549 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-combined-ca-bundle\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.960669 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-config-data-custom\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.960718 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97s56\" (UniqueName: \"kubernetes.io/projected/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-kube-api-access-97s56\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:23 crc kubenswrapper[4947]: I1203 09:10:23.961376 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-config-data\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.025010 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5dbdc8fbbd-cxvpc"] Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.063811 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-config-data\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.064137 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-combined-ca-bundle\") pod 
\"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.064266 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-config-data-custom\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.064373 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97s56\" (UniqueName: \"kubernetes.io/projected/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-kube-api-access-97s56\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.073486 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-combined-ca-bundle\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.073967 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-config-data\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.096224 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-config-data-custom\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: 
\"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.143486 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97s56\" (UniqueName: \"kubernetes.io/projected/ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06-kube-api-access-97s56\") pod \"heat-engine-5dbdc8fbbd-cxvpc\" (UID: \"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06\") " pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.250313 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.344173 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5f576c8c7-hqbg9"] Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.347001 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.349312 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.369364 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f576c8c7-hqbg9"] Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.374119 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ff2437-317a-4e48-9cb7-d001f05ccbee-config-data-custom\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.374293 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d4ff2437-317a-4e48-9cb7-d001f05ccbee-combined-ca-bundle\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.374390 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ff2437-317a-4e48-9cb7-d001f05ccbee-config-data\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.374588 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dshxz\" (UniqueName: \"kubernetes.io/projected/d4ff2437-317a-4e48-9cb7-d001f05ccbee-kube-api-access-dshxz\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.420828 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-b598dd84c-gtjdb"] Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.422270 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.426145 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.454330 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-b598dd84c-gtjdb"] Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.476868 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ff2437-317a-4e48-9cb7-d001f05ccbee-config-data-custom\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.477859 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cef161a-7885-4753-9cbe-8ee4d59ebc94-config-data-custom\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.477977 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84f74\" (UniqueName: \"kubernetes.io/projected/6cef161a-7885-4753-9cbe-8ee4d59ebc94-kube-api-access-84f74\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.478049 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ff2437-317a-4e48-9cb7-d001f05ccbee-combined-ca-bundle\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " 
pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.478146 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cef161a-7885-4753-9cbe-8ee4d59ebc94-combined-ca-bundle\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.478217 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ff2437-317a-4e48-9cb7-d001f05ccbee-config-data\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.478378 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dshxz\" (UniqueName: \"kubernetes.io/projected/d4ff2437-317a-4e48-9cb7-d001f05ccbee-kube-api-access-dshxz\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.478547 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cef161a-7885-4753-9cbe-8ee4d59ebc94-config-data\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.484301 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4ff2437-317a-4e48-9cb7-d001f05ccbee-combined-ca-bundle\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 
03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.485996 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4ff2437-317a-4e48-9cb7-d001f05ccbee-config-data-custom\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.488592 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4ff2437-317a-4e48-9cb7-d001f05ccbee-config-data\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.502438 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dshxz\" (UniqueName: \"kubernetes.io/projected/d4ff2437-317a-4e48-9cb7-d001f05ccbee-kube-api-access-dshxz\") pod \"heat-api-5f576c8c7-hqbg9\" (UID: \"d4ff2437-317a-4e48-9cb7-d001f05ccbee\") " pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.579684 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cef161a-7885-4753-9cbe-8ee4d59ebc94-config-data\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.580471 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cef161a-7885-4753-9cbe-8ee4d59ebc94-config-data-custom\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.580507 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-84f74\" (UniqueName: \"kubernetes.io/projected/6cef161a-7885-4753-9cbe-8ee4d59ebc94-kube-api-access-84f74\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.580550 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cef161a-7885-4753-9cbe-8ee4d59ebc94-combined-ca-bundle\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.584736 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cef161a-7885-4753-9cbe-8ee4d59ebc94-combined-ca-bundle\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.584986 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cef161a-7885-4753-9cbe-8ee4d59ebc94-config-data\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.588160 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cef161a-7885-4753-9cbe-8ee4d59ebc94-config-data-custom\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.597383 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84f74\" 
(UniqueName: \"kubernetes.io/projected/6cef161a-7885-4753-9cbe-8ee4d59ebc94-kube-api-access-84f74\") pod \"heat-cfnapi-b598dd84c-gtjdb\" (UID: \"6cef161a-7885-4753-9cbe-8ee4d59ebc94\") " pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.782234 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:24 crc kubenswrapper[4947]: I1203 09:10:24.872721 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:25 crc kubenswrapper[4947]: I1203 09:10:25.084292 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:10:25 crc kubenswrapper[4947]: E1203 09:10:25.084544 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:10:25 crc kubenswrapper[4947]: I1203 09:10:25.126274 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5dbdc8fbbd-cxvpc"] Dec 03 09:10:25 crc kubenswrapper[4947]: W1203 09:10:25.352047 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ff2437_317a_4e48_9cb7_d001f05ccbee.slice/crio-68dcf3f7a1874876b7f95cef9875df8c9024480e6ecaeef404034d4c317f2960 WatchSource:0}: Error finding container 68dcf3f7a1874876b7f95cef9875df8c9024480e6ecaeef404034d4c317f2960: Status 404 returned error can't find the container with id 68dcf3f7a1874876b7f95cef9875df8c9024480e6ecaeef404034d4c317f2960 Dec 03 09:10:25 crc 
kubenswrapper[4947]: I1203 09:10:25.353838 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5f576c8c7-hqbg9"] Dec 03 09:10:25 crc kubenswrapper[4947]: W1203 09:10:25.427591 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cef161a_7885_4753_9cbe_8ee4d59ebc94.slice/crio-5a61a37329b29e6fc93e84e6c83f6a82c32eba8e00c55f778fc739d173562d9f WatchSource:0}: Error finding container 5a61a37329b29e6fc93e84e6c83f6a82c32eba8e00c55f778fc739d173562d9f: Status 404 returned error can't find the container with id 5a61a37329b29e6fc93e84e6c83f6a82c32eba8e00c55f778fc739d173562d9f Dec 03 09:10:25 crc kubenswrapper[4947]: I1203 09:10:25.439224 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-b598dd84c-gtjdb"] Dec 03 09:10:25 crc kubenswrapper[4947]: I1203 09:10:25.930124 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" event={"ID":"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06","Type":"ContainerStarted","Data":"c9f1d2b7a22ef76c88e860dccb8d93ce8ab455d73405eee6c8f6074307f14e63"} Dec 03 09:10:25 crc kubenswrapper[4947]: I1203 09:10:25.931232 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b598dd84c-gtjdb" event={"ID":"6cef161a-7885-4753-9cbe-8ee4d59ebc94","Type":"ContainerStarted","Data":"5a61a37329b29e6fc93e84e6c83f6a82c32eba8e00c55f778fc739d173562d9f"} Dec 03 09:10:25 crc kubenswrapper[4947]: I1203 09:10:25.932426 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f576c8c7-hqbg9" event={"ID":"d4ff2437-317a-4e48-9cb7-d001f05ccbee","Type":"ContainerStarted","Data":"68dcf3f7a1874876b7f95cef9875df8c9024480e6ecaeef404034d4c317f2960"} Dec 03 09:10:26 crc kubenswrapper[4947]: I1203 09:10:26.947108 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" 
event={"ID":"ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06","Type":"ContainerStarted","Data":"4d33f927ebda7edad95b03280cdd1f23f16cb94641c723210f4fd5499d34eb81"} Dec 03 09:10:26 crc kubenswrapper[4947]: I1203 09:10:26.947480 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:26 crc kubenswrapper[4947]: I1203 09:10:26.982650 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" podStartSLOduration=3.9826287799999998 podStartE2EDuration="3.98262878s" podCreationTimestamp="2025-12-03 09:10:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:10:26.961648172 +0000 UTC m=+8488.222602598" watchObservedRunningTime="2025-12-03 09:10:26.98262878 +0000 UTC m=+8488.243583206" Dec 03 09:10:27 crc kubenswrapper[4947]: I1203 09:10:27.958921 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5f576c8c7-hqbg9" event={"ID":"d4ff2437-317a-4e48-9cb7-d001f05ccbee","Type":"ContainerStarted","Data":"84957d5651419f8d001d73071961149cbf06712d99026eea681a4f386af5bf53"} Dec 03 09:10:27 crc kubenswrapper[4947]: I1203 09:10:27.959259 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:27 crc kubenswrapper[4947]: I1203 09:10:27.961337 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b598dd84c-gtjdb" event={"ID":"6cef161a-7885-4753-9cbe-8ee4d59ebc94","Type":"ContainerStarted","Data":"a55c1355edd730420a42c95acd58a566a8cbbb5859b8793ec16c7716416ef5ad"} Dec 03 09:10:27 crc kubenswrapper[4947]: I1203 09:10:27.990211 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5f576c8c7-hqbg9" podStartSLOduration=1.766997277 podStartE2EDuration="3.990191562s" podCreationTimestamp="2025-12-03 09:10:24 +0000 UTC" 
firstStartedPulling="2025-12-03 09:10:25.354964283 +0000 UTC m=+8486.615918709" lastFinishedPulling="2025-12-03 09:10:27.578158568 +0000 UTC m=+8488.839112994" observedRunningTime="2025-12-03 09:10:27.974728604 +0000 UTC m=+8489.235683030" watchObservedRunningTime="2025-12-03 09:10:27.990191562 +0000 UTC m=+8489.251145988" Dec 03 09:10:27 crc kubenswrapper[4947]: I1203 09:10:27.994775 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-b598dd84c-gtjdb" podStartSLOduration=1.8475630779999999 podStartE2EDuration="3.994754176s" podCreationTimestamp="2025-12-03 09:10:24 +0000 UTC" firstStartedPulling="2025-12-03 09:10:25.429909341 +0000 UTC m=+8486.690863767" lastFinishedPulling="2025-12-03 09:10:27.577100439 +0000 UTC m=+8488.838054865" observedRunningTime="2025-12-03 09:10:27.994036236 +0000 UTC m=+8489.254990672" watchObservedRunningTime="2025-12-03 09:10:27.994754176 +0000 UTC m=+8489.255708602" Dec 03 09:10:28 crc kubenswrapper[4947]: I1203 09:10:28.996215 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:33 crc kubenswrapper[4947]: I1203 09:10:33.681923 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:35 crc kubenswrapper[4947]: I1203 09:10:35.466710 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-55945f8875-6c2qk" Dec 03 09:10:35 crc kubenswrapper[4947]: I1203 09:10:35.525403 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c5859c9f9-lxs8f"] Dec 03 09:10:35 crc kubenswrapper[4947]: I1203 09:10:35.535947 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c5859c9f9-lxs8f" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon-log" containerID="cri-o://88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc" 
gracePeriod=30 Dec 03 09:10:35 crc kubenswrapper[4947]: I1203 09:10:35.536422 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c5859c9f9-lxs8f" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon" containerID="cri-o://351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3" gracePeriod=30 Dec 03 09:10:36 crc kubenswrapper[4947]: I1203 09:10:36.271719 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5f576c8c7-hqbg9" Dec 03 09:10:36 crc kubenswrapper[4947]: I1203 09:10:36.284019 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-b598dd84c-gtjdb" Dec 03 09:10:37 crc kubenswrapper[4947]: I1203 09:10:37.059009 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-26vzc"] Dec 03 09:10:37 crc kubenswrapper[4947]: I1203 09:10:37.243307 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-26vzc"] Dec 03 09:10:37 crc kubenswrapper[4947]: I1203 09:10:37.243370 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b2c3-account-create-update-x6vpw"] Dec 03 09:10:37 crc kubenswrapper[4947]: I1203 09:10:37.243391 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b2c3-account-create-update-x6vpw"] Dec 03 09:10:38 crc kubenswrapper[4947]: I1203 09:10:38.082760 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:10:38 crc kubenswrapper[4947]: E1203 09:10:38.083016 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.091906 4947 generic.go:334] "Generic (PLEG): container finished" podID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerID="351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3" exitCode=0 Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.098918 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bca192c-e85f-4503-a747-2f0c118272f1" path="/var/lib/kubelet/pods/8bca192c-e85f-4503-a747-2f0c118272f1/volumes" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.100697 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7df294d-c9d9-48cf-8be3-5cecafe3001e" path="/var/lib/kubelet/pods/c7df294d-c9d9-48cf-8be3-5cecafe3001e/volumes" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.101885 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c5859c9f9-lxs8f" event={"ID":"824c9c5f-7c1e-4f88-9a92-70b10b1945ab","Type":"ContainerDied","Data":"351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3"} Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.494606 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jrq8t"] Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.502958 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.514506 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jrq8t"] Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.589989 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-utilities\") pod \"redhat-marketplace-jrq8t\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.590075 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/bf6dc360-9984-42a4-b8cd-302d7f3fa159-kube-api-access-z9f94\") pod \"redhat-marketplace-jrq8t\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.590183 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-catalog-content\") pod \"redhat-marketplace-jrq8t\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.692036 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-utilities\") pod \"redhat-marketplace-jrq8t\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.692126 4947 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/bf6dc360-9984-42a4-b8cd-302d7f3fa159-kube-api-access-z9f94\") pod \"redhat-marketplace-jrq8t\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.692220 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-catalog-content\") pod \"redhat-marketplace-jrq8t\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.692942 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-utilities\") pod \"redhat-marketplace-jrq8t\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.692988 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-catalog-content\") pod \"redhat-marketplace-jrq8t\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.727425 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/bf6dc360-9984-42a4-b8cd-302d7f3fa159-kube-api-access-z9f94\") pod \"redhat-marketplace-jrq8t\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.826006 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:39 crc kubenswrapper[4947]: I1203 09:10:39.935723 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5c5859c9f9-lxs8f" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.142:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.142:8080: connect: connection refused" Dec 03 09:10:40 crc kubenswrapper[4947]: W1203 09:10:40.318481 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf6dc360_9984_42a4_b8cd_302d7f3fa159.slice/crio-2e98ba658b8734d6db2f1f498ee5ab186ecdf9fc55159df625393d218584d168 WatchSource:0}: Error finding container 2e98ba658b8734d6db2f1f498ee5ab186ecdf9fc55159df625393d218584d168: Status 404 returned error can't find the container with id 2e98ba658b8734d6db2f1f498ee5ab186ecdf9fc55159df625393d218584d168 Dec 03 09:10:40 crc kubenswrapper[4947]: I1203 09:10:40.325691 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jrq8t"] Dec 03 09:10:41 crc kubenswrapper[4947]: I1203 09:10:41.113251 4947 generic.go:334] "Generic (PLEG): container finished" podID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerID="d584ca348560fe0a998a55eeac8b2369be4083f05a7604408c9302f24cdb55a7" exitCode=0 Dec 03 09:10:41 crc kubenswrapper[4947]: I1203 09:10:41.113776 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jrq8t" event={"ID":"bf6dc360-9984-42a4-b8cd-302d7f3fa159","Type":"ContainerDied","Data":"d584ca348560fe0a998a55eeac8b2369be4083f05a7604408c9302f24cdb55a7"} Dec 03 09:10:41 crc kubenswrapper[4947]: I1203 09:10:41.113864 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jrq8t" 
event={"ID":"bf6dc360-9984-42a4-b8cd-302d7f3fa159","Type":"ContainerStarted","Data":"2e98ba658b8734d6db2f1f498ee5ab186ecdf9fc55159df625393d218584d168"} Dec 03 09:10:43 crc kubenswrapper[4947]: I1203 09:10:43.136745 4947 generic.go:334] "Generic (PLEG): container finished" podID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerID="2ec848735a6de3c7dcc4f778e3e57d40c7ebec99e21c08721868c0c9bfe0b00e" exitCode=0 Dec 03 09:10:43 crc kubenswrapper[4947]: I1203 09:10:43.137112 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jrq8t" event={"ID":"bf6dc360-9984-42a4-b8cd-302d7f3fa159","Type":"ContainerDied","Data":"2ec848735a6de3c7dcc4f778e3e57d40c7ebec99e21c08721868c0c9bfe0b00e"} Dec 03 09:10:44 crc kubenswrapper[4947]: I1203 09:10:44.150960 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jrq8t" event={"ID":"bf6dc360-9984-42a4-b8cd-302d7f3fa159","Type":"ContainerStarted","Data":"8a6a1e6c39050cbe44230e19bc65d642caa329dfb36e90560082928ecb3450f2"} Dec 03 09:10:44 crc kubenswrapper[4947]: I1203 09:10:44.171340 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jrq8t" podStartSLOduration=2.415965638 podStartE2EDuration="5.171314696s" podCreationTimestamp="2025-12-03 09:10:39 +0000 UTC" firstStartedPulling="2025-12-03 09:10:41.115772769 +0000 UTC m=+8502.376727195" lastFinishedPulling="2025-12-03 09:10:43.871121827 +0000 UTC m=+8505.132076253" observedRunningTime="2025-12-03 09:10:44.167516813 +0000 UTC m=+8505.428471259" watchObservedRunningTime="2025-12-03 09:10:44.171314696 +0000 UTC m=+8505.432269122" Dec 03 09:10:44 crc kubenswrapper[4947]: I1203 09:10:44.287221 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5dbdc8fbbd-cxvpc" Dec 03 09:10:48 crc kubenswrapper[4947]: I1203 09:10:48.043727 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-db-sync-cpsv7"] Dec 03 09:10:48 crc kubenswrapper[4947]: I1203 09:10:48.056621 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-cpsv7"] Dec 03 09:10:49 crc kubenswrapper[4947]: I1203 09:10:49.093887 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24481172-ba60-4445-a58f-aa1f21e7368c" path="/var/lib/kubelet/pods/24481172-ba60-4445-a58f-aa1f21e7368c/volumes" Dec 03 09:10:49 crc kubenswrapper[4947]: I1203 09:10:49.826745 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:49 crc kubenswrapper[4947]: I1203 09:10:49.827038 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:49 crc kubenswrapper[4947]: I1203 09:10:49.882864 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:49 crc kubenswrapper[4947]: I1203 09:10:49.928879 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5c5859c9f9-lxs8f" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.142:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.142:8080: connect: connection refused" Dec 03 09:10:50 crc kubenswrapper[4947]: I1203 09:10:50.314510 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:50 crc kubenswrapper[4947]: I1203 09:10:50.372507 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jrq8t"] Dec 03 09:10:52 crc kubenswrapper[4947]: I1203 09:10:52.083721 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:10:52 crc kubenswrapper[4947]: E1203 
09:10:52.084334 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:10:52 crc kubenswrapper[4947]: I1203 09:10:52.259984 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jrq8t" podUID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerName="registry-server" containerID="cri-o://8a6a1e6c39050cbe44230e19bc65d642caa329dfb36e90560082928ecb3450f2" gracePeriod=2 Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.274059 4947 generic.go:334] "Generic (PLEG): container finished" podID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerID="8a6a1e6c39050cbe44230e19bc65d642caa329dfb36e90560082928ecb3450f2" exitCode=0 Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.274334 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jrq8t" event={"ID":"bf6dc360-9984-42a4-b8cd-302d7f3fa159","Type":"ContainerDied","Data":"8a6a1e6c39050cbe44230e19bc65d642caa329dfb36e90560082928ecb3450f2"} Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.274857 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jrq8t" event={"ID":"bf6dc360-9984-42a4-b8cd-302d7f3fa159","Type":"ContainerDied","Data":"2e98ba658b8734d6db2f1f498ee5ab186ecdf9fc55159df625393d218584d168"} Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.274889 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e98ba658b8734d6db2f1f498ee5ab186ecdf9fc55159df625393d218584d168" Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.342173 4947 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.470084 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/bf6dc360-9984-42a4-b8cd-302d7f3fa159-kube-api-access-z9f94\") pod \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.470149 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-utilities\") pod \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.470275 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-catalog-content\") pod \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\" (UID: \"bf6dc360-9984-42a4-b8cd-302d7f3fa159\") " Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.471202 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-utilities" (OuterVolumeSpecName: "utilities") pod "bf6dc360-9984-42a4-b8cd-302d7f3fa159" (UID: "bf6dc360-9984-42a4-b8cd-302d7f3fa159"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.480213 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6dc360-9984-42a4-b8cd-302d7f3fa159-kube-api-access-z9f94" (OuterVolumeSpecName: "kube-api-access-z9f94") pod "bf6dc360-9984-42a4-b8cd-302d7f3fa159" (UID: "bf6dc360-9984-42a4-b8cd-302d7f3fa159"). 
InnerVolumeSpecName "kube-api-access-z9f94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.498276 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf6dc360-9984-42a4-b8cd-302d7f3fa159" (UID: "bf6dc360-9984-42a4-b8cd-302d7f3fa159"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.572237 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9f94\" (UniqueName: \"kubernetes.io/projected/bf6dc360-9984-42a4-b8cd-302d7f3fa159-kube-api-access-z9f94\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.572274 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:53 crc kubenswrapper[4947]: I1203 09:10:53.572289 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf6dc360-9984-42a4-b8cd-302d7f3fa159-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:10:54 crc kubenswrapper[4947]: I1203 09:10:54.284113 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jrq8t" Dec 03 09:10:54 crc kubenswrapper[4947]: I1203 09:10:54.318182 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jrq8t"] Dec 03 09:10:54 crc kubenswrapper[4947]: I1203 09:10:54.327555 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jrq8t"] Dec 03 09:10:55 crc kubenswrapper[4947]: I1203 09:10:55.105262 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" path="/var/lib/kubelet/pods/bf6dc360-9984-42a4-b8cd-302d7f3fa159/volumes" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.377438 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh"] Dec 03 09:10:56 crc kubenswrapper[4947]: E1203 09:10:56.377913 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerName="extract-content" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.377926 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerName="extract-content" Dec 03 09:10:56 crc kubenswrapper[4947]: E1203 09:10:56.377938 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerName="registry-server" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.377944 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerName="registry-server" Dec 03 09:10:56 crc kubenswrapper[4947]: E1203 09:10:56.377967 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerName="extract-utilities" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.377974 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerName="extract-utilities" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.378176 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf6dc360-9984-42a4-b8cd-302d7f3fa159" containerName="registry-server" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.379950 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.381963 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.387818 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh"] Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.430988 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pd7f\" (UniqueName: \"kubernetes.io/projected/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-kube-api-access-9pd7f\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.431033 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.431105 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.533604 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.533648 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pd7f\" (UniqueName: \"kubernetes.io/projected/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-kube-api-access-9pd7f\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.533708 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.534205 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-bundle\") 
pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.535603 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.552080 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pd7f\" (UniqueName: \"kubernetes.io/projected/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-kube-api-access-9pd7f\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:56 crc kubenswrapper[4947]: I1203 09:10:56.702726 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:10:57 crc kubenswrapper[4947]: I1203 09:10:57.154438 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh"] Dec 03 09:10:57 crc kubenswrapper[4947]: W1203 09:10:57.162303 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7d61d9_044a_4bf6_8993_7e95e6dc289b.slice/crio-aac8e6de18976474c261aaa7b0f06e5148ccdea4129494d9bd398a0a8ac5eeb9 WatchSource:0}: Error finding container aac8e6de18976474c261aaa7b0f06e5148ccdea4129494d9bd398a0a8ac5eeb9: Status 404 returned error can't find the container with id aac8e6de18976474c261aaa7b0f06e5148ccdea4129494d9bd398a0a8ac5eeb9 Dec 03 09:10:57 crc kubenswrapper[4947]: I1203 09:10:57.310137 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" event={"ID":"1f7d61d9-044a-4bf6-8993-7e95e6dc289b","Type":"ContainerStarted","Data":"aac8e6de18976474c261aaa7b0f06e5148ccdea4129494d9bd398a0a8ac5eeb9"} Dec 03 09:10:58 crc kubenswrapper[4947]: I1203 09:10:58.325865 4947 generic.go:334] "Generic (PLEG): container finished" podID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerID="79635070dc151d971ecbe60eae722c7429780f015d4cce1a0aafdc0544f41404" exitCode=0 Dec 03 09:10:58 crc kubenswrapper[4947]: I1203 09:10:58.325924 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" event={"ID":"1f7d61d9-044a-4bf6-8993-7e95e6dc289b","Type":"ContainerDied","Data":"79635070dc151d971ecbe60eae722c7429780f015d4cce1a0aafdc0544f41404"} Dec 03 09:10:58 crc kubenswrapper[4947]: I1203 09:10:58.328457 4947 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 03 09:10:59 crc kubenswrapper[4947]: I1203 09:10:59.928818 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5c5859c9f9-lxs8f" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.142:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.142:8080: connect: connection refused" Dec 03 09:10:59 crc kubenswrapper[4947]: I1203 09:10:59.929408 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:11:00 crc kubenswrapper[4947]: I1203 09:11:00.355970 4947 generic.go:334] "Generic (PLEG): container finished" podID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerID="06c4df1c8b2718f9a06465269994e6cc84d3a7aed3a62bfea5636b021c3a9b3e" exitCode=0 Dec 03 09:11:00 crc kubenswrapper[4947]: I1203 09:11:00.356211 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" event={"ID":"1f7d61d9-044a-4bf6-8993-7e95e6dc289b","Type":"ContainerDied","Data":"06c4df1c8b2718f9a06465269994e6cc84d3a7aed3a62bfea5636b021c3a9b3e"} Dec 03 09:11:01 crc kubenswrapper[4947]: I1203 09:11:01.371247 4947 generic.go:334] "Generic (PLEG): container finished" podID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerID="eda6c0a3d6d107a01683e66a4dc324cb60e0e39aad10229055213e87363bb594" exitCode=0 Dec 03 09:11:01 crc kubenswrapper[4947]: I1203 09:11:01.371386 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" event={"ID":"1f7d61d9-044a-4bf6-8993-7e95e6dc289b","Type":"ContainerDied","Data":"eda6c0a3d6d107a01683e66a4dc324cb60e0e39aad10229055213e87363bb594"} Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.761149 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.891072 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pd7f\" (UniqueName: \"kubernetes.io/projected/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-kube-api-access-9pd7f\") pod \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.891307 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-bundle\") pod \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.891337 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-util\") pod \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\" (UID: \"1f7d61d9-044a-4bf6-8993-7e95e6dc289b\") " Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.894295 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-bundle" (OuterVolumeSpecName: "bundle") pod "1f7d61d9-044a-4bf6-8993-7e95e6dc289b" (UID: "1f7d61d9-044a-4bf6-8993-7e95e6dc289b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.898808 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-kube-api-access-9pd7f" (OuterVolumeSpecName: "kube-api-access-9pd7f") pod "1f7d61d9-044a-4bf6-8993-7e95e6dc289b" (UID: "1f7d61d9-044a-4bf6-8993-7e95e6dc289b"). InnerVolumeSpecName "kube-api-access-9pd7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.902989 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-util" (OuterVolumeSpecName: "util") pod "1f7d61d9-044a-4bf6-8993-7e95e6dc289b" (UID: "1f7d61d9-044a-4bf6-8993-7e95e6dc289b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.993836 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pd7f\" (UniqueName: \"kubernetes.io/projected/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-kube-api-access-9pd7f\") on node \"crc\" DevicePath \"\"" Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.994527 4947 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:11:02 crc kubenswrapper[4947]: I1203 09:11:02.994642 4947 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7d61d9-044a-4bf6-8993-7e95e6dc289b-util\") on node \"crc\" DevicePath \"\"" Dec 03 09:11:03 crc kubenswrapper[4947]: I1203 09:11:03.395823 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" event={"ID":"1f7d61d9-044a-4bf6-8993-7e95e6dc289b","Type":"ContainerDied","Data":"aac8e6de18976474c261aaa7b0f06e5148ccdea4129494d9bd398a0a8ac5eeb9"} Dec 03 09:11:03 crc kubenswrapper[4947]: I1203 09:11:03.395872 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aac8e6de18976474c261aaa7b0f06e5148ccdea4129494d9bd398a0a8ac5eeb9" Dec 03 09:11:03 crc kubenswrapper[4947]: I1203 09:11:03.395875 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh" Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.010746 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c5859c9f9-lxs8f" Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.055037 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz9vm\" (UniqueName: \"kubernetes.io/projected/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-kube-api-access-gz9vm\") pod \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.055113 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-config-data\") pod \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.055368 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-scripts\") pod \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.055399 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-logs\") pod \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.055533 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-horizon-secret-key\") pod 
\"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") " Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.055924 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-logs" (OuterVolumeSpecName: "logs") pod "824c9c5f-7c1e-4f88-9a92-70b10b1945ab" (UID: "824c9c5f-7c1e-4f88-9a92-70b10b1945ab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.060674 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "824c9c5f-7c1e-4f88-9a92-70b10b1945ab" (UID: "824c9c5f-7c1e-4f88-9a92-70b10b1945ab"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.061130 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-kube-api-access-gz9vm" (OuterVolumeSpecName: "kube-api-access-gz9vm") pod "824c9c5f-7c1e-4f88-9a92-70b10b1945ab" (UID: "824c9c5f-7c1e-4f88-9a92-70b10b1945ab"). InnerVolumeSpecName "kube-api-access-gz9vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:11:06 crc kubenswrapper[4947]: E1203 09:11:06.078833 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-config-data podName:824c9c5f-7c1e-4f88-9a92-70b10b1945ab nodeName:}" failed. No retries permitted until 2025-12-03 09:11:06.578776209 +0000 UTC m=+8527.839730645 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-config-data") pod "824c9c5f-7c1e-4f88-9a92-70b10b1945ab" (UID: "824c9c5f-7c1e-4f88-9a92-70b10b1945ab") : error deleting /var/lib/kubelet/pods/824c9c5f-7c1e-4f88-9a92-70b10b1945ab/volume-subpaths: remove /var/lib/kubelet/pods/824c9c5f-7c1e-4f88-9a92-70b10b1945ab/volume-subpaths: no such file or directory
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.079120 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-scripts" (OuterVolumeSpecName: "scripts") pod "824c9c5f-7c1e-4f88-9a92-70b10b1945ab" (UID: "824c9c5f-7c1e-4f88-9a92-70b10b1945ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.158395 4947 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.158824 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz9vm\" (UniqueName: \"kubernetes.io/projected/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-kube-api-access-gz9vm\") on node \"crc\" DevicePath \"\""
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.158849 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.158860 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-logs\") on node \"crc\" DevicePath \"\""
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.427056 4947 generic.go:334] "Generic (PLEG): container finished" podID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerID="88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc" exitCode=137
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.427107 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c5859c9f9-lxs8f"
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.427123 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c5859c9f9-lxs8f" event={"ID":"824c9c5f-7c1e-4f88-9a92-70b10b1945ab","Type":"ContainerDied","Data":"88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc"}
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.427172 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c5859c9f9-lxs8f" event={"ID":"824c9c5f-7c1e-4f88-9a92-70b10b1945ab","Type":"ContainerDied","Data":"aff3db99e71cc1ac4d4482bead09121fbbb8c7b608436c89a02a8ea67bd235cb"}
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.427198 4947 scope.go:117] "RemoveContainer" containerID="351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3"
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.595543 4947 scope.go:117] "RemoveContainer" containerID="88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc"
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.621132 4947 scope.go:117] "RemoveContainer" containerID="351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3"
Dec 03 09:11:06 crc kubenswrapper[4947]: E1203 09:11:06.621720 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3\": container with ID starting with 351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3 not found: ID does not exist" containerID="351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3"
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.621782 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3"} err="failed to get container status \"351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3\": rpc error: code = NotFound desc = could not find container \"351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3\": container with ID starting with 351c4c85737a97ffb494d0a492414826cc13cb42dc17b683f1d4908c0de028f3 not found: ID does not exist"
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.621825 4947 scope.go:117] "RemoveContainer" containerID="88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc"
Dec 03 09:11:06 crc kubenswrapper[4947]: E1203 09:11:06.622180 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc\": container with ID starting with 88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc not found: ID does not exist" containerID="88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc"
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.622208 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc"} err="failed to get container status \"88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc\": rpc error: code = NotFound desc = could not find container \"88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc\": container with ID starting with 88527c2a5032b3cfcdc5a0bfe76c84963bcc49b298741cdce7a4242ee3712bbc not found: ID does not exist"
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.671261 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-config-data\") pod \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\" (UID: \"824c9c5f-7c1e-4f88-9a92-70b10b1945ab\") "
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.671919 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-config-data" (OuterVolumeSpecName: "config-data") pod "824c9c5f-7c1e-4f88-9a92-70b10b1945ab" (UID: "824c9c5f-7c1e-4f88-9a92-70b10b1945ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.762814 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c5859c9f9-lxs8f"]
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.771885 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c5859c9f9-lxs8f"]
Dec 03 09:11:06 crc kubenswrapper[4947]: I1203 09:11:06.775039 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/824c9c5f-7c1e-4f88-9a92-70b10b1945ab-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 09:11:07 crc kubenswrapper[4947]: I1203 09:11:07.084282 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca"
Dec 03 09:11:07 crc kubenswrapper[4947]: E1203 09:11:07.084564 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:11:07 crc kubenswrapper[4947]: I1203 09:11:07.097386 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" path="/var/lib/kubelet/pods/824c9c5f-7c1e-4f88-9a92-70b10b1945ab/volumes"
Dec 03 09:11:08 crc kubenswrapper[4947]: I1203 09:11:08.051752 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1207-account-create-update-lfxck"]
Dec 03 09:11:08 crc kubenswrapper[4947]: I1203 09:11:08.066345 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zjrft"]
Dec 03 09:11:08 crc kubenswrapper[4947]: I1203 09:11:08.082837 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zjrft"]
Dec 03 09:11:08 crc kubenswrapper[4947]: I1203 09:11:08.094990 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1207-account-create-update-lfxck"]
Dec 03 09:11:09 crc kubenswrapper[4947]: I1203 09:11:09.096071 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4db4f3da-a494-465f-a92a-7cf9bae6a13d" path="/var/lib/kubelet/pods/4db4f3da-a494-465f-a92a-7cf9bae6a13d/volumes"
Dec 03 09:11:09 crc kubenswrapper[4947]: I1203 09:11:09.097109 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dbf1504-f700-40e9-9700-2db114b8c987" path="/var/lib/kubelet/pods/7dbf1504-f700-40e9-9700-2db114b8c987/volumes"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.938682 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w"]
Dec 03 09:11:14 crc kubenswrapper[4947]: E1203 09:11:14.940620 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerName="pull"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.940728 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerName="pull"
Dec 03 09:11:14 crc kubenswrapper[4947]: E1203 09:11:14.940855 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerName="util"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.940938 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerName="util"
Dec 03 09:11:14 crc kubenswrapper[4947]: E1203 09:11:14.941049 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.941149 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon"
Dec 03 09:11:14 crc kubenswrapper[4947]: E1203 09:11:14.941225 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon-log"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.941283 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon-log"
Dec 03 09:11:14 crc kubenswrapper[4947]: E1203 09:11:14.941344 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerName="extract"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.941400 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerName="extract"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.941702 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7d61d9-044a-4bf6-8993-7e95e6dc289b" containerName="extract"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.941792 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.941872 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="824c9c5f-7c1e-4f88-9a92-70b10b1945ab" containerName="horizon-log"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.942814 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.949522 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rjfvv"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.950022 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Dec 03 09:11:14 crc kubenswrapper[4947]: I1203 09:11:14.950088 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.150587 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w"]
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.219718 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"]
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.221436 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.223738 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-j22zg"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.223934 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.236291 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a0c92f55-578e-4ab0-8a31-7a3df001a16c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk\" (UID: \"a0c92f55-578e-4ab0-8a31-7a3df001a16c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.236649 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb2qw\" (UniqueName: \"kubernetes.io/projected/dc90cebb-4774-4b00-bd82-852b5c5af24d-kube-api-access-jb2qw\") pod \"obo-prometheus-operator-668cf9dfbb-vg27w\" (UID: \"dc90cebb-4774-4b00-bd82-852b5c5af24d\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.236943 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a0c92f55-578e-4ab0-8a31-7a3df001a16c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk\" (UID: \"a0c92f55-578e-4ab0-8a31-7a3df001a16c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.249571 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"]
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.250947 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.259544 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"]
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.270266 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"]
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.339468 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79345650-86f1-4fed-a0af-1a1cdb2fe403-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7\" (UID: \"79345650-86f1-4fed-a0af-1a1cdb2fe403\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.339560 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb2qw\" (UniqueName: \"kubernetes.io/projected/dc90cebb-4774-4b00-bd82-852b5c5af24d-kube-api-access-jb2qw\") pod \"obo-prometheus-operator-668cf9dfbb-vg27w\" (UID: \"dc90cebb-4774-4b00-bd82-852b5c5af24d\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.339671 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79345650-86f1-4fed-a0af-1a1cdb2fe403-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7\" (UID: \"79345650-86f1-4fed-a0af-1a1cdb2fe403\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.339730 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a0c92f55-578e-4ab0-8a31-7a3df001a16c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk\" (UID: \"a0c92f55-578e-4ab0-8a31-7a3df001a16c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.339779 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a0c92f55-578e-4ab0-8a31-7a3df001a16c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk\" (UID: \"a0c92f55-578e-4ab0-8a31-7a3df001a16c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.356151 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a0c92f55-578e-4ab0-8a31-7a3df001a16c-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk\" (UID: \"a0c92f55-578e-4ab0-8a31-7a3df001a16c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.371654 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a0c92f55-578e-4ab0-8a31-7a3df001a16c-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk\" (UID: \"a0c92f55-578e-4ab0-8a31-7a3df001a16c\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.387116 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb2qw\" (UniqueName: \"kubernetes.io/projected/dc90cebb-4774-4b00-bd82-852b5c5af24d-kube-api-access-jb2qw\") pod \"obo-prometheus-operator-668cf9dfbb-vg27w\" (UID: \"dc90cebb-4774-4b00-bd82-852b5c5af24d\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.421545 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-d9hvf"]
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.423001 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.427614 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-b7629"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.427806 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.442827 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79345650-86f1-4fed-a0af-1a1cdb2fe403-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7\" (UID: \"79345650-86f1-4fed-a0af-1a1cdb2fe403\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.443208 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c86178b-2737-47e6-8146-3f508757d0ba-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-d9hvf\" (UID: \"5c86178b-2737-47e6-8146-3f508757d0ba\") " pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.443393 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79345650-86f1-4fed-a0af-1a1cdb2fe403-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7\" (UID: \"79345650-86f1-4fed-a0af-1a1cdb2fe403\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.443903 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprwb\" (UniqueName: \"kubernetes.io/projected/5c86178b-2737-47e6-8146-3f508757d0ba-kube-api-access-jprwb\") pod \"observability-operator-d8bb48f5d-d9hvf\" (UID: \"5c86178b-2737-47e6-8146-3f508757d0ba\") " pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.447021 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.448968 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/79345650-86f1-4fed-a0af-1a1cdb2fe403-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7\" (UID: \"79345650-86f1-4fed-a0af-1a1cdb2fe403\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.459078 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/79345650-86f1-4fed-a0af-1a1cdb2fe403-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7\" (UID: \"79345650-86f1-4fed-a0af-1a1cdb2fe403\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.459855 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-d9hvf"]
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.550682 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprwb\" (UniqueName: \"kubernetes.io/projected/5c86178b-2737-47e6-8146-3f508757d0ba-kube-api-access-jprwb\") pod \"observability-operator-d8bb48f5d-d9hvf\" (UID: \"5c86178b-2737-47e6-8146-3f508757d0ba\") " pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.550795 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c86178b-2737-47e6-8146-3f508757d0ba-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-d9hvf\" (UID: \"5c86178b-2737-47e6-8146-3f508757d0ba\") " pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.556152 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/5c86178b-2737-47e6-8146-3f508757d0ba-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-d9hvf\" (UID: \"5c86178b-2737-47e6-8146-3f508757d0ba\") " pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.587920 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.593533 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprwb\" (UniqueName: \"kubernetes.io/projected/5c86178b-2737-47e6-8146-3f508757d0ba-kube-api-access-jprwb\") pod \"observability-operator-d8bb48f5d-d9hvf\" (UID: \"5c86178b-2737-47e6-8146-3f508757d0ba\") " pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.595899 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.612746 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.672552 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-kj52t"]
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.674178 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-kj52t"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.682201 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-j44xt"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.694290 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-kj52t"]
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.758372 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/13b9e98a-c5f9-4ce7-b114-bfd48c42c147-openshift-service-ca\") pod \"perses-operator-5446b9c989-kj52t\" (UID: \"13b9e98a-c5f9-4ce7-b114-bfd48c42c147\") " pod="openshift-operators/perses-operator-5446b9c989-kj52t"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.758615 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xglr\" (UniqueName: \"kubernetes.io/projected/13b9e98a-c5f9-4ce7-b114-bfd48c42c147-kube-api-access-2xglr\") pod \"perses-operator-5446b9c989-kj52t\" (UID: \"13b9e98a-c5f9-4ce7-b114-bfd48c42c147\") " pod="openshift-operators/perses-operator-5446b9c989-kj52t"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.860929 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/13b9e98a-c5f9-4ce7-b114-bfd48c42c147-openshift-service-ca\") pod \"perses-operator-5446b9c989-kj52t\" (UID: \"13b9e98a-c5f9-4ce7-b114-bfd48c42c147\") " pod="openshift-operators/perses-operator-5446b9c989-kj52t"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.860975 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xglr\" (UniqueName: \"kubernetes.io/projected/13b9e98a-c5f9-4ce7-b114-bfd48c42c147-kube-api-access-2xglr\") pod \"perses-operator-5446b9c989-kj52t\" (UID: \"13b9e98a-c5f9-4ce7-b114-bfd48c42c147\") " pod="openshift-operators/perses-operator-5446b9c989-kj52t"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.864001 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/13b9e98a-c5f9-4ce7-b114-bfd48c42c147-openshift-service-ca\") pod \"perses-operator-5446b9c989-kj52t\" (UID: \"13b9e98a-c5f9-4ce7-b114-bfd48c42c147\") " pod="openshift-operators/perses-operator-5446b9c989-kj52t"
Dec 03 09:11:15 crc kubenswrapper[4947]: I1203 09:11:15.891149 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xglr\" (UniqueName: \"kubernetes.io/projected/13b9e98a-c5f9-4ce7-b114-bfd48c42c147-kube-api-access-2xglr\") pod \"perses-operator-5446b9c989-kj52t\" (UID: \"13b9e98a-c5f9-4ce7-b114-bfd48c42c147\") " pod="openshift-operators/perses-operator-5446b9c989-kj52t"
Dec 03 09:11:16 crc kubenswrapper[4947]: I1203 09:11:16.051008 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-kj52t"
Dec 03 09:11:16 crc kubenswrapper[4947]: I1203 09:11:16.055895 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-565rb"]
Dec 03 09:11:16 crc kubenswrapper[4947]: I1203 09:11:16.085185 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-565rb"]
Dec 03 09:11:16 crc kubenswrapper[4947]: I1203 09:11:16.218956 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w"]
Dec 03 09:11:16 crc kubenswrapper[4947]: I1203 09:11:16.542892 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w" event={"ID":"dc90cebb-4774-4b00-bd82-852b5c5af24d","Type":"ContainerStarted","Data":"6dcffd54b9b3b3fcd2321ab9f0736906bda287a77206be7a20d74d5392da8d4a"}
Dec 03 09:11:16 crc kubenswrapper[4947]: I1203 09:11:16.611082 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7"]
Dec 03 09:11:16 crc kubenswrapper[4947]: I1203 09:11:16.651528 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-d9hvf"]
Dec 03 09:11:16 crc kubenswrapper[4947]: I1203 09:11:16.689992 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk"]
Dec 03 09:11:16 crc kubenswrapper[4947]: I1203 09:11:16.751454 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-kj52t"]
Dec 03 09:11:17 crc kubenswrapper[4947]: I1203 09:11:17.097896 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354678be-9852-4bb6-ab81-137e937fde3b" path="/var/lib/kubelet/pods/354678be-9852-4bb6-ab81-137e937fde3b/volumes"
Dec 03 09:11:17 crc kubenswrapper[4947]: I1203 09:11:17.574127 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk" event={"ID":"a0c92f55-578e-4ab0-8a31-7a3df001a16c","Type":"ContainerStarted","Data":"78e215d6aed1d9a30626ebd6b5e43b71fe15cb1b88c3ffe4a1af08b79035b09a"}
Dec 03 09:11:17 crc kubenswrapper[4947]: I1203 09:11:17.579592 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7" event={"ID":"79345650-86f1-4fed-a0af-1a1cdb2fe403","Type":"ContainerStarted","Data":"7e89562b3d526ee73b74d0cb1029bddf4e68213ee118c979ec3625588e10b0f2"}
Dec 03 09:11:17 crc kubenswrapper[4947]: I1203 09:11:17.593636 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf" event={"ID":"5c86178b-2737-47e6-8146-3f508757d0ba","Type":"ContainerStarted","Data":"9fc098bb5d7f835c973812f7fe9505a324eb1274390f0b021dc81dbe269940cc"}
Dec 03 09:11:17 crc kubenswrapper[4947]: I1203 09:11:17.609746 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-kj52t" event={"ID":"13b9e98a-c5f9-4ce7-b114-bfd48c42c147","Type":"ContainerStarted","Data":"18e715bbf0b25c6ec9da459b3dbb00b23050c1589735d02fe1757cba876a1800"}
Dec 03 09:11:18 crc kubenswrapper[4947]: I1203 09:11:18.084143 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca"
Dec 03 09:11:18 crc kubenswrapper[4947]: E1203 09:11:18.084535 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:11:20 crc kubenswrapper[4947]: I1203 09:11:20.757039 4947 scope.go:117] "RemoveContainer" containerID="0dcbe87c3a5e133304b06cc7c7a9ce6b08fa1be553daffedaf8ad6ee3179b9bf"
Dec 03 09:11:23 crc kubenswrapper[4947]: I1203 09:11:23.523118 4947 scope.go:117] "RemoveContainer" containerID="3beb70c60e68bf929872d622dda84b695434e1b415aedf90decdfbc8636d723b"
Dec 03 09:11:25 crc kubenswrapper[4947]: I1203 09:11:25.967558 4947 scope.go:117] "RemoveContainer" containerID="ce16febdf607363810ef12cd31b0836935f3fa39b8988f042b29db6886d5e84d"
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.039196 4947 scope.go:117] "RemoveContainer" containerID="100b23c66bbbe88856e28fbc130c17c71d54adcb36c332a7a27521c2da23a932"
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.078043 4947 scope.go:117] "RemoveContainer" containerID="fac27623464064e5eaad635bafab3fd40e58eb39881c635dfce7570a7e291923"
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.209323 4947 scope.go:117] "RemoveContainer" containerID="c093257dc35bef67ec8c14c5ace1091f16c97aafc4c599cbb76f7cccb2e36539"
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.327710 4947 scope.go:117] "RemoveContainer" containerID="7640fceceb75a5b96d15cfd4788e1c48a215300db98621c893df5c92ca81ec90"
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.375305 4947 scope.go:117] "RemoveContainer" containerID="d9cef58309b351e3c7551d971b6c4de87f629ebbab948fb89d9f663e6e0e672c"
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.710637 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-kj52t" event={"ID":"13b9e98a-c5f9-4ce7-b114-bfd48c42c147","Type":"ContainerStarted","Data":"43dc53177e598c3241605e72241a9f9b55e77a36eab7e60d0d672a3cb3826ec3"}
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.710684 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-kj52t"
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.728890 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7" event={"ID":"79345650-86f1-4fed-a0af-1a1cdb2fe403","Type":"ContainerStarted","Data":"aa92b656fcfb31ac057c204aa613447def218b469e61d072310e19e9f4ca7534"}
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.738020 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk" event={"ID":"a0c92f55-578e-4ab0-8a31-7a3df001a16c","Type":"ContainerStarted","Data":"ff7e4cb64af847a08f5645d6e113f470aad80375a6f0cf823f07c98a0195dc47"}
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.748421 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-kj52t" podStartSLOduration=2.466862727 podStartE2EDuration="11.748401629s" podCreationTimestamp="2025-12-03 09:11:15 +0000 UTC" firstStartedPulling="2025-12-03 09:11:16.755547448 +0000 UTC m=+8538.016501874" lastFinishedPulling="2025-12-03 09:11:26.03708635 +0000 UTC m=+8547.298040776" observedRunningTime="2025-12-03 09:11:26.743572079 +0000 UTC m=+8548.004526505" watchObservedRunningTime="2025-12-03 09:11:26.748401629 +0000 UTC m=+8548.009356055"
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.783995 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7" podStartSLOduration=2.377381009 podStartE2EDuration="11.783959392s" podCreationTimestamp="2025-12-03 09:11:15 +0000 UTC" firstStartedPulling="2025-12-03 09:11:16.630486076 +0000 UTC m=+8537.891440512" lastFinishedPulling="2025-12-03 09:11:26.037064469 +0000 UTC m=+8547.298018895" observedRunningTime="2025-12-03 09:11:26.780718394 +0000 UTC m=+8548.041672820" watchObservedRunningTime="2025-12-03 09:11:26.783959392 +0000 UTC m=+8548.044913818"
Dec 03 09:11:26 crc kubenswrapper[4947]: I1203 09:11:26.823606 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk" podStartSLOduration=2.520659985 podStartE2EDuration="11.823587224s" podCreationTimestamp="2025-12-03 09:11:15 +0000 UTC" firstStartedPulling="2025-12-03 09:11:16.665940575 +0000 UTC m=+8537.926895001" lastFinishedPulling="2025-12-03 09:11:25.968867814 +0000 UTC m=+8547.229822240" observedRunningTime="2025-12-03 09:11:26.804179178 +0000 UTC m=+8548.065133604" watchObservedRunningTime="2025-12-03 09:11:26.823587224 +0000 UTC m=+8548.084541650"
Dec 03 09:11:27 crc kubenswrapper[4947]: I1203 09:11:27.750208 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf" event={"ID":"5c86178b-2737-47e6-8146-3f508757d0ba","Type":"ContainerStarted","Data":"bead9919e95e39df54a60f2daea8bf5077fa43c927abe6b6d0b203338b9b30ec"}
Dec 03 09:11:27 crc kubenswrapper[4947]: I1203 09:11:27.751060 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf"
Dec 03 09:11:27 crc kubenswrapper[4947]: I1203 09:11:27.752509 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w" event={"ID":"dc90cebb-4774-4b00-bd82-852b5c5af24d","Type":"ContainerStarted","Data":"3d422a41a74e7d9bdec9ca36e79a2334d47a766d877043635f4bc9f722cb4d3f"}
Dec 03 09:11:27 crc kubenswrapper[4947]: I1203 09:11:27.804641 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-vg27w" podStartSLOduration=4.073799173 podStartE2EDuration="13.804618138s" podCreationTimestamp="2025-12-03 09:11:14 +0000 UTC" firstStartedPulling="2025-12-03 09:11:16.236978641 +0000 UTC m=+8537.497933067" lastFinishedPulling="2025-12-03 09:11:25.967797606 +0000 UTC m=+8547.228752032"
observedRunningTime="2025-12-03 09:11:27.790318402 +0000 UTC m=+8549.051272838" watchObservedRunningTime="2025-12-03 09:11:27.804618138 +0000 UTC m=+8549.065572594" Dec 03 09:11:27 crc kubenswrapper[4947]: I1203 09:11:27.805197 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf" podStartSLOduration=3.390824939 podStartE2EDuration="12.805191104s" podCreationTimestamp="2025-12-03 09:11:15 +0000 UTC" firstStartedPulling="2025-12-03 09:11:16.664224218 +0000 UTC m=+8537.925178644" lastFinishedPulling="2025-12-03 09:11:26.078590393 +0000 UTC m=+8547.339544809" observedRunningTime="2025-12-03 09:11:27.776618701 +0000 UTC m=+8549.037573167" watchObservedRunningTime="2025-12-03 09:11:27.805191104 +0000 UTC m=+8549.066145530" Dec 03 09:11:27 crc kubenswrapper[4947]: I1203 09:11:27.835318 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-d9hvf" Dec 03 09:11:33 crc kubenswrapper[4947]: I1203 09:11:33.084521 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:11:33 crc kubenswrapper[4947]: E1203 09:11:33.085195 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:11:36 crc kubenswrapper[4947]: I1203 09:11:36.054106 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-kj52t" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.285102 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/openstackclient"] Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.285895 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="4331d824-805d-455d-af0a-585746ab0364" containerName="openstackclient" containerID="cri-o://19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf" gracePeriod=2 Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.307234 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.410663 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 09:11:39 crc kubenswrapper[4947]: E1203 09:11:39.411240 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4331d824-805d-455d-af0a-585746ab0364" containerName="openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.411257 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4331d824-805d-455d-af0a-585746ab0364" containerName="openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.411515 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4331d824-805d-455d-af0a-585746ab0364" containerName="openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.412463 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.421420 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4331d824-805d-455d-af0a-585746ab0364" podUID="4eb4aece-adc4-4e88-adaa-cfef079adb04" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.437995 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.452860 4947 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eb4aece-adc4-4e88-adaa-cfef079adb04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T09:11:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T09:11:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T09:11:39Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T09:11:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.rdoproject.org/podified-antelope-centos9/openstack-openstackclient:65066e8ca260a75886ae57f157049605\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lzxbs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T09:11:39Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.455572 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.473948 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 09:11:39 crc kubenswrapper[4947]: E1203 09:11:39.480185 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-lzxbs openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-lzxbs openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="4eb4aece-adc4-4e88-adaa-cfef079adb04" Dec 
03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.506338 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.508084 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.519975 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.520826 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4eb4aece-adc4-4e88-adaa-cfef079adb04" podUID="3de32e8a-5c89-4684-b401-cdac764e2b5b" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.601338 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf9gx\" (UniqueName: \"kubernetes.io/projected/3de32e8a-5c89-4684-b401-cdac764e2b5b-kube-api-access-kf9gx\") pod \"openstackclient\" (UID: \"3de32e8a-5c89-4684-b401-cdac764e2b5b\") " pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.601440 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3de32e8a-5c89-4684-b401-cdac764e2b5b-openstack-config\") pod \"openstackclient\" (UID: \"3de32e8a-5c89-4684-b401-cdac764e2b5b\") " pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.601545 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3de32e8a-5c89-4684-b401-cdac764e2b5b-openstack-config-secret\") pod \"openstackclient\" (UID: \"3de32e8a-5c89-4684-b401-cdac764e2b5b\") " pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.717583 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf9gx\" (UniqueName: \"kubernetes.io/projected/3de32e8a-5c89-4684-b401-cdac764e2b5b-kube-api-access-kf9gx\") pod \"openstackclient\" (UID: \"3de32e8a-5c89-4684-b401-cdac764e2b5b\") " pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.717668 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3de32e8a-5c89-4684-b401-cdac764e2b5b-openstack-config\") pod \"openstackclient\" (UID: \"3de32e8a-5c89-4684-b401-cdac764e2b5b\") " pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.717737 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3de32e8a-5c89-4684-b401-cdac764e2b5b-openstack-config-secret\") pod \"openstackclient\" (UID: \"3de32e8a-5c89-4684-b401-cdac764e2b5b\") " pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.720677 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3de32e8a-5c89-4684-b401-cdac764e2b5b-openstack-config\") pod \"openstackclient\" (UID: \"3de32e8a-5c89-4684-b401-cdac764e2b5b\") " pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.728706 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.730385 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.736780 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-75xmw" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.739708 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3de32e8a-5c89-4684-b401-cdac764e2b5b-openstack-config-secret\") pod \"openstackclient\" (UID: \"3de32e8a-5c89-4684-b401-cdac764e2b5b\") " pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.747928 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.769058 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf9gx\" (UniqueName: \"kubernetes.io/projected/3de32e8a-5c89-4684-b401-cdac764e2b5b-kube-api-access-kf9gx\") pod \"openstackclient\" (UID: \"3de32e8a-5c89-4684-b401-cdac764e2b5b\") " pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.853142 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.912435 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.924359 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g279h\" (UniqueName: \"kubernetes.io/projected/bd56123a-a7b6-431e-aaa2-375bff6c9627-kube-api-access-g279h\") pod \"kube-state-metrics-0\" (UID: \"bd56123a-a7b6-431e-aaa2-375bff6c9627\") " pod="openstack/kube-state-metrics-0" Dec 03 09:11:39 crc kubenswrapper[4947]: I1203 09:11:39.937799 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4eb4aece-adc4-4e88-adaa-cfef079adb04" podUID="3de32e8a-5c89-4684-b401-cdac764e2b5b" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.025969 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g279h\" (UniqueName: \"kubernetes.io/projected/bd56123a-a7b6-431e-aaa2-375bff6c9627-kube-api-access-g279h\") pod \"kube-state-metrics-0\" (UID: \"bd56123a-a7b6-431e-aaa2-375bff6c9627\") " pod="openstack/kube-state-metrics-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.037792 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.100345 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4eb4aece-adc4-4e88-adaa-cfef079adb04" podUID="3de32e8a-5c89-4684-b401-cdac764e2b5b" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.141337 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g279h\" (UniqueName: \"kubernetes.io/projected/bd56123a-a7b6-431e-aaa2-375bff6c9627-kube-api-access-g279h\") pod \"kube-state-metrics-0\" (UID: \"bd56123a-a7b6-431e-aaa2-375bff6c9627\") " pod="openstack/kube-state-metrics-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.164973 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.543515 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.565473 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.578805 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.578971 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.579293 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.579303 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-p565m" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.587989 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.601778 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.639410 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d601641a-4a35-480d-a171-e058a567adcd-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.639465 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7lq2\" (UniqueName: \"kubernetes.io/projected/d601641a-4a35-480d-a171-e058a567adcd-kube-api-access-m7lq2\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 
03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.639618 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d601641a-4a35-480d-a171-e058a567adcd-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.639681 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d601641a-4a35-480d-a171-e058a567adcd-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.639718 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d601641a-4a35-480d-a171-e058a567adcd-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.639752 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d601641a-4a35-480d-a171-e058a567adcd-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.639783 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d601641a-4a35-480d-a171-e058a567adcd-web-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.741314 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d601641a-4a35-480d-a171-e058a567adcd-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.741384 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7lq2\" (UniqueName: \"kubernetes.io/projected/d601641a-4a35-480d-a171-e058a567adcd-kube-api-access-m7lq2\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.741413 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d601641a-4a35-480d-a171-e058a567adcd-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.741486 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d601641a-4a35-480d-a171-e058a567adcd-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.741553 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d601641a-4a35-480d-a171-e058a567adcd-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: 
\"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.741598 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d601641a-4a35-480d-a171-e058a567adcd-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.741650 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d601641a-4a35-480d-a171-e058a567adcd-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.759002 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d601641a-4a35-480d-a171-e058a567adcd-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.761926 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d601641a-4a35-480d-a171-e058a567adcd-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.770670 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d601641a-4a35-480d-a171-e058a567adcd-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " 
pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.771066 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d601641a-4a35-480d-a171-e058a567adcd-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.780913 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d601641a-4a35-480d-a171-e058a567adcd-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.781682 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d601641a-4a35-480d-a171-e058a567adcd-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.799004 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7lq2\" (UniqueName: \"kubernetes.io/projected/d601641a-4a35-480d-a171-e058a567adcd-kube-api-access-m7lq2\") pod \"alertmanager-metric-storage-0\" (UID: \"d601641a-4a35-480d-a171-e058a567adcd\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.906213 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.930972 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.934132 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 09:11:40 crc kubenswrapper[4947]: I1203 09:11:40.938872 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4eb4aece-adc4-4e88-adaa-cfef079adb04" podUID="3de32e8a-5c89-4684-b401-cdac764e2b5b" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.121923 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4eb4aece-adc4-4e88-adaa-cfef079adb04" podUID="3de32e8a-5c89-4684-b401-cdac764e2b5b" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.139235 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb4aece-adc4-4e88-adaa-cfef079adb04" path="/var/lib/kubelet/pods/4eb4aece-adc4-4e88-adaa-cfef079adb04/volumes" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.240399 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.250449 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.254157 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.254876 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.256031 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.256392 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.256453 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-58xd7" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.256710 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.275669 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.301725 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.397151 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 
09:11:41.397285 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-config\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.397331 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.397419 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.397455 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8611258d-037b-4b91-a11c-3c8e9cbc139f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8611258d-037b-4b91-a11c-3c8e9cbc139f\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.397482 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " 
pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.397525 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.397568 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcksd\" (UniqueName: \"kubernetes.io/projected/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-kube-api-access-vcksd\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.499520 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.499578 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8611258d-037b-4b91-a11c-3c8e9cbc139f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8611258d-037b-4b91-a11c-3c8e9cbc139f\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.499614 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-web-config\") pod \"prometheus-metric-storage-0\" (UID: 
\"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.499638 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.499688 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcksd\" (UniqueName: \"kubernetes.io/projected/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-kube-api-access-vcksd\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.500558 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.500680 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-config\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.500738 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.501770 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.505250 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-config\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.505967 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.506643 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.507313 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " 
pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.519329 4947 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.519384 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8611258d-037b-4b91-a11c-3c8e9cbc139f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8611258d-037b-4b91-a11c-3c8e9cbc139f\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/06e746eb250d80d8fb40490046d8e84fca25e9bdafda02a72b1c6a0102707b85/globalmount\"" pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.523300 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.527101 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcksd\" (UniqueName: \"kubernetes.io/projected/8647ce59-dce8-4205-9ba9-2f5dfbbea9bb-kube-api-access-vcksd\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.563702 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8611258d-037b-4b91-a11c-3c8e9cbc139f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8611258d-037b-4b91-a11c-3c8e9cbc139f\") pod \"prometheus-metric-storage-0\" (UID: \"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb\") " 
pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.602297 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.629082 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 09:11:41 crc kubenswrapper[4947]: W1203 09:11:41.636455 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd601641a_4a35_480d_a171_e058a567adcd.slice/crio-7f3434cfc8b81aae5878b5a66ae42dbf434705b3e1c2196052d882a549af91a9 WatchSource:0}: Error finding container 7f3434cfc8b81aae5878b5a66ae42dbf434705b3e1c2196052d882a549af91a9: Status 404 returned error can't find the container with id 7f3434cfc8b81aae5878b5a66ae42dbf434705b3e1c2196052d882a549af91a9 Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.653917 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.806620 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4331d824-805d-455d-af0a-585746ab0364-openstack-config\") pod \"4331d824-805d-455d-af0a-585746ab0364\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.807005 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfj7t\" (UniqueName: \"kubernetes.io/projected/4331d824-805d-455d-af0a-585746ab0364-kube-api-access-dfj7t\") pod \"4331d824-805d-455d-af0a-585746ab0364\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.807121 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4331d824-805d-455d-af0a-585746ab0364-openstack-config-secret\") pod \"4331d824-805d-455d-af0a-585746ab0364\" (UID: \"4331d824-805d-455d-af0a-585746ab0364\") " Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.813607 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4331d824-805d-455d-af0a-585746ab0364-kube-api-access-dfj7t" (OuterVolumeSpecName: "kube-api-access-dfj7t") pod "4331d824-805d-455d-af0a-585746ab0364" (UID: "4331d824-805d-455d-af0a-585746ab0364"). InnerVolumeSpecName "kube-api-access-dfj7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.866551 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4331d824-805d-455d-af0a-585746ab0364-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4331d824-805d-455d-af0a-585746ab0364" (UID: "4331d824-805d-455d-af0a-585746ab0364"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.883886 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4331d824-805d-455d-af0a-585746ab0364-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4331d824-805d-455d-af0a-585746ab0364" (UID: "4331d824-805d-455d-af0a-585746ab0364"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.910631 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4331d824-805d-455d-af0a-585746ab0364-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.910666 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfj7t\" (UniqueName: \"kubernetes.io/projected/4331d824-805d-455d-af0a-585746ab0364-kube-api-access-dfj7t\") on node \"crc\" DevicePath \"\"" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.910678 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4331d824-805d-455d-af0a-585746ab0364-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.949317 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d601641a-4a35-480d-a171-e058a567adcd","Type":"ContainerStarted","Data":"7f3434cfc8b81aae5878b5a66ae42dbf434705b3e1c2196052d882a549af91a9"} Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.950979 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"bd56123a-a7b6-431e-aaa2-375bff6c9627","Type":"ContainerStarted","Data":"53b9dff241afba112a2513f870e3a61886c670e84b0b1f6dff93c227613af5fb"} Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.952452 4947 generic.go:334] "Generic (PLEG): container finished" podID="4331d824-805d-455d-af0a-585746ab0364" containerID="19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf" exitCode=137 Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.952511 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.952542 4947 scope.go:117] "RemoveContainer" containerID="19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.956403 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3de32e8a-5c89-4684-b401-cdac764e2b5b","Type":"ContainerStarted","Data":"b0d8d606b1c676cd5b3b3751d5092e32b88987da800f880212a904f7997c806c"} Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.956463 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3de32e8a-5c89-4684-b401-cdac764e2b5b","Type":"ContainerStarted","Data":"236ae8a1e2c244fe2f24da6394da0c423e5de7a986bed2f4bc4787cfe05bd75e"} Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.987764 4947 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="4331d824-805d-455d-af0a-585746ab0364" podUID="3de32e8a-5c89-4684-b401-cdac764e2b5b" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.994306 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.994282277 podStartE2EDuration="2.994282277s" podCreationTimestamp="2025-12-03 09:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:11:41.974051389 +0000 UTC m=+8563.235005815" watchObservedRunningTime="2025-12-03 09:11:41.994282277 +0000 UTC m=+8563.255236703" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.994790 4947 scope.go:117] "RemoveContainer" containerID="19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf" Dec 03 09:11:41 crc kubenswrapper[4947]: E1203 09:11:41.995126 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf\": container with ID starting with 19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf not found: ID does not exist" containerID="19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf" Dec 03 09:11:41 crc kubenswrapper[4947]: I1203 09:11:41.995149 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf"} err="failed to get container status \"19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf\": rpc error: code = NotFound desc = could not find container \"19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf\": container with ID starting with 19f8253dec3ce074344b7772a491e1fed8663e02ce71520faef32efa5c0485cf not found: ID does not exist" Dec 03 09:11:42 crc kubenswrapper[4947]: I1203 09:11:42.230219 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 09:11:42 crc kubenswrapper[4947]: I1203 09:11:42.974634 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bd56123a-a7b6-431e-aaa2-375bff6c9627","Type":"ContainerStarted","Data":"0202e3827cab2bcca04637878bd679e42fa661b022a3cabc39ab0a0a735e7b5c"} Dec 03 09:11:42 crc kubenswrapper[4947]: I1203 09:11:42.976153 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 09:11:42 crc kubenswrapper[4947]: I1203 09:11:42.982173 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb","Type":"ContainerStarted","Data":"d0215f08108f2549e69717d084807dc03b25f3435c12f6b0a4de4a77ac5dd8d5"} Dec 03 09:11:42 crc kubenswrapper[4947]: I1203 09:11:42.999685 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.528464347 podStartE2EDuration="3.999662471s" podCreationTimestamp="2025-12-03 09:11:39 +0000 UTC" firstStartedPulling="2025-12-03 09:11:41.214189167 +0000 UTC m=+8562.475143593" lastFinishedPulling="2025-12-03 09:11:41.685387291 +0000 UTC m=+8562.946341717" observedRunningTime="2025-12-03 09:11:42.990173384 +0000 UTC m=+8564.251127810" watchObservedRunningTime="2025-12-03 09:11:42.999662471 +0000 UTC m=+8564.260616897" Dec 03 09:11:43 crc kubenswrapper[4947]: I1203 09:11:43.106798 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4331d824-805d-455d-af0a-585746ab0364" path="/var/lib/kubelet/pods/4331d824-805d-455d-af0a-585746ab0364/volumes" Dec 03 09:11:44 crc kubenswrapper[4947]: I1203 09:11:44.083407 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:11:44 crc kubenswrapper[4947]: E1203 09:11:44.085055 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:11:48 crc kubenswrapper[4947]: 
I1203 09:11:48.035026 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb","Type":"ContainerStarted","Data":"8dfd5267f42c18e342a9681eee9ffae3a878b1008b1463d12eb7b329af172e1b"} Dec 03 09:11:48 crc kubenswrapper[4947]: I1203 09:11:48.037666 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d601641a-4a35-480d-a171-e058a567adcd","Type":"ContainerStarted","Data":"094c40681006ee4b7129ee3ee35b33528bb6d1744e06d1329e584bdfd581f3aa"} Dec 03 09:11:50 crc kubenswrapper[4947]: I1203 09:11:50.170683 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 09:11:55 crc kubenswrapper[4947]: I1203 09:11:55.083887 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:11:55 crc kubenswrapper[4947]: E1203 09:11:55.085131 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:11:55 crc kubenswrapper[4947]: I1203 09:11:55.120057 4947 generic.go:334] "Generic (PLEG): container finished" podID="8647ce59-dce8-4205-9ba9-2f5dfbbea9bb" containerID="8dfd5267f42c18e342a9681eee9ffae3a878b1008b1463d12eb7b329af172e1b" exitCode=0 Dec 03 09:11:55 crc kubenswrapper[4947]: I1203 09:11:55.120163 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb","Type":"ContainerDied","Data":"8dfd5267f42c18e342a9681eee9ffae3a878b1008b1463d12eb7b329af172e1b"} Dec 03 09:11:55 crc kubenswrapper[4947]: I1203 09:11:55.127371 4947 generic.go:334] "Generic (PLEG): container finished" podID="d601641a-4a35-480d-a171-e058a567adcd" containerID="094c40681006ee4b7129ee3ee35b33528bb6d1744e06d1329e584bdfd581f3aa" exitCode=0 Dec 03 09:11:55 crc kubenswrapper[4947]: I1203 09:11:55.127408 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d601641a-4a35-480d-a171-e058a567adcd","Type":"ContainerDied","Data":"094c40681006ee4b7129ee3ee35b33528bb6d1744e06d1329e584bdfd581f3aa"} Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.525691 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c2csq"] Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.528409 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.542126 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c2csq"] Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.642236 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-catalog-content\") pod \"community-operators-c2csq\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.642292 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-utilities\") pod \"community-operators-c2csq\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.642699 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtlq\" (UniqueName: \"kubernetes.io/projected/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-kube-api-access-gjtlq\") pod \"community-operators-c2csq\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.744755 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-catalog-content\") pod \"community-operators-c2csq\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.744940 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-utilities\") pod \"community-operators-c2csq\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.745043 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtlq\" (UniqueName: \"kubernetes.io/projected/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-kube-api-access-gjtlq\") pod \"community-operators-c2csq\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.745349 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-catalog-content\") pod \"community-operators-c2csq\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.745398 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-utilities\") pod \"community-operators-c2csq\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.768350 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtlq\" (UniqueName: \"kubernetes.io/projected/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-kube-api-access-gjtlq\") pod \"community-operators-c2csq\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:06 crc kubenswrapper[4947]: I1203 09:12:06.850027 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:07 crc kubenswrapper[4947]: I1203 09:12:07.083970 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:12:07 crc kubenswrapper[4947]: E1203 09:12:07.084587 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:12:07 crc kubenswrapper[4947]: I1203 09:12:07.376405 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c2csq"] Dec 03 09:12:08 crc kubenswrapper[4947]: I1203 09:12:08.287199 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb","Type":"ContainerStarted","Data":"e0eecfc7358034d3744be69ac2ee29cbcd90c7ec758601b15fc51ca956edd8d9"} Dec 03 09:12:08 crc kubenswrapper[4947]: I1203 09:12:08.289207 4947 generic.go:334] "Generic (PLEG): container finished" podID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerID="3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a" exitCode=0 Dec 03 09:12:08 crc kubenswrapper[4947]: I1203 09:12:08.290440 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2csq" event={"ID":"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1","Type":"ContainerDied","Data":"3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a"} Dec 03 09:12:08 crc kubenswrapper[4947]: I1203 09:12:08.290473 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2csq" 
event={"ID":"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1","Type":"ContainerStarted","Data":"7a3a69ef34311bd7d06d1c2f7bbd75c75b3198b32c466f151bc0388d2b03dba4"} Dec 03 09:12:08 crc kubenswrapper[4947]: I1203 09:12:08.293422 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d601641a-4a35-480d-a171-e058a567adcd","Type":"ContainerStarted","Data":"67578fc09873dad01b45e5535e3c28428a56d71d6fb6d795a262e982cc97edbc"} Dec 03 09:12:11 crc kubenswrapper[4947]: I1203 09:12:11.333831 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb","Type":"ContainerStarted","Data":"d475c1eef0b73537ab63aca8d69107ebd60aa0b0b451b1adee12163e68dc7b78"} Dec 03 09:12:11 crc kubenswrapper[4947]: I1203 09:12:11.335978 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2csq" event={"ID":"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1","Type":"ContainerStarted","Data":"84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b"} Dec 03 09:12:11 crc kubenswrapper[4947]: I1203 09:12:11.344101 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"d601641a-4a35-480d-a171-e058a567adcd","Type":"ContainerStarted","Data":"e61ab134a3158051cf264c75ea6db0761137ec1f578c6190aab4b1c7a7b5dbc5"} Dec 03 09:12:11 crc kubenswrapper[4947]: I1203 09:12:11.345598 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 03 09:12:11 crc kubenswrapper[4947]: I1203 09:12:11.347774 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 03 09:12:11 crc kubenswrapper[4947]: I1203 09:12:11.377092 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.196193351 
podStartE2EDuration="31.377072044s" podCreationTimestamp="2025-12-03 09:11:40 +0000 UTC" firstStartedPulling="2025-12-03 09:11:41.641955377 +0000 UTC m=+8562.902909803" lastFinishedPulling="2025-12-03 09:12:06.82283407 +0000 UTC m=+8588.083788496" observedRunningTime="2025-12-03 09:12:11.373322664 +0000 UTC m=+8592.634277100" watchObservedRunningTime="2025-12-03 09:12:11.377072044 +0000 UTC m=+8592.638026470" Dec 03 09:12:13 crc kubenswrapper[4947]: I1203 09:12:13.366110 4947 generic.go:334] "Generic (PLEG): container finished" podID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerID="84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b" exitCode=0 Dec 03 09:12:13 crc kubenswrapper[4947]: I1203 09:12:13.366204 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2csq" event={"ID":"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1","Type":"ContainerDied","Data":"84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b"} Dec 03 09:12:18 crc kubenswrapper[4947]: I1203 09:12:18.046250 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-kjzxl"] Dec 03 09:12:18 crc kubenswrapper[4947]: I1203 09:12:18.059371 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9521-account-create-update-hmkwn"] Dec 03 09:12:18 crc kubenswrapper[4947]: I1203 09:12:18.070669 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-kjzxl"] Dec 03 09:12:18 crc kubenswrapper[4947]: I1203 09:12:18.080894 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9521-account-create-update-hmkwn"] Dec 03 09:12:19 crc kubenswrapper[4947]: I1203 09:12:19.103716 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c17c928-65ee-4727-99c4-328cbc8ed2d5" path="/var/lib/kubelet/pods/2c17c928-65ee-4727-99c4-328cbc8ed2d5/volumes" Dec 03 09:12:19 crc kubenswrapper[4947]: I1203 09:12:19.105147 4947 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="f02dbcd4-a088-488a-ac51-3aa3015c81ff" path="/var/lib/kubelet/pods/f02dbcd4-a088-488a-ac51-3aa3015c81ff/volumes" Dec 03 09:12:20 crc kubenswrapper[4947]: I1203 09:12:20.083956 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:12:20 crc kubenswrapper[4947]: E1203 09:12:20.084596 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:12:21 crc kubenswrapper[4947]: I1203 09:12:21.464277 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2csq" event={"ID":"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1","Type":"ContainerStarted","Data":"99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38"} Dec 03 09:12:21 crc kubenswrapper[4947]: I1203 09:12:21.467540 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8647ce59-dce8-4205-9ba9-2f5dfbbea9bb","Type":"ContainerStarted","Data":"c594f8e2324b3c128067ae92783208e4182f0f7c4dc1e66d27d73bd4d933adec"} Dec 03 09:12:21 crc kubenswrapper[4947]: I1203 09:12:21.513857 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c2csq" podStartSLOduration=3.098375949 podStartE2EDuration="15.513838338s" podCreationTimestamp="2025-12-03 09:12:06 +0000 UTC" firstStartedPulling="2025-12-03 09:12:08.29243021 +0000 UTC m=+8589.553384636" lastFinishedPulling="2025-12-03 09:12:20.707892599 +0000 UTC m=+8601.968847025" observedRunningTime="2025-12-03 09:12:21.488873813 +0000 UTC 
m=+8602.749828259" watchObservedRunningTime="2025-12-03 09:12:21.513838338 +0000 UTC m=+8602.774792764" Dec 03 09:12:21 crc kubenswrapper[4947]: I1203 09:12:21.603699 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 09:12:26 crc kubenswrapper[4947]: I1203 09:12:26.603289 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 09:12:26 crc kubenswrapper[4947]: I1203 09:12:26.609663 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 09:12:26 crc kubenswrapper[4947]: I1203 09:12:26.651141 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=8.204272606 podStartE2EDuration="46.651114133s" podCreationTimestamp="2025-12-03 09:11:40 +0000 UTC" firstStartedPulling="2025-12-03 09:11:42.260931449 +0000 UTC m=+8563.521885875" lastFinishedPulling="2025-12-03 09:12:20.707772976 +0000 UTC m=+8601.968727402" observedRunningTime="2025-12-03 09:12:21.517077316 +0000 UTC m=+8602.778031762" watchObservedRunningTime="2025-12-03 09:12:26.651114133 +0000 UTC m=+8607.912068569" Dec 03 09:12:26 crc kubenswrapper[4947]: I1203 09:12:26.658364 4947 scope.go:117] "RemoveContainer" containerID="31433968ff91a8bb07a413ae78d7b236c76244fc27f60747006f6cc92bc4ca2d" Dec 03 09:12:26 crc kubenswrapper[4947]: I1203 09:12:26.696929 4947 scope.go:117] "RemoveContainer" containerID="f39ee615a6da237f990e6d0610b3f674b26516b59945590708bb9516180097b5" Dec 03 09:12:26 crc kubenswrapper[4947]: I1203 09:12:26.850509 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:26 crc kubenswrapper[4947]: I1203 09:12:26.850789 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:26 crc kubenswrapper[4947]: I1203 09:12:26.907536 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:27 crc kubenswrapper[4947]: I1203 09:12:27.542621 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 09:12:27 crc kubenswrapper[4947]: I1203 09:12:27.609213 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:27 crc kubenswrapper[4947]: I1203 09:12:27.663436 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c2csq"] Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.466667 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.470028 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.482889 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.483890 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.484113 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.559971 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c2csq" podUID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerName="registry-server" containerID="cri-o://99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38" gracePeriod=2 Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.647016 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-scripts\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.647063 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-log-httpd\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.647152 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-config-data\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 
03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.647187 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.647235 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-run-httpd\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.647297 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.647330 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l6rq\" (UniqueName: \"kubernetes.io/projected/c9cd9435-a4d8-4de1-ab22-f008af6736b5-kube-api-access-9l6rq\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.749407 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-scripts\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.749457 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-log-httpd\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.749583 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-config-data\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.749616 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.749680 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-run-httpd\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.749740 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.749764 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l6rq\" (UniqueName: \"kubernetes.io/projected/c9cd9435-a4d8-4de1-ab22-f008af6736b5-kube-api-access-9l6rq\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 
09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.750628 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-run-httpd\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.750704 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-log-httpd\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.761662 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-config-data\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.765442 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-scripts\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.766292 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.778122 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:29 crc kubenswrapper[4947]: I1203 09:12:29.844155 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l6rq\" (UniqueName: \"kubernetes.io/projected/c9cd9435-a4d8-4de1-ab22-f008af6736b5-kube-api-access-9l6rq\") pod \"ceilometer-0\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " pod="openstack/ceilometer-0" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.113590 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.221846 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.362300 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjtlq\" (UniqueName: \"kubernetes.io/projected/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-kube-api-access-gjtlq\") pod \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.362425 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-catalog-content\") pod \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.362570 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-utilities\") pod \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\" (UID: \"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1\") " Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.363446 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-utilities" (OuterVolumeSpecName: "utilities") pod "9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" (UID: "9078258b-c4e7-4e4a-a76f-3fd2a572d6e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.372404 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-kube-api-access-gjtlq" (OuterVolumeSpecName: "kube-api-access-gjtlq") pod "9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" (UID: "9078258b-c4e7-4e4a-a76f-3fd2a572d6e1"). InnerVolumeSpecName "kube-api-access-gjtlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.411558 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" (UID: "9078258b-c4e7-4e4a-a76f-3fd2a572d6e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.468361 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.468406 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.468420 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjtlq\" (UniqueName: \"kubernetes.io/projected/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1-kube-api-access-gjtlq\") on node \"crc\" DevicePath \"\"" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.572608 4947 generic.go:334] "Generic (PLEG): container finished" podID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerID="99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38" exitCode=0 Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.572662 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2csq" event={"ID":"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1","Type":"ContainerDied","Data":"99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38"} Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.572687 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c2csq" event={"ID":"9078258b-c4e7-4e4a-a76f-3fd2a572d6e1","Type":"ContainerDied","Data":"7a3a69ef34311bd7d06d1c2f7bbd75c75b3198b32c466f151bc0388d2b03dba4"} Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.572705 4947 scope.go:117] "RemoveContainer" containerID="99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 
09:12:30.572822 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c2csq" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.610714 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c2csq"] Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.619357 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c2csq"] Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.629988 4947 scope.go:117] "RemoveContainer" containerID="84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.653225 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.656338 4947 scope.go:117] "RemoveContainer" containerID="3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a" Dec 03 09:12:30 crc kubenswrapper[4947]: W1203 09:12:30.663021 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9cd9435_a4d8_4de1_ab22_f008af6736b5.slice/crio-828ea520a9331be1550bd3efd243658327f2849687947859a0df48686729290c WatchSource:0}: Error finding container 828ea520a9331be1550bd3efd243658327f2849687947859a0df48686729290c: Status 404 returned error can't find the container with id 828ea520a9331be1550bd3efd243658327f2849687947859a0df48686729290c Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.677127 4947 scope.go:117] "RemoveContainer" containerID="99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38" Dec 03 09:12:30 crc kubenswrapper[4947]: E1203 09:12:30.677800 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38\": container with 
ID starting with 99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38 not found: ID does not exist" containerID="99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.677825 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38"} err="failed to get container status \"99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38\": rpc error: code = NotFound desc = could not find container \"99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38\": container with ID starting with 99796eecc9d15257681fda23167e8f71cb8d741794fbb3128a420029b150ce38 not found: ID does not exist" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.677842 4947 scope.go:117] "RemoveContainer" containerID="84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b" Dec 03 09:12:30 crc kubenswrapper[4947]: E1203 09:12:30.678325 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b\": container with ID starting with 84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b not found: ID does not exist" containerID="84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.678343 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b"} err="failed to get container status \"84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b\": rpc error: code = NotFound desc = could not find container \"84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b\": container with ID starting with 84b0c4147668c1e9a2c6ff4ea822004a40ac4c20aa2798cf1f7a1eb9c7a0305b not 
found: ID does not exist" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.678355 4947 scope.go:117] "RemoveContainer" containerID="3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a" Dec 03 09:12:30 crc kubenswrapper[4947]: E1203 09:12:30.678553 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a\": container with ID starting with 3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a not found: ID does not exist" containerID="3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a" Dec 03 09:12:30 crc kubenswrapper[4947]: I1203 09:12:30.678568 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a"} err="failed to get container status \"3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a\": rpc error: code = NotFound desc = could not find container \"3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a\": container with ID starting with 3e26083de2c3a3c61840761f64b98361b685f4ae996988a25516663f77c4048a not found: ID does not exist" Dec 03 09:12:31 crc kubenswrapper[4947]: I1203 09:12:31.097177 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" path="/var/lib/kubelet/pods/9078258b-c4e7-4e4a-a76f-3fd2a572d6e1/volumes" Dec 03 09:12:31 crc kubenswrapper[4947]: I1203 09:12:31.588796 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerStarted","Data":"828ea520a9331be1550bd3efd243658327f2849687947859a0df48686729290c"} Dec 03 09:12:35 crc kubenswrapper[4947]: I1203 09:12:35.084397 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 
09:12:35 crc kubenswrapper[4947]: E1203 09:12:35.086452 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:12:36 crc kubenswrapper[4947]: I1203 09:12:36.642421 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerStarted","Data":"40d9470f975c765b3c3735dcb77a77cf6b9025f0aad3f21677b4b5f0dfb81341"} Dec 03 09:12:40 crc kubenswrapper[4947]: I1203 09:12:40.693366 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerStarted","Data":"c16d99e6d25785837aa97b19b87ae31e596fbf5fe1eeeb3614edd64af8a321c5"} Dec 03 09:12:41 crc kubenswrapper[4947]: I1203 09:12:41.706118 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerStarted","Data":"3008deffa8db8427b59144cc594c55e86d79ec443e0e2629caf9f489a4b5e074"} Dec 03 09:12:43 crc kubenswrapper[4947]: I1203 09:12:43.728043 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerStarted","Data":"8d9e5c15d5f53b93811c5409f383ef72b1a8ef9fddaccd6542f50ee47494a767"} Dec 03 09:12:43 crc kubenswrapper[4947]: I1203 09:12:43.729351 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 09:12:43 crc kubenswrapper[4947]: I1203 09:12:43.756906 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.307669146 podStartE2EDuration="14.756818986s" podCreationTimestamp="2025-12-03 09:12:29 +0000 UTC" firstStartedPulling="2025-12-03 09:12:30.677251354 +0000 UTC m=+8611.938205780" lastFinishedPulling="2025-12-03 09:12:43.126401194 +0000 UTC m=+8624.387355620" observedRunningTime="2025-12-03 09:12:43.745434538 +0000 UTC m=+8625.006389004" watchObservedRunningTime="2025-12-03 09:12:43.756818986 +0000 UTC m=+8625.017773412" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.082809 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:12:50 crc kubenswrapper[4947]: E1203 09:12:50.083300 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.799866 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-smxsk"] Dec 03 09:12:50 crc kubenswrapper[4947]: E1203 09:12:50.800555 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerName="registry-server" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.800580 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerName="registry-server" Dec 03 09:12:50 crc kubenswrapper[4947]: E1203 09:12:50.800603 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerName="extract-utilities" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.800613 4947 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerName="extract-utilities" Dec 03 09:12:50 crc kubenswrapper[4947]: E1203 09:12:50.800655 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerName="extract-content" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.800664 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerName="extract-content" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.800949 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9078258b-c4e7-4e4a-a76f-3fd2a572d6e1" containerName="registry-server" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.801912 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.832091 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-475b-account-create-update-rkmdb"] Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.833445 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.841667 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-smxsk"] Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.849707 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.852552 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-475b-account-create-update-rkmdb"] Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.925015 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdtx8\" (UniqueName: \"kubernetes.io/projected/1fc249da-1377-4963-9556-38b7a1effa79-kube-api-access-mdtx8\") pod \"aodh-db-create-smxsk\" (UID: \"1fc249da-1377-4963-9556-38b7a1effa79\") " pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.925069 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvq46\" (UniqueName: \"kubernetes.io/projected/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-kube-api-access-gvq46\") pod \"aodh-475b-account-create-update-rkmdb\" (UID: \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\") " pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.925145 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-operator-scripts\") pod \"aodh-475b-account-create-update-rkmdb\" (UID: \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\") " pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:50 crc kubenswrapper[4947]: I1203 09:12:50.925579 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc249da-1377-4963-9556-38b7a1effa79-operator-scripts\") pod \"aodh-db-create-smxsk\" (UID: \"1fc249da-1377-4963-9556-38b7a1effa79\") " pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.027159 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc249da-1377-4963-9556-38b7a1effa79-operator-scripts\") pod \"aodh-db-create-smxsk\" (UID: \"1fc249da-1377-4963-9556-38b7a1effa79\") " pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.027469 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtx8\" (UniqueName: \"kubernetes.io/projected/1fc249da-1377-4963-9556-38b7a1effa79-kube-api-access-mdtx8\") pod \"aodh-db-create-smxsk\" (UID: \"1fc249da-1377-4963-9556-38b7a1effa79\") " pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.027612 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvq46\" (UniqueName: \"kubernetes.io/projected/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-kube-api-access-gvq46\") pod \"aodh-475b-account-create-update-rkmdb\" (UID: \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\") " pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.027773 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-operator-scripts\") pod \"aodh-475b-account-create-update-rkmdb\" (UID: \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\") " pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.028110 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/1fc249da-1377-4963-9556-38b7a1effa79-operator-scripts\") pod \"aodh-db-create-smxsk\" (UID: \"1fc249da-1377-4963-9556-38b7a1effa79\") " pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.028653 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-operator-scripts\") pod \"aodh-475b-account-create-update-rkmdb\" (UID: \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\") " pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.047978 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvq46\" (UniqueName: \"kubernetes.io/projected/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-kube-api-access-gvq46\") pod \"aodh-475b-account-create-update-rkmdb\" (UID: \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\") " pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.050166 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdtx8\" (UniqueName: \"kubernetes.io/projected/1fc249da-1377-4963-9556-38b7a1effa79-kube-api-access-mdtx8\") pod \"aodh-db-create-smxsk\" (UID: \"1fc249da-1377-4963-9556-38b7a1effa79\") " pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.158738 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.170222 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.754801 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-475b-account-create-update-rkmdb"] Dec 03 09:12:51 crc kubenswrapper[4947]: W1203 09:12:51.767693 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc249da_1377_4963_9556_38b7a1effa79.slice/crio-c60ee9e4a7f030b18fe0c1af52da3c393dfc9f48b3194de8cd7f7493d2655b88 WatchSource:0}: Error finding container c60ee9e4a7f030b18fe0c1af52da3c393dfc9f48b3194de8cd7f7493d2655b88: Status 404 returned error can't find the container with id c60ee9e4a7f030b18fe0c1af52da3c393dfc9f48b3194de8cd7f7493d2655b88 Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.770047 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-smxsk"] Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.812353 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-smxsk" event={"ID":"1fc249da-1377-4963-9556-38b7a1effa79","Type":"ContainerStarted","Data":"c60ee9e4a7f030b18fe0c1af52da3c393dfc9f48b3194de8cd7f7493d2655b88"} Dec 03 09:12:51 crc kubenswrapper[4947]: I1203 09:12:51.813655 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-475b-account-create-update-rkmdb" event={"ID":"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c","Type":"ContainerStarted","Data":"39f0b4822cc15615fcb7ace5afa7bc488ad725cfd3379d4a17491648360ccb33"} Dec 03 09:12:52 crc kubenswrapper[4947]: I1203 09:12:52.830468 4947 generic.go:334] "Generic (PLEG): container finished" podID="1fc249da-1377-4963-9556-38b7a1effa79" containerID="8bb5b8db77cce67c639ee2ba505a16125c593cb60b66a7877e3404155ea3381b" exitCode=0 Dec 03 09:12:52 crc kubenswrapper[4947]: I1203 09:12:52.830539 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-smxsk" 
event={"ID":"1fc249da-1377-4963-9556-38b7a1effa79","Type":"ContainerDied","Data":"8bb5b8db77cce67c639ee2ba505a16125c593cb60b66a7877e3404155ea3381b"} Dec 03 09:12:52 crc kubenswrapper[4947]: I1203 09:12:52.836666 4947 generic.go:334] "Generic (PLEG): container finished" podID="3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c" containerID="9eabfb70ace61efc0edbf49be73a86ddcaae44937fc9895d7dc5a4a85521dec7" exitCode=0 Dec 03 09:12:52 crc kubenswrapper[4947]: I1203 09:12:52.836712 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-475b-account-create-update-rkmdb" event={"ID":"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c","Type":"ContainerDied","Data":"9eabfb70ace61efc0edbf49be73a86ddcaae44937fc9895d7dc5a4a85521dec7"} Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.396369 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.404518 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.507917 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-operator-scripts\") pod \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\" (UID: \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\") " Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.507976 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvq46\" (UniqueName: \"kubernetes.io/projected/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-kube-api-access-gvq46\") pod \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\" (UID: \"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c\") " Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.508123 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdtx8\" (UniqueName: \"kubernetes.io/projected/1fc249da-1377-4963-9556-38b7a1effa79-kube-api-access-mdtx8\") pod \"1fc249da-1377-4963-9556-38b7a1effa79\" (UID: \"1fc249da-1377-4963-9556-38b7a1effa79\") " Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.508284 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc249da-1377-4963-9556-38b7a1effa79-operator-scripts\") pod \"1fc249da-1377-4963-9556-38b7a1effa79\" (UID: \"1fc249da-1377-4963-9556-38b7a1effa79\") " Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.508635 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c" (UID: "3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.508831 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc249da-1377-4963-9556-38b7a1effa79-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1fc249da-1377-4963-9556-38b7a1effa79" (UID: "1fc249da-1377-4963-9556-38b7a1effa79"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.509335 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc249da-1377-4963-9556-38b7a1effa79-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.509361 4947 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.513924 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc249da-1377-4963-9556-38b7a1effa79-kube-api-access-mdtx8" (OuterVolumeSpecName: "kube-api-access-mdtx8") pod "1fc249da-1377-4963-9556-38b7a1effa79" (UID: "1fc249da-1377-4963-9556-38b7a1effa79"). InnerVolumeSpecName "kube-api-access-mdtx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.515278 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-kube-api-access-gvq46" (OuterVolumeSpecName: "kube-api-access-gvq46") pod "3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c" (UID: "3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c"). InnerVolumeSpecName "kube-api-access-gvq46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.611138 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvq46\" (UniqueName: \"kubernetes.io/projected/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c-kube-api-access-gvq46\") on node \"crc\" DevicePath \"\"" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.611184 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdtx8\" (UniqueName: \"kubernetes.io/projected/1fc249da-1377-4963-9556-38b7a1effa79-kube-api-access-mdtx8\") on node \"crc\" DevicePath \"\"" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.893634 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-smxsk" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.893624 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-smxsk" event={"ID":"1fc249da-1377-4963-9556-38b7a1effa79","Type":"ContainerDied","Data":"c60ee9e4a7f030b18fe0c1af52da3c393dfc9f48b3194de8cd7f7493d2655b88"} Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.893786 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c60ee9e4a7f030b18fe0c1af52da3c393dfc9f48b3194de8cd7f7493d2655b88" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.895721 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-475b-account-create-update-rkmdb" event={"ID":"3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c","Type":"ContainerDied","Data":"39f0b4822cc15615fcb7ace5afa7bc488ad725cfd3379d4a17491648360ccb33"} Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.895749 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39f0b4822cc15615fcb7ace5afa7bc488ad725cfd3379d4a17491648360ccb33" Dec 03 09:12:54 crc kubenswrapper[4947]: I1203 09:12:54.895791 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-475b-account-create-update-rkmdb" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.301792 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-4pqqh"] Dec 03 09:12:56 crc kubenswrapper[4947]: E1203 09:12:56.302849 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc249da-1377-4963-9556-38b7a1effa79" containerName="mariadb-database-create" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.302865 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc249da-1377-4963-9556-38b7a1effa79" containerName="mariadb-database-create" Dec 03 09:12:56 crc kubenswrapper[4947]: E1203 09:12:56.302883 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c" containerName="mariadb-account-create-update" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.302889 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c" containerName="mariadb-account-create-update" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.303123 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc249da-1377-4963-9556-38b7a1effa79" containerName="mariadb-database-create" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.303148 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c" containerName="mariadb-account-create-update" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.303871 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.313145 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.313394 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.313483 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.318468 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mpgqb" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.322100 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-4pqqh"] Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.357961 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-combined-ca-bundle\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.358011 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-config-data\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.358037 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-scripts\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " 
pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.358093 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ckqg\" (UniqueName: \"kubernetes.io/projected/30d3388b-b492-404e-bfbd-d9c84de28766-kube-api-access-4ckqg\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.459785 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-combined-ca-bundle\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.459843 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-config-data\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.459984 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-scripts\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.460087 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ckqg\" (UniqueName: \"kubernetes.io/projected/30d3388b-b492-404e-bfbd-d9c84de28766-kube-api-access-4ckqg\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.466838 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-combined-ca-bundle\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.467265 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-scripts\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.467478 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-config-data\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.475986 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ckqg\" (UniqueName: \"kubernetes.io/projected/30d3388b-b492-404e-bfbd-d9c84de28766-kube-api-access-4ckqg\") pod \"aodh-db-sync-4pqqh\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:56 crc kubenswrapper[4947]: I1203 09:12:56.622694 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:12:57 crc kubenswrapper[4947]: I1203 09:12:57.038471 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xcxvl"] Dec 03 09:12:57 crc kubenswrapper[4947]: I1203 09:12:57.049743 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xcxvl"] Dec 03 09:12:57 crc kubenswrapper[4947]: I1203 09:12:57.101428 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a9f981c-80a7-4c92-afe4-5edabea09911" path="/var/lib/kubelet/pods/4a9f981c-80a7-4c92-afe4-5edabea09911/volumes" Dec 03 09:12:57 crc kubenswrapper[4947]: W1203 09:12:57.112804 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30d3388b_b492_404e_bfbd_d9c84de28766.slice/crio-cfc346e7ac2257bc0617504525ac3698fa07e61c6b178bcfcb62fa8e3105f399 WatchSource:0}: Error finding container cfc346e7ac2257bc0617504525ac3698fa07e61c6b178bcfcb62fa8e3105f399: Status 404 returned error can't find the container with id cfc346e7ac2257bc0617504525ac3698fa07e61c6b178bcfcb62fa8e3105f399 Dec 03 09:12:57 crc kubenswrapper[4947]: I1203 09:12:57.114783 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-4pqqh"] Dec 03 09:12:57 crc kubenswrapper[4947]: I1203 09:12:57.929598 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4pqqh" event={"ID":"30d3388b-b492-404e-bfbd-d9c84de28766","Type":"ContainerStarted","Data":"cfc346e7ac2257bc0617504525ac3698fa07e61c6b178bcfcb62fa8e3105f399"} Dec 03 09:13:00 crc kubenswrapper[4947]: I1203 09:13:00.122410 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 09:13:01 crc kubenswrapper[4947]: I1203 09:13:01.399836 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 09:13:01 crc kubenswrapper[4947]: I1203 
09:13:01.968824 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4pqqh" event={"ID":"30d3388b-b492-404e-bfbd-d9c84de28766","Type":"ContainerStarted","Data":"3ea4ae7a65fdba513c2c2b8e9656c853967c8f2d9cbeb0fe4168c7e93565ea01"} Dec 03 09:13:01 crc kubenswrapper[4947]: I1203 09:13:01.990982 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-4pqqh" podStartSLOduration=1.717231763 podStartE2EDuration="5.99096438s" podCreationTimestamp="2025-12-03 09:12:56 +0000 UTC" firstStartedPulling="2025-12-03 09:12:57.123512904 +0000 UTC m=+8638.384467330" lastFinishedPulling="2025-12-03 09:13:01.397245521 +0000 UTC m=+8642.658199947" observedRunningTime="2025-12-03 09:13:01.984172358 +0000 UTC m=+8643.245126824" watchObservedRunningTime="2025-12-03 09:13:01.99096438 +0000 UTC m=+8643.251918806" Dec 03 09:13:04 crc kubenswrapper[4947]: I1203 09:13:04.000365 4947 generic.go:334] "Generic (PLEG): container finished" podID="30d3388b-b492-404e-bfbd-d9c84de28766" containerID="3ea4ae7a65fdba513c2c2b8e9656c853967c8f2d9cbeb0fe4168c7e93565ea01" exitCode=0 Dec 03 09:13:04 crc kubenswrapper[4947]: I1203 09:13:04.000420 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4pqqh" event={"ID":"30d3388b-b492-404e-bfbd-d9c84de28766","Type":"ContainerDied","Data":"3ea4ae7a65fdba513c2c2b8e9656c853967c8f2d9cbeb0fe4168c7e93565ea01"} Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.083607 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:13:05 crc kubenswrapper[4947]: E1203 09:13:05.084546 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.437040 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.565814 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-scripts\") pod \"30d3388b-b492-404e-bfbd-d9c84de28766\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.566079 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-config-data\") pod \"30d3388b-b492-404e-bfbd-d9c84de28766\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.566365 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-combined-ca-bundle\") pod \"30d3388b-b492-404e-bfbd-d9c84de28766\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.566557 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ckqg\" (UniqueName: \"kubernetes.io/projected/30d3388b-b492-404e-bfbd-d9c84de28766-kube-api-access-4ckqg\") pod \"30d3388b-b492-404e-bfbd-d9c84de28766\" (UID: \"30d3388b-b492-404e-bfbd-d9c84de28766\") " Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.572122 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-scripts" (OuterVolumeSpecName: "scripts") pod "30d3388b-b492-404e-bfbd-d9c84de28766" (UID: 
"30d3388b-b492-404e-bfbd-d9c84de28766"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.574175 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d3388b-b492-404e-bfbd-d9c84de28766-kube-api-access-4ckqg" (OuterVolumeSpecName: "kube-api-access-4ckqg") pod "30d3388b-b492-404e-bfbd-d9c84de28766" (UID: "30d3388b-b492-404e-bfbd-d9c84de28766"). InnerVolumeSpecName "kube-api-access-4ckqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.596714 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-config-data" (OuterVolumeSpecName: "config-data") pod "30d3388b-b492-404e-bfbd-d9c84de28766" (UID: "30d3388b-b492-404e-bfbd-d9c84de28766"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.600391 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30d3388b-b492-404e-bfbd-d9c84de28766" (UID: "30d3388b-b492-404e-bfbd-d9c84de28766"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.669150 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.669190 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.669207 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d3388b-b492-404e-bfbd-d9c84de28766-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:05 crc kubenswrapper[4947]: I1203 09:13:05.669222 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ckqg\" (UniqueName: \"kubernetes.io/projected/30d3388b-b492-404e-bfbd-d9c84de28766-kube-api-access-4ckqg\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:06 crc kubenswrapper[4947]: I1203 09:13:06.033068 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4pqqh" event={"ID":"30d3388b-b492-404e-bfbd-d9c84de28766","Type":"ContainerDied","Data":"cfc346e7ac2257bc0617504525ac3698fa07e61c6b178bcfcb62fa8e3105f399"} Dec 03 09:13:06 crc kubenswrapper[4947]: I1203 09:13:06.033442 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfc346e7ac2257bc0617504525ac3698fa07e61c6b178bcfcb62fa8e3105f399" Dec 03 09:13:06 crc kubenswrapper[4947]: I1203 09:13:06.033183 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-4pqqh" Dec 03 09:13:10 crc kubenswrapper[4947]: I1203 09:13:10.852438 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 09:13:10 crc kubenswrapper[4947]: E1203 09:13:10.853615 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d3388b-b492-404e-bfbd-d9c84de28766" containerName="aodh-db-sync" Dec 03 09:13:10 crc kubenswrapper[4947]: I1203 09:13:10.853635 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d3388b-b492-404e-bfbd-d9c84de28766" containerName="aodh-db-sync" Dec 03 09:13:10 crc kubenswrapper[4947]: I1203 09:13:10.853970 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d3388b-b492-404e-bfbd-d9c84de28766" containerName="aodh-db-sync" Dec 03 09:13:10 crc kubenswrapper[4947]: I1203 09:13:10.874559 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 09:13:10 crc kubenswrapper[4947]: I1203 09:13:10.879067 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 09:13:10 crc kubenswrapper[4947]: I1203 09:13:10.879402 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-mpgqb" Dec 03 09:13:10 crc kubenswrapper[4947]: I1203 09:13:10.882702 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 09:13:10 crc kubenswrapper[4947]: I1203 09:13:10.895593 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.007096 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fg2x\" (UniqueName: \"kubernetes.io/projected/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-kube-api-access-9fg2x\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc 
kubenswrapper[4947]: I1203 09:13:11.007188 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.007278 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-scripts\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.007300 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-config-data\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.109038 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fg2x\" (UniqueName: \"kubernetes.io/projected/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-kube-api-access-9fg2x\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.109145 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.109288 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-scripts\") 
pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.109313 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-config-data\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.115271 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-scripts\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.117124 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-config-data\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.128577 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-combined-ca-bundle\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.137874 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fg2x\" (UniqueName: \"kubernetes.io/projected/5f2e2ca8-0e8a-40eb-92f6-838120ef08c0-kube-api-access-9fg2x\") pod \"aodh-0\" (UID: \"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0\") " pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.223336 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 09:13:11 crc kubenswrapper[4947]: I1203 09:13:11.811039 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 09:13:12 crc kubenswrapper[4947]: I1203 09:13:12.113628 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0","Type":"ContainerStarted","Data":"9811f68acda7ee4533a957545fe2b48eb6db26ba52eefd15a1ca1883eae24a43"} Dec 03 09:13:13 crc kubenswrapper[4947]: I1203 09:13:13.133129 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0","Type":"ContainerStarted","Data":"0631bf8a4064f306e0dc7473d47479c7b5045a58dcb00fff819bec63f162565d"} Dec 03 09:13:13 crc kubenswrapper[4947]: I1203 09:13:13.183447 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 09:13:13 crc kubenswrapper[4947]: I1203 09:13:13.183797 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="ceilometer-central-agent" containerID="cri-o://40d9470f975c765b3c3735dcb77a77cf6b9025f0aad3f21677b4b5f0dfb81341" gracePeriod=30 Dec 03 09:13:13 crc kubenswrapper[4947]: I1203 09:13:13.183878 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="sg-core" containerID="cri-o://3008deffa8db8427b59144cc594c55e86d79ec443e0e2629caf9f489a4b5e074" gracePeriod=30 Dec 03 09:13:13 crc kubenswrapper[4947]: I1203 09:13:13.183913 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="proxy-httpd" containerID="cri-o://8d9e5c15d5f53b93811c5409f383ef72b1a8ef9fddaccd6542f50ee47494a767" gracePeriod=30 Dec 03 09:13:13 crc 
kubenswrapper[4947]: I1203 09:13:13.183913 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="ceilometer-notification-agent" containerID="cri-o://c16d99e6d25785837aa97b19b87ae31e596fbf5fe1eeeb3614edd64af8a321c5" gracePeriod=30 Dec 03 09:13:14 crc kubenswrapper[4947]: I1203 09:13:14.149260 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0","Type":"ContainerStarted","Data":"b305a667d9cd9efec03e0780e937b3c2be7d4dfc75e3921ddafee8103e48a5a9"} Dec 03 09:13:14 crc kubenswrapper[4947]: I1203 09:13:14.154063 4947 generic.go:334] "Generic (PLEG): container finished" podID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerID="8d9e5c15d5f53b93811c5409f383ef72b1a8ef9fddaccd6542f50ee47494a767" exitCode=0 Dec 03 09:13:14 crc kubenswrapper[4947]: I1203 09:13:14.154093 4947 generic.go:334] "Generic (PLEG): container finished" podID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerID="3008deffa8db8427b59144cc594c55e86d79ec443e0e2629caf9f489a4b5e074" exitCode=2 Dec 03 09:13:14 crc kubenswrapper[4947]: I1203 09:13:14.154100 4947 generic.go:334] "Generic (PLEG): container finished" podID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerID="40d9470f975c765b3c3735dcb77a77cf6b9025f0aad3f21677b4b5f0dfb81341" exitCode=0 Dec 03 09:13:14 crc kubenswrapper[4947]: I1203 09:13:14.154120 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerDied","Data":"8d9e5c15d5f53b93811c5409f383ef72b1a8ef9fddaccd6542f50ee47494a767"} Dec 03 09:13:14 crc kubenswrapper[4947]: I1203 09:13:14.154146 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerDied","Data":"3008deffa8db8427b59144cc594c55e86d79ec443e0e2629caf9f489a4b5e074"} Dec 
03 09:13:14 crc kubenswrapper[4947]: I1203 09:13:14.154155 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerDied","Data":"40d9470f975c765b3c3735dcb77a77cf6b9025f0aad3f21677b4b5f0dfb81341"} Dec 03 09:13:16 crc kubenswrapper[4947]: I1203 09:13:16.082793 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:13:16 crc kubenswrapper[4947]: E1203 09:13:16.083387 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:13:16 crc kubenswrapper[4947]: I1203 09:13:16.182768 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0","Type":"ContainerStarted","Data":"dd4beaad144a5054d2bb7815b4036f409d37ca49c2176adb00fca34c171dee3b"} Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.194409 4947 generic.go:334] "Generic (PLEG): container finished" podID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerID="c16d99e6d25785837aa97b19b87ae31e596fbf5fe1eeeb3614edd64af8a321c5" exitCode=0 Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.194446 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerDied","Data":"c16d99e6d25785837aa97b19b87ae31e596fbf5fe1eeeb3614edd64af8a321c5"} Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.199148 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"5f2e2ca8-0e8a-40eb-92f6-838120ef08c0","Type":"ContainerStarted","Data":"31c1842cef7a4d968d5efc4b9574ab1f0aaf967eb44c328c509d64f324c723f7"} Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.234859 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.806029942 podStartE2EDuration="7.234835744s" podCreationTimestamp="2025-12-03 09:13:10 +0000 UTC" firstStartedPulling="2025-12-03 09:13:11.801033738 +0000 UTC m=+8653.061988164" lastFinishedPulling="2025-12-03 09:13:16.22983953 +0000 UTC m=+8657.490793966" observedRunningTime="2025-12-03 09:13:17.222740987 +0000 UTC m=+8658.483695413" watchObservedRunningTime="2025-12-03 09:13:17.234835744 +0000 UTC m=+8658.495790180" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.356158 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.451065 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-run-httpd\") pod \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.451166 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l6rq\" (UniqueName: \"kubernetes.io/projected/c9cd9435-a4d8-4de1-ab22-f008af6736b5-kube-api-access-9l6rq\") pod \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.451201 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-scripts\") pod \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " Dec 03 09:13:17 crc 
kubenswrapper[4947]: I1203 09:13:17.451325 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-sg-core-conf-yaml\") pod \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.451362 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-config-data\") pod \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.451425 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-combined-ca-bundle\") pod \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.451525 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-log-httpd\") pod \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\" (UID: \"c9cd9435-a4d8-4de1-ab22-f008af6736b5\") " Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.451547 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c9cd9435-a4d8-4de1-ab22-f008af6736b5" (UID: "c9cd9435-a4d8-4de1-ab22-f008af6736b5"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.452424 4947 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.453146 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c9cd9435-a4d8-4de1-ab22-f008af6736b5" (UID: "c9cd9435-a4d8-4de1-ab22-f008af6736b5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.456951 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9cd9435-a4d8-4de1-ab22-f008af6736b5-kube-api-access-9l6rq" (OuterVolumeSpecName: "kube-api-access-9l6rq") pod "c9cd9435-a4d8-4de1-ab22-f008af6736b5" (UID: "c9cd9435-a4d8-4de1-ab22-f008af6736b5"). InnerVolumeSpecName "kube-api-access-9l6rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.461779 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-scripts" (OuterVolumeSpecName: "scripts") pod "c9cd9435-a4d8-4de1-ab22-f008af6736b5" (UID: "c9cd9435-a4d8-4de1-ab22-f008af6736b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.488476 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c9cd9435-a4d8-4de1-ab22-f008af6736b5" (UID: "c9cd9435-a4d8-4de1-ab22-f008af6736b5"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.554592 4947 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.554630 4947 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c9cd9435-a4d8-4de1-ab22-f008af6736b5-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.554643 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l6rq\" (UniqueName: \"kubernetes.io/projected/c9cd9435-a4d8-4de1-ab22-f008af6736b5-kube-api-access-9l6rq\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.554656 4947 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.559731 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9cd9435-a4d8-4de1-ab22-f008af6736b5" (UID: "c9cd9435-a4d8-4de1-ab22-f008af6736b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.561408 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-config-data" (OuterVolumeSpecName: "config-data") pod "c9cd9435-a4d8-4de1-ab22-f008af6736b5" (UID: "c9cd9435-a4d8-4de1-ab22-f008af6736b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.657216 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:17 crc kubenswrapper[4947]: I1203 09:13:17.657429 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9cd9435-a4d8-4de1-ab22-f008af6736b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.220919 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.220988 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c9cd9435-a4d8-4de1-ab22-f008af6736b5","Type":"ContainerDied","Data":"828ea520a9331be1550bd3efd243658327f2849687947859a0df48686729290c"} Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.221036 4947 scope.go:117] "RemoveContainer" containerID="8d9e5c15d5f53b93811c5409f383ef72b1a8ef9fddaccd6542f50ee47494a767" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.252885 4947 scope.go:117] "RemoveContainer" containerID="3008deffa8db8427b59144cc594c55e86d79ec443e0e2629caf9f489a4b5e074" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.259897 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.270133 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.275832 4947 scope.go:117] "RemoveContainer" containerID="c16d99e6d25785837aa97b19b87ae31e596fbf5fe1eeeb3614edd64af8a321c5" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.318883 4947 scope.go:117] "RemoveContainer" 
containerID="40d9470f975c765b3c3735dcb77a77cf6b9025f0aad3f21677b4b5f0dfb81341" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.325291 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 09:13:18 crc kubenswrapper[4947]: E1203 09:13:18.325897 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="ceilometer-central-agent" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.325922 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="ceilometer-central-agent" Dec 03 09:13:18 crc kubenswrapper[4947]: E1203 09:13:18.325948 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="sg-core" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.325956 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="sg-core" Dec 03 09:13:18 crc kubenswrapper[4947]: E1203 09:13:18.325988 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="ceilometer-notification-agent" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.325998 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="ceilometer-notification-agent" Dec 03 09:13:18 crc kubenswrapper[4947]: E1203 09:13:18.326025 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="proxy-httpd" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.326034 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="proxy-httpd" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.326354 4947 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="proxy-httpd" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.326377 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="sg-core" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.326390 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="ceilometer-notification-agent" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.326408 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" containerName="ceilometer-central-agent" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.328561 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.331210 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.331604 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.346041 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.483094 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g8np\" (UniqueName: \"kubernetes.io/projected/87e9571e-9b03-430a-83a3-2bc809f12a29-kube-api-access-6g8np\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.483470 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-scripts\") pod 
\"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.483599 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-config-data\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.484013 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87e9571e-9b03-430a-83a3-2bc809f12a29-log-httpd\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.484113 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.484200 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87e9571e-9b03-430a-83a3-2bc809f12a29-run-httpd\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.484330 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: 
I1203 09:13:18.586384 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g8np\" (UniqueName: \"kubernetes.io/projected/87e9571e-9b03-430a-83a3-2bc809f12a29-kube-api-access-6g8np\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.586572 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-scripts\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.586636 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-config-data\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.586698 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87e9571e-9b03-430a-83a3-2bc809f12a29-log-httpd\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.586769 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.586799 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87e9571e-9b03-430a-83a3-2bc809f12a29-run-httpd\") pod \"ceilometer-0\" (UID: 
\"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.586884 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.587698 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87e9571e-9b03-430a-83a3-2bc809f12a29-run-httpd\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.587835 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/87e9571e-9b03-430a-83a3-2bc809f12a29-log-httpd\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.592013 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-scripts\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.592243 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.592247 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-config-data\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.593798 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87e9571e-9b03-430a-83a3-2bc809f12a29-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.608960 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g8np\" (UniqueName: \"kubernetes.io/projected/87e9571e-9b03-430a-83a3-2bc809f12a29-kube-api-access-6g8np\") pod \"ceilometer-0\" (UID: \"87e9571e-9b03-430a-83a3-2bc809f12a29\") " pod="openstack/ceilometer-0" Dec 03 09:13:18 crc kubenswrapper[4947]: I1203 09:13:18.651595 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 09:13:19 crc kubenswrapper[4947]: I1203 09:13:19.106924 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9cd9435-a4d8-4de1-ab22-f008af6736b5" path="/var/lib/kubelet/pods/c9cd9435-a4d8-4de1-ab22-f008af6736b5/volumes" Dec 03 09:13:19 crc kubenswrapper[4947]: I1203 09:13:19.188036 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 09:13:19 crc kubenswrapper[4947]: W1203 09:13:19.198272 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87e9571e_9b03_430a_83a3_2bc809f12a29.slice/crio-326c61cb4e97e33cd2002dd37cf08ded1d2958b2f989020e82ca4129663cc6f8 WatchSource:0}: Error finding container 326c61cb4e97e33cd2002dd37cf08ded1d2958b2f989020e82ca4129663cc6f8: Status 404 returned error can't find the container with id 326c61cb4e97e33cd2002dd37cf08ded1d2958b2f989020e82ca4129663cc6f8 Dec 03 09:13:19 crc kubenswrapper[4947]: I1203 09:13:19.235412 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87e9571e-9b03-430a-83a3-2bc809f12a29","Type":"ContainerStarted","Data":"326c61cb4e97e33cd2002dd37cf08ded1d2958b2f989020e82ca4129663cc6f8"} Dec 03 09:13:20 crc kubenswrapper[4947]: I1203 09:13:20.252051 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87e9571e-9b03-430a-83a3-2bc809f12a29","Type":"ContainerStarted","Data":"5cc56595defadbf56de02fedae7278da01a6a65c2d0de80cc34fef54365ce043"} Dec 03 09:13:20 crc kubenswrapper[4947]: I1203 09:13:20.252408 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87e9571e-9b03-430a-83a3-2bc809f12a29","Type":"ContainerStarted","Data":"a37326b036e34401b4fa607e3cfd8041cbfa62f1d0b879865016cd7ef48a04d0"} Dec 03 09:13:21 crc kubenswrapper[4947]: I1203 09:13:21.277670 4947 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"87e9571e-9b03-430a-83a3-2bc809f12a29","Type":"ContainerStarted","Data":"647265dfdc7e45f8a0e0ec405e0952d8a122cdf4c3c5279631c2b0b1ef7a228e"} Dec 03 09:13:22 crc kubenswrapper[4947]: I1203 09:13:22.290274 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"87e9571e-9b03-430a-83a3-2bc809f12a29","Type":"ContainerStarted","Data":"cde4b06eed279627d41804ab3ee621eb9a2929227fde37b363fd8766ca1ff495"} Dec 03 09:13:22 crc kubenswrapper[4947]: I1203 09:13:22.291039 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 09:13:22 crc kubenswrapper[4947]: I1203 09:13:22.319031 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.623851183 podStartE2EDuration="4.319009223s" podCreationTimestamp="2025-12-03 09:13:18 +0000 UTC" firstStartedPulling="2025-12-03 09:13:19.200827741 +0000 UTC m=+8660.461782167" lastFinishedPulling="2025-12-03 09:13:21.895985781 +0000 UTC m=+8663.156940207" observedRunningTime="2025-12-03 09:13:22.312102516 +0000 UTC m=+8663.573056952" watchObservedRunningTime="2025-12-03 09:13:22.319009223 +0000 UTC m=+8663.579963649" Dec 03 09:13:26 crc kubenswrapper[4947]: I1203 09:13:26.816356 4947 scope.go:117] "RemoveContainer" containerID="16a76d14c570ea94fd6772ca2cd0bf8e966411d98e6f4627e6a1d8990c5f5ac4" Dec 03 09:13:26 crc kubenswrapper[4947]: I1203 09:13:26.843826 4947 scope.go:117] "RemoveContainer" containerID="4bbe0bfeb543de59ff49e77e9842eb2c08f80707a29d1356d8a2a3754cc883e2" Dec 03 09:13:26 crc kubenswrapper[4947]: I1203 09:13:26.866386 4947 scope.go:117] "RemoveContainer" containerID="c2869799ad7199eb2bcba21cc6484d0e297f037df9fb8ef5096be7a6cea44915" Dec 03 09:13:27 crc kubenswrapper[4947]: I1203 09:13:27.084088 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:13:27 crc 
kubenswrapper[4947]: E1203 09:13:27.084611 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:13:42 crc kubenswrapper[4947]: I1203 09:13:42.084166 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:13:42 crc kubenswrapper[4947]: E1203 09:13:42.085236 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:13:48 crc kubenswrapper[4947]: I1203 09:13:48.657172 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.496106 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-869d64bff9-hz8s9"] Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.499092 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.501246 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.516808 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-nb\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.516921 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb8v8\" (UniqueName: \"kubernetes.io/projected/1949b00a-6ec5-4c3e-94df-fd057e94da2c-kube-api-access-qb8v8\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.516958 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-config\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.516979 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-openstack-cell1\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.517010 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-869d64bff9-hz8s9"] Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.517018 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-dns-svc\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.517117 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-sb\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.602104 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869d64bff9-hz8s9"] Dec 03 09:13:55 crc kubenswrapper[4947]: E1203 09:13:55.603278 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-qb8v8 openstack-cell1 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" podUID="1949b00a-6ec5-4c3e-94df-fd057e94da2c" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.618726 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb8v8\" (UniqueName: \"kubernetes.io/projected/1949b00a-6ec5-4c3e-94df-fd057e94da2c-kube-api-access-qb8v8\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.618801 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-config\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.618833 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-openstack-cell1\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.618894 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-dns-svc\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.618935 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-sb\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.619109 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-nb\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.620019 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-openstack-cell1\") 
pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.620172 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-config\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.620278 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-dns-svc\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.620369 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-nb\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.621197 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-sb\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.630294 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84dcc59587-8hcjb"] Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.632557 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.642976 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell2" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.646086 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84dcc59587-8hcjb"] Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.670248 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb8v8\" (UniqueName: \"kubernetes.io/projected/1949b00a-6ec5-4c3e-94df-fd057e94da2c-kube-api-access-qb8v8\") pod \"dnsmasq-dns-869d64bff9-hz8s9\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.681394 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.715373 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.721432 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-sb\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.721534 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell2\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.721606 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-config\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.721641 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kpbl\" (UniqueName: \"kubernetes.io/projected/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-kube-api-access-6kpbl\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.721710 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-dns-svc\") pod 
\"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.721730 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell1\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.721839 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-nb\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.823469 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-nb\") pod \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.823544 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-openstack-cell1\") pod \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.823567 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-sb\") pod \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\" (UID: 
\"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.823728 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb8v8\" (UniqueName: \"kubernetes.io/projected/1949b00a-6ec5-4c3e-94df-fd057e94da2c-kube-api-access-qb8v8\") pod \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.823843 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-dns-svc\") pod \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.823899 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1949b00a-6ec5-4c3e-94df-fd057e94da2c" (UID: "1949b00a-6ec5-4c3e-94df-fd057e94da2c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.823918 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-config\") pod \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\" (UID: \"1949b00a-6ec5-4c3e-94df-fd057e94da2c\") " Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824293 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-config" (OuterVolumeSpecName: "config") pod "1949b00a-6ec5-4c3e-94df-fd057e94da2c" (UID: "1949b00a-6ec5-4c3e-94df-fd057e94da2c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824339 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-sb\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824447 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell2\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824475 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "1949b00a-6ec5-4c3e-94df-fd057e94da2c" (UID: "1949b00a-6ec5-4c3e-94df-fd057e94da2c"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824488 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1949b00a-6ec5-4c3e-94df-fd057e94da2c" (UID: "1949b00a-6ec5-4c3e-94df-fd057e94da2c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824584 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-config\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824627 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kpbl\" (UniqueName: \"kubernetes.io/projected/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-kube-api-access-6kpbl\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824694 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-dns-svc\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824719 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell1\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824826 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-nb\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 
09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824905 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824920 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824932 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.824943 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.825258 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell2\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.825335 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-sb\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.825460 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-config\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.825571 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1949b00a-6ec5-4c3e-94df-fd057e94da2c" (UID: "1949b00a-6ec5-4c3e-94df-fd057e94da2c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.825611 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-nb\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.826225 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-dns-svc\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.826526 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell1\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.827546 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1949b00a-6ec5-4c3e-94df-fd057e94da2c-kube-api-access-qb8v8" (OuterVolumeSpecName: "kube-api-access-qb8v8") pod "1949b00a-6ec5-4c3e-94df-fd057e94da2c" (UID: "1949b00a-6ec5-4c3e-94df-fd057e94da2c"). InnerVolumeSpecName "kube-api-access-qb8v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.851976 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kpbl\" (UniqueName: \"kubernetes.io/projected/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-kube-api-access-6kpbl\") pod \"dnsmasq-dns-84dcc59587-8hcjb\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.927050 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb8v8\" (UniqueName: \"kubernetes.io/projected/1949b00a-6ec5-4c3e-94df-fd057e94da2c-kube-api-access-qb8v8\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.927091 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1949b00a-6ec5-4c3e-94df-fd057e94da2c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:55 crc kubenswrapper[4947]: I1203 09:13:55.955207 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:56 crc kubenswrapper[4947]: I1203 09:13:56.482417 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84dcc59587-8hcjb"] Dec 03 09:13:56 crc kubenswrapper[4947]: I1203 09:13:56.700757 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" event={"ID":"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10","Type":"ContainerStarted","Data":"c6a9fdcb2bf082ac4b9ebbbe575d29566d10e68acf36566e8ef6e65f6c5c43fe"} Dec 03 09:13:56 crc kubenswrapper[4947]: I1203 09:13:56.700777 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-869d64bff9-hz8s9" Dec 03 09:13:56 crc kubenswrapper[4947]: I1203 09:13:56.814424 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-869d64bff9-hz8s9"] Dec 03 09:13:56 crc kubenswrapper[4947]: I1203 09:13:56.839955 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-869d64bff9-hz8s9"] Dec 03 09:13:57 crc kubenswrapper[4947]: I1203 09:13:57.042485 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-bn5wz"] Dec 03 09:13:57 crc kubenswrapper[4947]: I1203 09:13:57.053356 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-bn5wz"] Dec 03 09:13:57 crc kubenswrapper[4947]: I1203 09:13:57.084168 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:13:57 crc kubenswrapper[4947]: E1203 09:13:57.084429 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:13:57 crc kubenswrapper[4947]: I1203 09:13:57.100249 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1949b00a-6ec5-4c3e-94df-fd057e94da2c" path="/var/lib/kubelet/pods/1949b00a-6ec5-4c3e-94df-fd057e94da2c/volumes" Dec 03 09:13:57 crc kubenswrapper[4947]: I1203 09:13:57.100802 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ca62b55-e3c3-42b8-b98b-8147bef1e7db" path="/var/lib/kubelet/pods/6ca62b55-e3c3-42b8-b98b-8147bef1e7db/volumes" Dec 03 09:13:57 crc kubenswrapper[4947]: I1203 09:13:57.715032 4947 generic.go:334] "Generic (PLEG): container finished" podID="f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" containerID="9086326f4cb993c7a5af8a3267cb329c91e01917856a7307fce775403d1a6ecd" exitCode=0 Dec 03 09:13:57 crc kubenswrapper[4947]: I1203 09:13:57.715083 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" event={"ID":"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10","Type":"ContainerDied","Data":"9086326f4cb993c7a5af8a3267cb329c91e01917856a7307fce775403d1a6ecd"} Dec 03 09:13:58 crc kubenswrapper[4947]: I1203 09:13:58.036314 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c4e1-account-create-update-c52zf"] Dec 03 09:13:58 crc kubenswrapper[4947]: I1203 09:13:58.049254 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c4e1-account-create-update-c52zf"] Dec 03 09:13:58 crc kubenswrapper[4947]: I1203 09:13:58.729361 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" event={"ID":"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10","Type":"ContainerStarted","Data":"ed6032b1d2ba2dd111032c23dddb744c668dac3ef87634a752c31790a0fb65e1"} Dec 03 09:13:58 crc kubenswrapper[4947]: I1203 09:13:58.729535 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:13:58 crc kubenswrapper[4947]: I1203 09:13:58.759356 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" podStartSLOduration=3.759333476 podStartE2EDuration="3.759333476s" podCreationTimestamp="2025-12-03 09:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:13:58.74988461 +0000 UTC m=+8700.010839066" watchObservedRunningTime="2025-12-03 09:13:58.759333476 +0000 UTC m=+8700.020287922" Dec 03 09:13:59 crc kubenswrapper[4947]: I1203 09:13:59.102722 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="714cfec2-f79e-43ee-8b81-54bef9aae50f" path="/var/lib/kubelet/pods/714cfec2-f79e-43ee-8b81-54bef9aae50f/volumes" Dec 03 09:14:05 crc kubenswrapper[4947]: I1203 09:14:05.957626 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.033352 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d696ffc89-zrvbs"] Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.033644 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" podUID="fa5219b3-502b-445f-9b19-66acfef1f54f" containerName="dnsmasq-dns" containerID="cri-o://47e621da3b4dcd9289ff86e6ee9df7f7a8498228880c1aabd3c5360c42c6c304" gracePeriod=10 Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.250173 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" podUID="fa5219b3-502b-445f-9b19-66acfef1f54f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.139:5353: connect: connection refused" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.261961 4947 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-58649ffb79-hqgck"] Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.263725 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.290581 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58649ffb79-hqgck"] Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.360621 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvz58\" (UniqueName: \"kubernetes.io/projected/5d90b449-565e-4a74-a224-da372aec8475-kube-api-access-tvz58\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.360679 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-nb\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.360700 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-config\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.360748 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell1\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " 
pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.360815 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-sb\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.360832 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-dns-svc\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.360863 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell2\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.474693 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvz58\" (UniqueName: \"kubernetes.io/projected/5d90b449-565e-4a74-a224-da372aec8475-kube-api-access-tvz58\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.474750 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-nb\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: 
\"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.474777 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-config\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.474828 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell1\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.474899 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-sb\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.474918 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-dns-svc\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.474956 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell2\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " 
pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.475754 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell2\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.476515 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-nb\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.477010 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-config\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.477449 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-sb\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.477612 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell1\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 
09:14:06.483564 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-dns-svc\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.510226 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvz58\" (UniqueName: \"kubernetes.io/projected/5d90b449-565e-4a74-a224-da372aec8475-kube-api-access-tvz58\") pod \"dnsmasq-dns-58649ffb79-hqgck\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.613651 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.826608 4947 generic.go:334] "Generic (PLEG): container finished" podID="fa5219b3-502b-445f-9b19-66acfef1f54f" containerID="47e621da3b4dcd9289ff86e6ee9df7f7a8498228880c1aabd3c5360c42c6c304" exitCode=0 Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.826652 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" event={"ID":"fa5219b3-502b-445f-9b19-66acfef1f54f","Type":"ContainerDied","Data":"47e621da3b4dcd9289ff86e6ee9df7f7a8498228880c1aabd3c5360c42c6c304"} Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.826664 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.826677 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" event={"ID":"fa5219b3-502b-445f-9b19-66acfef1f54f","Type":"ContainerDied","Data":"031121cfd47d053897e34a24db6f2b7b9f421b3bf3a203983d7288a8fe5a3cd4"} Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.826690 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="031121cfd47d053897e34a24db6f2b7b9f421b3bf3a203983d7288a8fe5a3cd4" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.890103 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-dns-svc\") pod \"fa5219b3-502b-445f-9b19-66acfef1f54f\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.890549 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-config\") pod \"fa5219b3-502b-445f-9b19-66acfef1f54f\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.890621 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-nb\") pod \"fa5219b3-502b-445f-9b19-66acfef1f54f\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.890670 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-sb\") pod \"fa5219b3-502b-445f-9b19-66acfef1f54f\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " Dec 03 
09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.890784 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvl2h\" (UniqueName: \"kubernetes.io/projected/fa5219b3-502b-445f-9b19-66acfef1f54f-kube-api-access-jvl2h\") pod \"fa5219b3-502b-445f-9b19-66acfef1f54f\" (UID: \"fa5219b3-502b-445f-9b19-66acfef1f54f\") " Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.896297 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5219b3-502b-445f-9b19-66acfef1f54f-kube-api-access-jvl2h" (OuterVolumeSpecName: "kube-api-access-jvl2h") pod "fa5219b3-502b-445f-9b19-66acfef1f54f" (UID: "fa5219b3-502b-445f-9b19-66acfef1f54f"). InnerVolumeSpecName "kube-api-access-jvl2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.945130 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa5219b3-502b-445f-9b19-66acfef1f54f" (UID: "fa5219b3-502b-445f-9b19-66acfef1f54f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.953738 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-config" (OuterVolumeSpecName: "config") pod "fa5219b3-502b-445f-9b19-66acfef1f54f" (UID: "fa5219b3-502b-445f-9b19-66acfef1f54f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.956466 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa5219b3-502b-445f-9b19-66acfef1f54f" (UID: "fa5219b3-502b-445f-9b19-66acfef1f54f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.965079 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa5219b3-502b-445f-9b19-66acfef1f54f" (UID: "fa5219b3-502b-445f-9b19-66acfef1f54f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.993164 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.993198 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.993208 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvl2h\" (UniqueName: \"kubernetes.io/projected/fa5219b3-502b-445f-9b19-66acfef1f54f-kube-api-access-jvl2h\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.993221 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 
09:14:06 crc kubenswrapper[4947]: I1203 09:14:06.993229 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5219b3-502b-445f-9b19-66acfef1f54f-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:07 crc kubenswrapper[4947]: I1203 09:14:07.177789 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58649ffb79-hqgck"] Dec 03 09:14:07 crc kubenswrapper[4947]: I1203 09:14:07.838757 4947 generic.go:334] "Generic (PLEG): container finished" podID="5d90b449-565e-4a74-a224-da372aec8475" containerID="5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f" exitCode=0 Dec 03 09:14:07 crc kubenswrapper[4947]: I1203 09:14:07.839120 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d696ffc89-zrvbs" Dec 03 09:14:07 crc kubenswrapper[4947]: I1203 09:14:07.838941 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" event={"ID":"5d90b449-565e-4a74-a224-da372aec8475","Type":"ContainerDied","Data":"5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f"} Dec 03 09:14:07 crc kubenswrapper[4947]: I1203 09:14:07.839162 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" event={"ID":"5d90b449-565e-4a74-a224-da372aec8475","Type":"ContainerStarted","Data":"f0af49684f408eb3d4cf4284f091b8d30ffa82986d8b814d50a2aebe8d666292"} Dec 03 09:14:07 crc kubenswrapper[4947]: I1203 09:14:07.887085 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d696ffc89-zrvbs"] Dec 03 09:14:07 crc kubenswrapper[4947]: I1203 09:14:07.899629 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d696ffc89-zrvbs"] Dec 03 09:14:08 crc kubenswrapper[4947]: I1203 09:14:08.855387 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" 
event={"ID":"5d90b449-565e-4a74-a224-da372aec8475","Type":"ContainerStarted","Data":"7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890"} Dec 03 09:14:08 crc kubenswrapper[4947]: I1203 09:14:08.855920 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:08 crc kubenswrapper[4947]: I1203 09:14:08.889027 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" podStartSLOduration=2.889002257 podStartE2EDuration="2.889002257s" podCreationTimestamp="2025-12-03 09:14:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:14:08.874271688 +0000 UTC m=+8710.135226154" watchObservedRunningTime="2025-12-03 09:14:08.889002257 +0000 UTC m=+8710.149956683" Dec 03 09:14:09 crc kubenswrapper[4947]: I1203 09:14:09.095174 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5219b3-502b-445f-9b19-66acfef1f54f" path="/var/lib/kubelet/pods/fa5219b3-502b-445f-9b19-66acfef1f54f/volumes" Dec 03 09:14:11 crc kubenswrapper[4947]: I1203 09:14:11.083765 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:14:11 crc kubenswrapper[4947]: E1203 09:14:11.084102 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.615706 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.716407 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84dcc59587-8hcjb"] Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.716721 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" podUID="f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" containerName="dnsmasq-dns" containerID="cri-o://ed6032b1d2ba2dd111032c23dddb744c668dac3ef87634a752c31790a0fb65e1" gracePeriod=10 Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.890554 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7558866fff-lkt2j"] Dec 03 09:14:16 crc kubenswrapper[4947]: E1203 09:14:16.891409 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5219b3-502b-445f-9b19-66acfef1f54f" containerName="dnsmasq-dns" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.891428 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5219b3-502b-445f-9b19-66acfef1f54f" containerName="dnsmasq-dns" Dec 03 09:14:16 crc kubenswrapper[4947]: E1203 09:14:16.891462 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5219b3-502b-445f-9b19-66acfef1f54f" containerName="init" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.891471 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5219b3-502b-445f-9b19-66acfef1f54f" containerName="init" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.891816 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5219b3-502b-445f-9b19-66acfef1f54f" containerName="dnsmasq-dns" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.893337 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.903085 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7558866fff-lkt2j"] Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.967795 4947 generic.go:334] "Generic (PLEG): container finished" podID="f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" containerID="ed6032b1d2ba2dd111032c23dddb744c668dac3ef87634a752c31790a0fb65e1" exitCode=0 Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.967840 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" event={"ID":"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10","Type":"ContainerDied","Data":"ed6032b1d2ba2dd111032c23dddb744c668dac3ef87634a752c31790a0fb65e1"} Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.996163 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-openstack-cell1\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.996381 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-dns-svc\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.996542 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-openstack-cell2\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " 
pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.996638 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mtcb\" (UniqueName: \"kubernetes.io/projected/6a4c0c95-ce41-4f47-8900-52da56018f73-kube-api-access-8mtcb\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.996707 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-ovsdbserver-sb\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.996832 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-config\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:16 crc kubenswrapper[4947]: I1203 09:14:16.996928 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-ovsdbserver-nb\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.099587 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-openstack-cell2\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: 
\"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.099651 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mtcb\" (UniqueName: \"kubernetes.io/projected/6a4c0c95-ce41-4f47-8900-52da56018f73-kube-api-access-8mtcb\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.099678 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-ovsdbserver-sb\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.099758 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-config\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.099793 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-ovsdbserver-nb\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.099852 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-openstack-cell1\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " 
pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.099885 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-dns-svc\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.100974 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-openstack-cell2\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.101944 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-ovsdbserver-nb\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.102115 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-openstack-cell1\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.102451 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-ovsdbserver-sb\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 
09:14:17.104041 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-dns-svc\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.104202 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a4c0c95-ce41-4f47-8900-52da56018f73-config\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.134544 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mtcb\" (UniqueName: \"kubernetes.io/projected/6a4c0c95-ce41-4f47-8900-52da56018f73-kube-api-access-8mtcb\") pod \"dnsmasq-dns-7558866fff-lkt2j\" (UID: \"6a4c0c95-ce41-4f47-8900-52da56018f73\") " pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.218616 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.374913 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.416323 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell1\") pod \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.416468 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-dns-svc\") pod \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.416536 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-sb\") pod \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.416588 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-nb\") pod \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.416724 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell2\") pod \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.416743 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6kpbl\" (UniqueName: \"kubernetes.io/projected/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-kube-api-access-6kpbl\") pod \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.416779 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-config\") pod \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\" (UID: \"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10\") " Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.426040 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-kube-api-access-6kpbl" (OuterVolumeSpecName: "kube-api-access-6kpbl") pod "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" (UID: "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10"). InnerVolumeSpecName "kube-api-access-6kpbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.479624 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-config" (OuterVolumeSpecName: "config") pod "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" (UID: "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.481334 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" (UID: "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.497807 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" (UID: "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.503154 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" (UID: "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.519106 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.519142 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.519156 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.519168 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kpbl\" (UniqueName: \"kubernetes.io/projected/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-kube-api-access-6kpbl\") on node \"crc\" DevicePath \"\"" Dec 03 
09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.519179 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.528070 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" (UID: "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.542712 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell2" (OuterVolumeSpecName: "openstack-cell2") pod "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" (UID: "f2d29526-8adf-47f0-bdaf-0cda2ac1fd10"). InnerVolumeSpecName "openstack-cell2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.621795 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell2\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.621847 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:17 crc kubenswrapper[4947]: W1203 09:14:17.786310 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a4c0c95_ce41_4f47_8900_52da56018f73.slice/crio-61890532b475d782612800bd717edae8894d251bad7dbf9a2109bdbc2af14856 WatchSource:0}: Error finding container 61890532b475d782612800bd717edae8894d251bad7dbf9a2109bdbc2af14856: Status 404 returned error can't find the container with id 61890532b475d782612800bd717edae8894d251bad7dbf9a2109bdbc2af14856 Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.792675 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7558866fff-lkt2j"] Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.979432 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7558866fff-lkt2j" event={"ID":"6a4c0c95-ce41-4f47-8900-52da56018f73","Type":"ContainerStarted","Data":"61890532b475d782612800bd717edae8894d251bad7dbf9a2109bdbc2af14856"} Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.981586 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" event={"ID":"f2d29526-8adf-47f0-bdaf-0cda2ac1fd10","Type":"ContainerDied","Data":"c6a9fdcb2bf082ac4b9ebbbe575d29566d10e68acf36566e8ef6e65f6c5c43fe"} Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.981634 4947 
scope.go:117] "RemoveContainer" containerID="ed6032b1d2ba2dd111032c23dddb744c668dac3ef87634a752c31790a0fb65e1" Dec 03 09:14:17 crc kubenswrapper[4947]: I1203 09:14:17.981658 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84dcc59587-8hcjb" Dec 03 09:14:18 crc kubenswrapper[4947]: I1203 09:14:18.006601 4947 scope.go:117] "RemoveContainer" containerID="9086326f4cb993c7a5af8a3267cb329c91e01917856a7307fce775403d1a6ecd" Dec 03 09:14:18 crc kubenswrapper[4947]: I1203 09:14:18.029871 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84dcc59587-8hcjb"] Dec 03 09:14:18 crc kubenswrapper[4947]: I1203 09:14:18.040828 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84dcc59587-8hcjb"] Dec 03 09:14:19 crc kubenswrapper[4947]: I1203 09:14:19.042123 4947 generic.go:334] "Generic (PLEG): container finished" podID="6a4c0c95-ce41-4f47-8900-52da56018f73" containerID="3a7c62d98489adcd68601a9d6aec0bab7cb955cfd62b3b6f99a8d03519582b57" exitCode=0 Dec 03 09:14:19 crc kubenswrapper[4947]: I1203 09:14:19.042458 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7558866fff-lkt2j" event={"ID":"6a4c0c95-ce41-4f47-8900-52da56018f73","Type":"ContainerDied","Data":"3a7c62d98489adcd68601a9d6aec0bab7cb955cfd62b3b6f99a8d03519582b57"} Dec 03 09:14:19 crc kubenswrapper[4947]: I1203 09:14:19.115668 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" path="/var/lib/kubelet/pods/f2d29526-8adf-47f0-bdaf-0cda2ac1fd10/volumes" Dec 03 09:14:20 crc kubenswrapper[4947]: I1203 09:14:20.053485 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7558866fff-lkt2j" event={"ID":"6a4c0c95-ce41-4f47-8900-52da56018f73","Type":"ContainerStarted","Data":"6f1ca88c5b48d7d88b4ba4ccc0af3ec3a875360328836ea388e303637e8e8f2f"} Dec 03 09:14:20 crc kubenswrapper[4947]: I1203 
09:14:20.055070 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:20 crc kubenswrapper[4947]: I1203 09:14:20.079338 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7558866fff-lkt2j" podStartSLOduration=4.079317747 podStartE2EDuration="4.079317747s" podCreationTimestamp="2025-12-03 09:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:14:20.076972504 +0000 UTC m=+8721.337926930" watchObservedRunningTime="2025-12-03 09:14:20.079317747 +0000 UTC m=+8721.340272173" Dec 03 09:14:23 crc kubenswrapper[4947]: I1203 09:14:23.055871 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hk2ws"] Dec 03 09:14:23 crc kubenswrapper[4947]: I1203 09:14:23.069829 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hk2ws"] Dec 03 09:14:23 crc kubenswrapper[4947]: I1203 09:14:23.109189 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e13f5e-62e2-4404-90a8-e5d7e47322c6" path="/var/lib/kubelet/pods/b3e13f5e-62e2-4404-90a8-e5d7e47322c6/volumes" Dec 03 09:14:26 crc kubenswrapper[4947]: I1203 09:14:26.083683 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:14:26 crc kubenswrapper[4947]: E1203 09:14:26.084902 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.053209 
4947 scope.go:117] "RemoveContainer" containerID="4b3e08d8388cbeb73d533373158c34d6309e2693962201be40d8c493aaea17b7" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.078897 4947 scope.go:117] "RemoveContainer" containerID="47e621da3b4dcd9289ff86e6ee9df7f7a8498228880c1aabd3c5360c42c6c304" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.135108 4947 scope.go:117] "RemoveContainer" containerID="1abf4f790890780fb413bacc27444951ba9fcab5393acbf407742712eb215b0a" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.157058 4947 scope.go:117] "RemoveContainer" containerID="9fb32a3c79132b0ef0344e7f97774b458f0665680409c28228b8bbf1d2f20e98" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.220280 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7558866fff-lkt2j" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.290738 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58649ffb79-hqgck"] Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.290999 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" podUID="5d90b449-565e-4a74-a224-da372aec8475" containerName="dnsmasq-dns" containerID="cri-o://7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890" gracePeriod=10 Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.437773 4947 scope.go:117] "RemoveContainer" containerID="5b91fbae315a36893d9f232fc5068a2d1beb120078190e2a1dbd05cb12668420" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.544064 4947 scope.go:117] "RemoveContainer" containerID="32f43fd9f17ce976e5b9032aaaf005518a768ffed6edadcf037c3da77372c212" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.586740 4947 scope.go:117] "RemoveContainer" containerID="0ce794639d5e990a72f8ffebf429004823b413040d15118b9d81920f28f10704" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.782636 4947 scope.go:117] "RemoveContainer" 
containerID="3e0f99cf5d76014a4b250d0b2d0992085060204ff2350124a2c18cfedde0ecee" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.857915 4947 scope.go:117] "RemoveContainer" containerID="af483dbc11a06b127181b4ec8975febdf31408555cc20de47bd159ec4139f7dd" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.859013 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.971430 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5"] Dec 03 09:14:27 crc kubenswrapper[4947]: E1203 09:14:27.972366 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" containerName="dnsmasq-dns" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.972408 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" containerName="dnsmasq-dns" Dec 03 09:14:27 crc kubenswrapper[4947]: E1203 09:14:27.972424 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" containerName="init" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.972431 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" containerName="init" Dec 03 09:14:27 crc kubenswrapper[4947]: E1203 09:14:27.972636 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d90b449-565e-4a74-a224-da372aec8475" containerName="dnsmasq-dns" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.972645 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d90b449-565e-4a74-a224-da372aec8475" containerName="dnsmasq-dns" Dec 03 09:14:27 crc kubenswrapper[4947]: E1203 09:14:27.972667 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d90b449-565e-4a74-a224-da372aec8475" containerName="init" Dec 03 
09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.972685 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d90b449-565e-4a74-a224-da372aec8475" containerName="init" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.972908 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d90b449-565e-4a74-a224-da372aec8475" containerName="dnsmasq-dns" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.972935 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2d29526-8adf-47f0-bdaf-0cda2ac1fd10" containerName="dnsmasq-dns" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.974420 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.983291 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.983331 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.983374 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.983464 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.985901 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-nb\") pod \"5d90b449-565e-4a74-a224-da372aec8475\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.985935 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-sb\") pod \"5d90b449-565e-4a74-a224-da372aec8475\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.986148 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvz58\" (UniqueName: \"kubernetes.io/projected/5d90b449-565e-4a74-a224-da372aec8475-kube-api-access-tvz58\") pod \"5d90b449-565e-4a74-a224-da372aec8475\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.986216 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell1\") pod \"5d90b449-565e-4a74-a224-da372aec8475\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.986283 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-dns-svc\") pod \"5d90b449-565e-4a74-a224-da372aec8475\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.986343 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-config\") pod \"5d90b449-565e-4a74-a224-da372aec8475\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " Dec 03 09:14:27 crc kubenswrapper[4947]: I1203 09:14:27.986397 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell2\") pod \"5d90b449-565e-4a74-a224-da372aec8475\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " Dec 03 09:14:27 crc 
kubenswrapper[4947]: I1203 09:14:27.994676 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d90b449-565e-4a74-a224-da372aec8475-kube-api-access-tvz58" (OuterVolumeSpecName: "kube-api-access-tvz58") pod "5d90b449-565e-4a74-a224-da372aec8475" (UID: "5d90b449-565e-4a74-a224-da372aec8475"). InnerVolumeSpecName "kube-api-access-tvz58". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.006996 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql"] Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.010312 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.012631 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.012818 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.040712 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5"] Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.055107 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql"] Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.056416 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "5d90b449-565e-4a74-a224-da372aec8475" (UID: "5d90b449-565e-4a74-a224-da372aec8475"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.071601 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5d90b449-565e-4a74-a224-da372aec8475" (UID: "5d90b449-565e-4a74-a224-da372aec8475"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.083083 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5d90b449-565e-4a74-a224-da372aec8475" (UID: "5d90b449-565e-4a74-a224-da372aec8475"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.083537 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell2" (OuterVolumeSpecName: "openstack-cell2") pod "5d90b449-565e-4a74-a224-da372aec8475" (UID: "5d90b449-565e-4a74-a224-da372aec8475"). InnerVolumeSpecName "openstack-cell2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.085916 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-config" (OuterVolumeSpecName: "config") pod "5d90b449-565e-4a74-a224-da372aec8475" (UID: "5d90b449-565e-4a74-a224-da372aec8475"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.087444 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d90b449-565e-4a74-a224-da372aec8475" (UID: "5d90b449-565e-4a74-a224-da372aec8475"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.087932 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-dns-svc\") pod \"5d90b449-565e-4a74-a224-da372aec8475\" (UID: \"5d90b449-565e-4a74-a224-da372aec8475\") " Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088184 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088259 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088319 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsmbd\" (UniqueName: 
\"kubernetes.io/projected/6b81d648-89b9-4c0b-b312-4a667b806f59-kube-api-access-qsmbd\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088396 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088458 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvz58\" (UniqueName: \"kubernetes.io/projected/5d90b449-565e-4a74-a224-da372aec8475-kube-api-access-tvz58\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088472 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088481 4947 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-config\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088502 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-cell2\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-openstack-cell2\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088510 4947 reconciler_common.go:293] "Volume 
detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088517 4947 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:28 crc kubenswrapper[4947]: W1203 09:14:28.088627 4947 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5d90b449-565e-4a74-a224-da372aec8475/volumes/kubernetes.io~configmap/dns-svc Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.088638 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d90b449-565e-4a74-a224-da372aec8475" (UID: "5d90b449-565e-4a74-a224-da372aec8475"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.155564 4947 generic.go:334] "Generic (PLEG): container finished" podID="5d90b449-565e-4a74-a224-da372aec8475" containerID="7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890" exitCode=0 Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.155620 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" event={"ID":"5d90b449-565e-4a74-a224-da372aec8475","Type":"ContainerDied","Data":"7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890"} Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.155674 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" event={"ID":"5d90b449-565e-4a74-a224-da372aec8475","Type":"ContainerDied","Data":"f0af49684f408eb3d4cf4284f091b8d30ffa82986d8b814d50a2aebe8d666292"} Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.155733 4947 scope.go:117] "RemoveContainer" containerID="7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.156174 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58649ffb79-hqgck" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.191843 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsmbd\" (UniqueName: \"kubernetes.io/projected/6b81d648-89b9-4c0b-b312-4a667b806f59-kube-api-access-qsmbd\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.192060 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.192298 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.192383 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.192467 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.192647 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nn8g\" (UniqueName: \"kubernetes.io/projected/7d2e53d2-0d5b-4895-802b-83538cc2fb92-kube-api-access-6nn8g\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.192715 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.192789 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.192975 4947 
scope.go:117] "RemoveContainer" containerID="5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.193255 4947 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d90b449-565e-4a74-a224-da372aec8475-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.195760 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58649ffb79-hqgck"] Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.197059 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.200527 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.201443 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.207944 4947 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/dnsmasq-dns-58649ffb79-hqgck"] Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.211191 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsmbd\" (UniqueName: \"kubernetes.io/projected/6b81d648-89b9-4c0b-b312-4a667b806f59-kube-api-access-qsmbd\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.294595 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.294855 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.295017 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.295138 
4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nn8g\" (UniqueName: \"kubernetes.io/projected/7d2e53d2-0d5b-4895-802b-83538cc2fb92-kube-api-access-6nn8g\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.298504 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.311258 4947 scope.go:117] "RemoveContainer" containerID="7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.312122 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: E1203 09:14:28.312342 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890\": container with ID starting with 7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890 not found: ID does not exist" containerID="7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.312365 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.312412 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890"} err="failed to get container status \"7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890\": rpc error: code = NotFound desc = could not find container \"7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890\": container with ID starting with 7a24eda2119f6029e66f4aa63a0c6778b2912af12235954aaeac81938a722890 not found: ID does not exist" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.312439 4947 scope.go:117] "RemoveContainer" containerID="5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f" Dec 03 09:14:28 crc kubenswrapper[4947]: E1203 09:14:28.312865 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f\": container with ID starting with 5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f not found: ID does not exist" containerID="5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.312914 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f"} err="failed to get container status 
\"5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f\": rpc error: code = NotFound desc = could not find container \"5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f\": container with ID starting with 5512727d1155edce5cb0928fa70d72e5c502f7c2edafe32ec0d0e2ece626400f not found: ID does not exist" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.313359 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.314185 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nn8g\" (UniqueName: \"kubernetes.io/projected/7d2e53d2-0d5b-4895-802b-83538cc2fb92-kube-api-access-6nn8g\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:28 crc kubenswrapper[4947]: I1203 09:14:28.335565 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:29 crc kubenswrapper[4947]: I1203 09:14:29.040055 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql"] Dec 03 09:14:29 crc kubenswrapper[4947]: I1203 09:14:29.098830 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d90b449-565e-4a74-a224-da372aec8475" path="/var/lib/kubelet/pods/5d90b449-565e-4a74-a224-da372aec8475/volumes" Dec 03 09:14:29 crc kubenswrapper[4947]: I1203 09:14:29.183775 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" event={"ID":"7d2e53d2-0d5b-4895-802b-83538cc2fb92","Type":"ContainerStarted","Data":"8769ba24b155038d98bf52ac78539a1dcd891971c17ccb1af1923adbf3fdd192"} Dec 03 09:14:29 crc kubenswrapper[4947]: I1203 09:14:29.670754 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5"] Dec 03 09:14:30 crc kubenswrapper[4947]: I1203 09:14:30.206693 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" event={"ID":"6b81d648-89b9-4c0b-b312-4a667b806f59","Type":"ContainerStarted","Data":"5daadbbb078288b141102b66461210032c0099e9e2f356166aae4e38123b9600"} Dec 03 09:14:39 crc kubenswrapper[4947]: I1203 09:14:39.741366 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="7cf44fd8-460f-4a2e-8e17-4b34f5158c24" containerName="galera" probeResult="failure" output="command timed out" Dec 03 09:14:39 crc kubenswrapper[4947]: I1203 09:14:39.746941 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="7cf44fd8-460f-4a2e-8e17-4b34f5158c24" containerName="galera" probeResult="failure" output="command timed out" 
Dec 03 09:14:40 crc kubenswrapper[4947]: I1203 09:14:40.083345 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:14:41 crc kubenswrapper[4947]: I1203 09:14:41.340464 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" event={"ID":"6b81d648-89b9-4c0b-b312-4a667b806f59","Type":"ContainerStarted","Data":"a1e2c328924d76d4dff7f4535d9c6bdd5e6e1f0bae9ab1955fcf57dd0bc0a6aa"} Dec 03 09:14:41 crc kubenswrapper[4947]: I1203 09:14:41.343617 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"eba6635ac29e19bd98a8b26677bfa21b381fc8b93839c29c2b536e3d5e16b6c4"} Dec 03 09:14:41 crc kubenswrapper[4947]: I1203 09:14:41.345201 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" event={"ID":"7d2e53d2-0d5b-4895-802b-83538cc2fb92","Type":"ContainerStarted","Data":"d183549d2209a712d6d89d7fc278b882d26d7721cac3e2e8d28548992bb137d8"} Dec 03 09:14:41 crc kubenswrapper[4947]: I1203 09:14:41.366641 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" podStartSLOduration=3.20681771 podStartE2EDuration="14.366624807s" podCreationTimestamp="2025-12-03 09:14:27 +0000 UTC" firstStartedPulling="2025-12-03 09:14:29.699410256 +0000 UTC m=+8730.960364682" lastFinishedPulling="2025-12-03 09:14:40.859217353 +0000 UTC m=+8742.120171779" observedRunningTime="2025-12-03 09:14:41.354605052 +0000 UTC m=+8742.615559478" watchObservedRunningTime="2025-12-03 09:14:41.366624807 +0000 UTC m=+8742.627579233" Dec 03 09:14:41 crc kubenswrapper[4947]: I1203 09:14:41.395443 4947 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" podStartSLOduration=2.536242782 podStartE2EDuration="14.395422685s" podCreationTimestamp="2025-12-03 09:14:27 +0000 UTC" firstStartedPulling="2025-12-03 09:14:29.041879281 +0000 UTC m=+8730.302833707" lastFinishedPulling="2025-12-03 09:14:40.901059184 +0000 UTC m=+8742.162013610" observedRunningTime="2025-12-03 09:14:41.388484548 +0000 UTC m=+8742.649438994" watchObservedRunningTime="2025-12-03 09:14:41.395422685 +0000 UTC m=+8742.656377111" Dec 03 09:14:51 crc kubenswrapper[4947]: I1203 09:14:51.439940 4947 generic.go:334] "Generic (PLEG): container finished" podID="7d2e53d2-0d5b-4895-802b-83538cc2fb92" containerID="d183549d2209a712d6d89d7fc278b882d26d7721cac3e2e8d28548992bb137d8" exitCode=0 Dec 03 09:14:51 crc kubenswrapper[4947]: I1203 09:14:51.440509 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" event={"ID":"7d2e53d2-0d5b-4895-802b-83538cc2fb92","Type":"ContainerDied","Data":"d183549d2209a712d6d89d7fc278b882d26d7721cac3e2e8d28548992bb137d8"} Dec 03 09:14:51 crc kubenswrapper[4947]: I1203 09:14:51.443131 4947 generic.go:334] "Generic (PLEG): container finished" podID="6b81d648-89b9-4c0b-b312-4a667b806f59" containerID="a1e2c328924d76d4dff7f4535d9c6bdd5e6e1f0bae9ab1955fcf57dd0bc0a6aa" exitCode=0 Dec 03 09:14:51 crc kubenswrapper[4947]: I1203 09:14:51.443163 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" event={"ID":"6b81d648-89b9-4c0b-b312-4a667b806f59","Type":"ContainerDied","Data":"a1e2c328924d76d4dff7f4535d9c6bdd5e6e1f0bae9ab1955fcf57dd0bc0a6aa"} Dec 03 09:14:52 crc kubenswrapper[4947]: I1203 09:14:52.943542 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:52 crc kubenswrapper[4947]: I1203 09:14:52.993766 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-ssh-key\") pod \"6b81d648-89b9-4c0b-b312-4a667b806f59\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " Dec 03 09:14:52 crc kubenswrapper[4947]: I1203 09:14:52.993831 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-pre-adoption-validation-combined-ca-bundle\") pod \"6b81d648-89b9-4c0b-b312-4a667b806f59\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " Dec 03 09:14:52 crc kubenswrapper[4947]: I1203 09:14:52.993893 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsmbd\" (UniqueName: \"kubernetes.io/projected/6b81d648-89b9-4c0b-b312-4a667b806f59-kube-api-access-qsmbd\") pod \"6b81d648-89b9-4c0b-b312-4a667b806f59\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " Dec 03 09:14:52 crc kubenswrapper[4947]: I1203 09:14:52.993987 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-inventory\") pod \"6b81d648-89b9-4c0b-b312-4a667b806f59\" (UID: \"6b81d648-89b9-4c0b-b312-4a667b806f59\") " Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.000261 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "6b81d648-89b9-4c0b-b312-4a667b806f59" (UID: "6b81d648-89b9-4c0b-b312-4a667b806f59"). 
InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.002601 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b81d648-89b9-4c0b-b312-4a667b806f59-kube-api-access-qsmbd" (OuterVolumeSpecName: "kube-api-access-qsmbd") pod "6b81d648-89b9-4c0b-b312-4a667b806f59" (UID: "6b81d648-89b9-4c0b-b312-4a667b806f59"). InnerVolumeSpecName "kube-api-access-qsmbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.022155 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-inventory" (OuterVolumeSpecName: "inventory") pod "6b81d648-89b9-4c0b-b312-4a667b806f59" (UID: "6b81d648-89b9-4c0b-b312-4a667b806f59"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.029890 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b81d648-89b9-4c0b-b312-4a667b806f59" (UID: "6b81d648-89b9-4c0b-b312-4a667b806f59"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.096723 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.096755 4947 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.096767 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsmbd\" (UniqueName: \"kubernetes.io/projected/6b81d648-89b9-4c0b-b312-4a667b806f59-kube-api-access-qsmbd\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.096779 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b81d648-89b9-4c0b-b312-4a667b806f59-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.475176 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" event={"ID":"6b81d648-89b9-4c0b-b312-4a667b806f59","Type":"ContainerDied","Data":"5daadbbb078288b141102b66461210032c0099e9e2f356166aae4e38123b9600"} Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.475225 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.475226 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5daadbbb078288b141102b66461210032c0099e9e2f356166aae4e38123b9600" Dec 03 09:14:53 crc kubenswrapper[4947]: I1203 09:14:53.919327 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.024360 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-ssh-key\") pod \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.024427 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-pre-adoption-validation-combined-ca-bundle\") pod \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.024457 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-inventory\") pod \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") " Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.024532 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nn8g\" (UniqueName: \"kubernetes.io/projected/7d2e53d2-0d5b-4895-802b-83538cc2fb92-kube-api-access-6nn8g\") pod \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\" (UID: \"7d2e53d2-0d5b-4895-802b-83538cc2fb92\") 
" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.029628 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "7d2e53d2-0d5b-4895-802b-83538cc2fb92" (UID: "7d2e53d2-0d5b-4895-802b-83538cc2fb92"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.029909 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d2e53d2-0d5b-4895-802b-83538cc2fb92-kube-api-access-6nn8g" (OuterVolumeSpecName: "kube-api-access-6nn8g") pod "7d2e53d2-0d5b-4895-802b-83538cc2fb92" (UID: "7d2e53d2-0d5b-4895-802b-83538cc2fb92"). InnerVolumeSpecName "kube-api-access-6nn8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.053156 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-mbcvh"] Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.066404 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7d2e53d2-0d5b-4895-802b-83538cc2fb92" (UID: "7d2e53d2-0d5b-4895-802b-83538cc2fb92"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.068342 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ef07-account-create-update-vlczq"] Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.073057 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-inventory" (OuterVolumeSpecName: "inventory") pod "7d2e53d2-0d5b-4895-802b-83538cc2fb92" (UID: "7d2e53d2-0d5b-4895-802b-83538cc2fb92"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.081705 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-mbcvh"] Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.095619 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ef07-account-create-update-vlczq"] Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.126976 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.127000 4947 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.127011 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7d2e53d2-0d5b-4895-802b-83538cc2fb92-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.127022 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nn8g\" (UniqueName: 
\"kubernetes.io/projected/7d2e53d2-0d5b-4895-802b-83538cc2fb92-kube-api-access-6nn8g\") on node \"crc\" DevicePath \"\"" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.487590 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" event={"ID":"7d2e53d2-0d5b-4895-802b-83538cc2fb92","Type":"ContainerDied","Data":"8769ba24b155038d98bf52ac78539a1dcd891971c17ccb1af1923adbf3fdd192"} Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.487905 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8769ba24b155038d98bf52ac78539a1dcd891971c17ccb1af1923adbf3fdd192" Dec 03 09:14:54 crc kubenswrapper[4947]: I1203 09:14:54.487647 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql" Dec 03 09:14:55 crc kubenswrapper[4947]: I1203 09:14:55.094647 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f4b85c2-6f17-4b45-adaf-7600a962658a" path="/var/lib/kubelet/pods/8f4b85c2-6f17-4b45-adaf-7600a962658a/volumes" Dec 03 09:14:55 crc kubenswrapper[4947]: I1203 09:14:55.095353 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8da5d95-8a33-43fe-b1a1-23888f1c8a3b" path="/var/lib/kubelet/pods/d8da5d95-8a33-43fe-b1a1-23888f1c8a3b/volumes" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.155455 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp"] Dec 03 09:15:00 crc kubenswrapper[4947]: E1203 09:15:00.156424 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b81d648-89b9-4c0b-b312-4a667b806f59" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell2" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.156442 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6b81d648-89b9-4c0b-b312-4a667b806f59" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell2" Dec 03 09:15:00 crc kubenswrapper[4947]: E1203 09:15:00.156464 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d2e53d2-0d5b-4895-802b-83538cc2fb92" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.156473 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d2e53d2-0d5b-4895-802b-83538cc2fb92" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.156890 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d2e53d2-0d5b-4895-802b-83538cc2fb92" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.156913 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b81d648-89b9-4c0b-b312-4a667b806f59" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell2" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.157787 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.163060 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.164053 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.173693 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp"] Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.357862 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/817dee6d-50b1-47f2-84f3-d9be2e542a20-secret-volume\") pod \"collect-profiles-29412555-c6ggp\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.358149 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/817dee6d-50b1-47f2-84f3-d9be2e542a20-config-volume\") pod \"collect-profiles-29412555-c6ggp\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.358237 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt82b\" (UniqueName: \"kubernetes.io/projected/817dee6d-50b1-47f2-84f3-d9be2e542a20-kube-api-access-mt82b\") pod \"collect-profiles-29412555-c6ggp\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.459764 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/817dee6d-50b1-47f2-84f3-d9be2e542a20-config-volume\") pod \"collect-profiles-29412555-c6ggp\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.459805 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt82b\" (UniqueName: \"kubernetes.io/projected/817dee6d-50b1-47f2-84f3-d9be2e542a20-kube-api-access-mt82b\") pod \"collect-profiles-29412555-c6ggp\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.459872 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/817dee6d-50b1-47f2-84f3-d9be2e542a20-secret-volume\") pod \"collect-profiles-29412555-c6ggp\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.460550 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/817dee6d-50b1-47f2-84f3-d9be2e542a20-config-volume\") pod \"collect-profiles-29412555-c6ggp\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.480915 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/817dee6d-50b1-47f2-84f3-d9be2e542a20-secret-volume\") pod \"collect-profiles-29412555-c6ggp\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.482667 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt82b\" (UniqueName: \"kubernetes.io/projected/817dee6d-50b1-47f2-84f3-d9be2e542a20-kube-api-access-mt82b\") pod \"collect-profiles-29412555-c6ggp\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.489816 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.835946 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l"] Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.841295 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.846089 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.846349 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.846384 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.846366 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.856339 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr"] Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.858246 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.873853 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.873879 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.901667 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdlc\" (UniqueName: \"kubernetes.io/projected/0c3e5b3c-7650-4636-963d-a108c90aab4c-kube-api-access-bqdlc\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.901894 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.901958 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.902050 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.907173 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l"] Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.921648 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr"] Dec 03 09:15:00 crc kubenswrapper[4947]: I1203 09:15:00.992854 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp"] Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.003829 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.003899 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.003932 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.003978 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.004038 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp6sp\" (UniqueName: \"kubernetes.io/projected/1033702d-9d8f-40b5-8435-2e4cb6850bed-kube-api-access-cp6sp\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.004068 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.004112 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdlc\" (UniqueName: \"kubernetes.io/projected/0c3e5b3c-7650-4636-963d-a108c90aab4c-kube-api-access-bqdlc\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.004195 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.106779 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.106970 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.107120 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 
09:15:01.107266 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp6sp\" (UniqueName: \"kubernetes.io/projected/1033702d-9d8f-40b5-8435-2e4cb6850bed-kube-api-access-cp6sp\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.177744 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.178177 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.180147 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdlc\" (UniqueName: \"kubernetes.io/projected/0c3e5b3c-7650-4636-963d-a108c90aab4c-kube-api-access-bqdlc\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.180184 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.180260 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.181008 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.182961 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp6sp\" (UniqueName: \"kubernetes.io/projected/1033702d-9d8f-40b5-8435-2e4cb6850bed-kube-api-access-cp6sp\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: W1203 09:15:01.189007 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817dee6d_50b1_47f2_84f3_d9be2e542a20.slice/crio-b4606b7a1fcb382eb71315ed1cff5c4a2bd58a0ad354fbe3af015bd0c6ffff03 WatchSource:0}: Error finding container b4606b7a1fcb382eb71315ed1cff5c4a2bd58a0ad354fbe3af015bd0c6ffff03: Status 404 returned error can't 
find the container with id b4606b7a1fcb382eb71315ed1cff5c4a2bd58a0ad354fbe3af015bd0c6ffff03 Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.198430 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.218646 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.231737 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" Dec 03 09:15:01 crc kubenswrapper[4947]: I1203 09:15:01.561753 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" event={"ID":"817dee6d-50b1-47f2-84f3-d9be2e542a20","Type":"ContainerStarted","Data":"b4606b7a1fcb382eb71315ed1cff5c4a2bd58a0ad354fbe3af015bd0c6ffff03"} Dec 03 09:15:02 crc kubenswrapper[4947]: I1203 09:15:02.154197 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr"] Dec 03 09:15:02 crc kubenswrapper[4947]: I1203 09:15:02.257912 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l"] Dec 03 09:15:02 crc kubenswrapper[4947]: W1203 09:15:02.261581 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1033702d_9d8f_40b5_8435_2e4cb6850bed.slice/crio-dcf6ef8e9923e9944a7c9c6acbfaddc055ce276599c1dd4d5ec750d77998a5a1 WatchSource:0}: Error finding 
container dcf6ef8e9923e9944a7c9c6acbfaddc055ce276599c1dd4d5ec750d77998a5a1: Status 404 returned error can't find the container with id dcf6ef8e9923e9944a7c9c6acbfaddc055ce276599c1dd4d5ec750d77998a5a1 Dec 03 09:15:02 crc kubenswrapper[4947]: I1203 09:15:02.576606 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" event={"ID":"1033702d-9d8f-40b5-8435-2e4cb6850bed","Type":"ContainerStarted","Data":"dcf6ef8e9923e9944a7c9c6acbfaddc055ce276599c1dd4d5ec750d77998a5a1"} Dec 03 09:15:02 crc kubenswrapper[4947]: I1203 09:15:02.579034 4947 generic.go:334] "Generic (PLEG): container finished" podID="817dee6d-50b1-47f2-84f3-d9be2e542a20" containerID="4a84ea49d2198f18b790483395b0b0749f2ba3e3e7eec91afe91223ab382639c" exitCode=0 Dec 03 09:15:02 crc kubenswrapper[4947]: I1203 09:15:02.579142 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" event={"ID":"817dee6d-50b1-47f2-84f3-d9be2e542a20","Type":"ContainerDied","Data":"4a84ea49d2198f18b790483395b0b0749f2ba3e3e7eec91afe91223ab382639c"} Dec 03 09:15:02 crc kubenswrapper[4947]: I1203 09:15:02.580745 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" event={"ID":"0c3e5b3c-7650-4636-963d-a108c90aab4c","Type":"ContainerStarted","Data":"a1d014babc355fd68364d2bd66e8d53f4d2f7d2e2b71567f68f2257ff96930aa"} Dec 03 09:15:03 crc kubenswrapper[4947]: I1203 09:15:03.594666 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" event={"ID":"0c3e5b3c-7650-4636-963d-a108c90aab4c","Type":"ContainerStarted","Data":"15ca286de8a14acc4c3cc2ec4141d68b21492ae1cb3d2644ff39871f40d08382"} Dec 03 09:15:03 crc kubenswrapper[4947]: I1203 09:15:03.598314 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" event={"ID":"1033702d-9d8f-40b5-8435-2e4cb6850bed","Type":"ContainerStarted","Data":"f5193b8b2d98e395cd27eca03efa04c790996ccce3be376ae2464c7b92013d1a"} Dec 03 09:15:03 crc kubenswrapper[4947]: I1203 09:15:03.631932 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" podStartSLOduration=3.099182434 podStartE2EDuration="3.631910283s" podCreationTimestamp="2025-12-03 09:15:00 +0000 UTC" firstStartedPulling="2025-12-03 09:15:02.166580629 +0000 UTC m=+8763.427535065" lastFinishedPulling="2025-12-03 09:15:02.699308488 +0000 UTC m=+8763.960262914" observedRunningTime="2025-12-03 09:15:03.613316011 +0000 UTC m=+8764.874270447" watchObservedRunningTime="2025-12-03 09:15:03.631910283 +0000 UTC m=+8764.892864729" Dec 03 09:15:03 crc kubenswrapper[4947]: I1203 09:15:03.652715 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" podStartSLOduration=2.876596399 podStartE2EDuration="3.652696875s" podCreationTimestamp="2025-12-03 09:15:00 +0000 UTC" firstStartedPulling="2025-12-03 09:15:02.264550997 +0000 UTC m=+8763.525505433" lastFinishedPulling="2025-12-03 09:15:03.040651483 +0000 UTC m=+8764.301605909" observedRunningTime="2025-12-03 09:15:03.637415552 +0000 UTC m=+8764.898369988" watchObservedRunningTime="2025-12-03 09:15:03.652696875 +0000 UTC m=+8764.913651301" Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.039968 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.073511 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt82b\" (UniqueName: \"kubernetes.io/projected/817dee6d-50b1-47f2-84f3-d9be2e542a20-kube-api-access-mt82b\") pod \"817dee6d-50b1-47f2-84f3-d9be2e542a20\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.073705 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/817dee6d-50b1-47f2-84f3-d9be2e542a20-secret-volume\") pod \"817dee6d-50b1-47f2-84f3-d9be2e542a20\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.073752 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/817dee6d-50b1-47f2-84f3-d9be2e542a20-config-volume\") pod \"817dee6d-50b1-47f2-84f3-d9be2e542a20\" (UID: \"817dee6d-50b1-47f2-84f3-d9be2e542a20\") " Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.074446 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/817dee6d-50b1-47f2-84f3-d9be2e542a20-config-volume" (OuterVolumeSpecName: "config-volume") pod "817dee6d-50b1-47f2-84f3-d9be2e542a20" (UID: "817dee6d-50b1-47f2-84f3-d9be2e542a20"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.079649 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/817dee6d-50b1-47f2-84f3-d9be2e542a20-kube-api-access-mt82b" (OuterVolumeSpecName: "kube-api-access-mt82b") pod "817dee6d-50b1-47f2-84f3-d9be2e542a20" (UID: "817dee6d-50b1-47f2-84f3-d9be2e542a20"). 
InnerVolumeSpecName "kube-api-access-mt82b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.092051 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/817dee6d-50b1-47f2-84f3-d9be2e542a20-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "817dee6d-50b1-47f2-84f3-d9be2e542a20" (UID: "817dee6d-50b1-47f2-84f3-d9be2e542a20"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.177731 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt82b\" (UniqueName: \"kubernetes.io/projected/817dee6d-50b1-47f2-84f3-d9be2e542a20-kube-api-access-mt82b\") on node \"crc\" DevicePath \"\"" Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.177768 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/817dee6d-50b1-47f2-84f3-d9be2e542a20-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.177779 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/817dee6d-50b1-47f2-84f3-d9be2e542a20-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.616584 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" event={"ID":"817dee6d-50b1-47f2-84f3-d9be2e542a20","Type":"ContainerDied","Data":"b4606b7a1fcb382eb71315ed1cff5c4a2bd58a0ad354fbe3af015bd0c6ffff03"} Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.616699 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp" Dec 03 09:15:04 crc kubenswrapper[4947]: I1203 09:15:04.626711 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4606b7a1fcb382eb71315ed1cff5c4a2bd58a0ad354fbe3af015bd0c6ffff03" Dec 03 09:15:05 crc kubenswrapper[4947]: I1203 09:15:05.106956 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6"] Dec 03 09:15:05 crc kubenswrapper[4947]: I1203 09:15:05.117096 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-xdtx6"] Dec 03 09:15:06 crc kubenswrapper[4947]: I1203 09:15:06.040581 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-vmh4b"] Dec 03 09:15:06 crc kubenswrapper[4947]: I1203 09:15:06.055733 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-vmh4b"] Dec 03 09:15:07 crc kubenswrapper[4947]: I1203 09:15:07.103584 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1069a3b8-049a-4f2a-b7cf-1940a8129e4d" path="/var/lib/kubelet/pods/1069a3b8-049a-4f2a-b7cf-1940a8129e4d/volumes" Dec 03 09:15:07 crc kubenswrapper[4947]: I1203 09:15:07.105300 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8be5d69-561e-41cc-bf19-f216936f8c9b" path="/var/lib/kubelet/pods/d8be5d69-561e-41cc-bf19-f216936f8c9b/volumes" Dec 03 09:15:28 crc kubenswrapper[4947]: I1203 09:15:28.140633 4947 scope.go:117] "RemoveContainer" containerID="fbc39a2968baf632859177722f2b34ce11d613f0d3220f1bf64f257b52b4a1b3" Dec 03 09:15:28 crc kubenswrapper[4947]: I1203 09:15:28.179476 4947 scope.go:117] "RemoveContainer" containerID="e90162baaca1f0b5bad71f0f8f5a4ed750636334bd672b786e9f6185f7e70e5b" Dec 03 09:15:28 crc kubenswrapper[4947]: I1203 09:15:28.214444 4947 scope.go:117] "RemoveContainer" 
containerID="8db58c493c8602607814eb9d1db832633a067c83e677e770dda23253fa0771aa" Dec 03 09:15:28 crc kubenswrapper[4947]: I1203 09:15:28.268636 4947 scope.go:117] "RemoveContainer" containerID="945b8d0c53b24f98426655fb060f883839c9bc6cd2098b6510cbdccfd213329b" Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.062753 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell3-db-create-ntsgx"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.077030 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7csgl"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.118978 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-6cpvn"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.119014 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ec73-account-create-update-zq27k"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.121693 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell2-db-create-qgb8g"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.131441 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-m87jz"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.141192 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7csgl"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.150574 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ec73-account-create-update-zq27k"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.161816 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell3-db-create-ntsgx"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.171816 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-m87jz"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.180986 4947 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-cell2-db-create-qgb8g"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.190043 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4ec2-account-create-update-f7xgv"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.200654 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-6cpvn"] Dec 03 09:16:07 crc kubenswrapper[4947]: I1203 09:16:07.212466 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4ec2-account-create-update-f7xgv"] Dec 03 09:16:08 crc kubenswrapper[4947]: I1203 09:16:08.050179 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3f1a-account-create-update-xt86d"] Dec 03 09:16:08 crc kubenswrapper[4947]: I1203 09:16:08.064301 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell2-87f1-account-create-update-lkw4s"] Dec 03 09:16:08 crc kubenswrapper[4947]: I1203 09:16:08.074785 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell3-8ab9-account-create-update-222fx"] Dec 03 09:16:08 crc kubenswrapper[4947]: I1203 09:16:08.085719 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell2-87f1-account-create-update-lkw4s"] Dec 03 09:16:08 crc kubenswrapper[4947]: I1203 09:16:08.094879 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell3-8ab9-account-create-update-222fx"] Dec 03 09:16:08 crc kubenswrapper[4947]: I1203 09:16:08.103850 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3f1a-account-create-update-xt86d"] Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.098019 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7f2d65-0a76-45f8-add3-9e040c04d500" path="/var/lib/kubelet/pods/0a7f2d65-0a76-45f8-add3-9e040c04d500/volumes" Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.099955 4947 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab80fcd-7771-41f9-a64c-08e0cacf63c5" path="/var/lib/kubelet/pods/3ab80fcd-7771-41f9-a64c-08e0cacf63c5/volumes" Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.100655 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7076b626-52f4-442b-9bc3-96d8a747ddaa" path="/var/lib/kubelet/pods/7076b626-52f4-442b-9bc3-96d8a747ddaa/volumes" Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.101272 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82669ed0-9ff7-4ebc-a09f-8e31ef7358bc" path="/var/lib/kubelet/pods/82669ed0-9ff7-4ebc-a09f-8e31ef7358bc/volumes" Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.102339 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a614093c-3c22-4f0d-916b-9719904fc295" path="/var/lib/kubelet/pods/a614093c-3c22-4f0d-916b-9719904fc295/volumes" Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.102983 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1" path="/var/lib/kubelet/pods/c2ce9dbe-7fd3-4aac-bffd-6688f7d7ebc1/volumes" Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.103630 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f0e73d-81cd-4359-bba3-8fe0147be087" path="/var/lib/kubelet/pods/c2f0e73d-81cd-4359-bba3-8fe0147be087/volumes" Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.104828 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0" path="/var/lib/kubelet/pods/d2fcdb22-2b64-44ef-9d4b-5d6cffd4cbc0/volumes" Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.105482 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3130440-e339-4ca1-9fc6-d4feeac9ae93" path="/var/lib/kubelet/pods/e3130440-e339-4ca1-9fc6-d4feeac9ae93/volumes" Dec 03 09:16:09 crc kubenswrapper[4947]: I1203 09:16:09.106142 4947 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6" path="/var/lib/kubelet/pods/f96fcab4-55a8-4b1d-8a24-67fbc1cfafd6/volumes" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.418724 4947 scope.go:117] "RemoveContainer" containerID="607078157d2ccf487999a75a36f9d90ec1cff093751322c155d4cdd02678468c" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.462901 4947 scope.go:117] "RemoveContainer" containerID="abad442556398d4818dc4ff4b1dc018c706bea8979247a6c73d4a10393b2a850" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.545381 4947 scope.go:117] "RemoveContainer" containerID="ce242123f417fb4772b3b8fed2700290cbc93375e0050f7e7c1fa44eaaa2e808" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.578987 4947 scope.go:117] "RemoveContainer" containerID="2fc06c8352a3a245cf483849bc980b7345b7534cd4630f5870be486e35180875" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.634858 4947 scope.go:117] "RemoveContainer" containerID="aae611f0757d19d89d16a80fe897296ed226f13ed6f810730f1bd02a6ba9cddc" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.671181 4947 scope.go:117] "RemoveContainer" containerID="e264b0a91480befbce48dfa8a0e4b87bc6959d1bc6c33e2fcc3fdd0c26b59ace" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.720711 4947 scope.go:117] "RemoveContainer" containerID="5c0ffeff5caa73c84b3ca1f78f22d28f34af4891cf0f71350003481dbe43d60d" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.740645 4947 scope.go:117] "RemoveContainer" containerID="4496c4d6d39195088b229a9a1fab081db144b46ba7268d070245ca847b4f3426" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.760200 4947 scope.go:117] "RemoveContainer" containerID="b20486fd2e1e682323f829f2f1bab9f228faf7a8c65f5cbf27ae7617b13432ba" Dec 03 09:16:28 crc kubenswrapper[4947]: I1203 09:16:28.785768 4947 scope.go:117] "RemoveContainer" containerID="20b347a296676464532d3e136ab9e37b5ba555aa34bf8d2b8b8c5b90ed0805eb" Dec 03 09:16:29 crc 
kubenswrapper[4947]: I1203 09:16:29.985680 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vvks9"] Dec 03 09:16:29 crc kubenswrapper[4947]: E1203 09:16:29.986386 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="817dee6d-50b1-47f2-84f3-d9be2e542a20" containerName="collect-profiles" Dec 03 09:16:29 crc kubenswrapper[4947]: I1203 09:16:29.986398 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="817dee6d-50b1-47f2-84f3-d9be2e542a20" containerName="collect-profiles" Dec 03 09:16:29 crc kubenswrapper[4947]: I1203 09:16:29.986868 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="817dee6d-50b1-47f2-84f3-d9be2e542a20" containerName="collect-profiles" Dec 03 09:16:29 crc kubenswrapper[4947]: I1203 09:16:29.996573 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.007461 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vvks9"] Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.133428 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-catalog-content\") pod \"certified-operators-vvks9\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.133641 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-utilities\") pod \"certified-operators-vvks9\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 
09:16:30.133692 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8vr\" (UniqueName: \"kubernetes.io/projected/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-kube-api-access-zk8vr\") pod \"certified-operators-vvks9\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.236367 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-utilities\") pod \"certified-operators-vvks9\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.236776 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk8vr\" (UniqueName: \"kubernetes.io/projected/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-kube-api-access-zk8vr\") pod \"certified-operators-vvks9\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.236919 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-catalog-content\") pod \"certified-operators-vvks9\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.236919 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-utilities\") pod \"certified-operators-vvks9\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 
09:16:30.237174 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-catalog-content\") pod \"certified-operators-vvks9\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.266693 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk8vr\" (UniqueName: \"kubernetes.io/projected/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-kube-api-access-zk8vr\") pod \"certified-operators-vvks9\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.316172 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:30 crc kubenswrapper[4947]: I1203 09:16:30.839230 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vvks9"] Dec 03 09:16:31 crc kubenswrapper[4947]: I1203 09:16:31.605454 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvks9" event={"ID":"e9caf88f-fcc3-4420-8cdb-7d34c44e53da","Type":"ContainerStarted","Data":"8ac14471c4b06a488e440ebab87ad8563a080104c3e26a719e20a67117f343d8"} Dec 03 09:16:32 crc kubenswrapper[4947]: I1203 09:16:32.618007 4947 generic.go:334] "Generic (PLEG): container finished" podID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerID="1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab" exitCode=0 Dec 03 09:16:32 crc kubenswrapper[4947]: I1203 09:16:32.618083 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvks9" 
event={"ID":"e9caf88f-fcc3-4420-8cdb-7d34c44e53da","Type":"ContainerDied","Data":"1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab"} Dec 03 09:16:32 crc kubenswrapper[4947]: I1203 09:16:32.621753 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:16:33 crc kubenswrapper[4947]: I1203 09:16:33.631957 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvks9" event={"ID":"e9caf88f-fcc3-4420-8cdb-7d34c44e53da","Type":"ContainerStarted","Data":"759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17"} Dec 03 09:16:35 crc kubenswrapper[4947]: I1203 09:16:35.653688 4947 generic.go:334] "Generic (PLEG): container finished" podID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerID="759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17" exitCode=0 Dec 03 09:16:35 crc kubenswrapper[4947]: I1203 09:16:35.653764 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvks9" event={"ID":"e9caf88f-fcc3-4420-8cdb-7d34c44e53da","Type":"ContainerDied","Data":"759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17"} Dec 03 09:16:36 crc kubenswrapper[4947]: I1203 09:16:36.667720 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvks9" event={"ID":"e9caf88f-fcc3-4420-8cdb-7d34c44e53da","Type":"ContainerStarted","Data":"db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5"} Dec 03 09:16:36 crc kubenswrapper[4947]: I1203 09:16:36.690343 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vvks9" podStartSLOduration=4.215943923 podStartE2EDuration="7.690321003s" podCreationTimestamp="2025-12-03 09:16:29 +0000 UTC" firstStartedPulling="2025-12-03 09:16:32.621547339 +0000 UTC m=+8853.882501765" lastFinishedPulling="2025-12-03 09:16:36.095924419 +0000 UTC 
m=+8857.356878845" observedRunningTime="2025-12-03 09:16:36.689239045 +0000 UTC m=+8857.950193471" watchObservedRunningTime="2025-12-03 09:16:36.690321003 +0000 UTC m=+8857.951275429" Dec 03 09:16:40 crc kubenswrapper[4947]: I1203 09:16:40.317232 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:40 crc kubenswrapper[4947]: I1203 09:16:40.319314 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:40 crc kubenswrapper[4947]: I1203 09:16:40.400623 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:42 crc kubenswrapper[4947]: I1203 09:16:42.103764 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:42 crc kubenswrapper[4947]: I1203 09:16:42.176867 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vvks9"] Dec 03 09:16:43 crc kubenswrapper[4947]: I1203 09:16:43.735324 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vvks9" podUID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerName="registry-server" containerID="cri-o://db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5" gracePeriod=2 Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.573425 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.644970 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk8vr\" (UniqueName: \"kubernetes.io/projected/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-kube-api-access-zk8vr\") pod \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.645004 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-catalog-content\") pod \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.645114 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-utilities\") pod \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\" (UID: \"e9caf88f-fcc3-4420-8cdb-7d34c44e53da\") " Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.646033 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-utilities" (OuterVolumeSpecName: "utilities") pod "e9caf88f-fcc3-4420-8cdb-7d34c44e53da" (UID: "e9caf88f-fcc3-4420-8cdb-7d34c44e53da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.650478 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-kube-api-access-zk8vr" (OuterVolumeSpecName: "kube-api-access-zk8vr") pod "e9caf88f-fcc3-4420-8cdb-7d34c44e53da" (UID: "e9caf88f-fcc3-4420-8cdb-7d34c44e53da"). InnerVolumeSpecName "kube-api-access-zk8vr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.690268 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9caf88f-fcc3-4420-8cdb-7d34c44e53da" (UID: "e9caf88f-fcc3-4420-8cdb-7d34c44e53da"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.746683 4947 generic.go:334] "Generic (PLEG): container finished" podID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerID="db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5" exitCode=0 Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.746731 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvks9" event={"ID":"e9caf88f-fcc3-4420-8cdb-7d34c44e53da","Type":"ContainerDied","Data":"db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5"} Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.746768 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vvks9" event={"ID":"e9caf88f-fcc3-4420-8cdb-7d34c44e53da","Type":"ContainerDied","Data":"8ac14471c4b06a488e440ebab87ad8563a080104c3e26a719e20a67117f343d8"} Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.746789 4947 scope.go:117] "RemoveContainer" containerID="db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.746804 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vvks9" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.747861 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.748062 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk8vr\" (UniqueName: \"kubernetes.io/projected/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-kube-api-access-zk8vr\") on node \"crc\" DevicePath \"\"" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.748093 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9caf88f-fcc3-4420-8cdb-7d34c44e53da-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.766321 4947 scope.go:117] "RemoveContainer" containerID="759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.793987 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vvks9"] Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.803611 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vvks9"] Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.810225 4947 scope.go:117] "RemoveContainer" containerID="1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.945865 4947 scope.go:117] "RemoveContainer" containerID="db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5" Dec 03 09:16:44 crc kubenswrapper[4947]: E1203 09:16:44.946339 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5\": container with ID starting with db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5 not found: ID does not exist" containerID="db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.946390 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5"} err="failed to get container status \"db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5\": rpc error: code = NotFound desc = could not find container \"db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5\": container with ID starting with db7012d07a259fe3745150c03a65d645133bfd44eca11bb26c58acee76c166c5 not found: ID does not exist" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.946421 4947 scope.go:117] "RemoveContainer" containerID="759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17" Dec 03 09:16:44 crc kubenswrapper[4947]: E1203 09:16:44.946822 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17\": container with ID starting with 759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17 not found: ID does not exist" containerID="759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.946844 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17"} err="failed to get container status \"759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17\": rpc error: code = NotFound desc = could not find container \"759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17\": container with ID 
starting with 759ec7c6b19f5b64749e82b229eb76e959648c942580df91265561dd06555c17 not found: ID does not exist" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.946858 4947 scope.go:117] "RemoveContainer" containerID="1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab" Dec 03 09:16:44 crc kubenswrapper[4947]: E1203 09:16:44.947243 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab\": container with ID starting with 1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab not found: ID does not exist" containerID="1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab" Dec 03 09:16:44 crc kubenswrapper[4947]: I1203 09:16:44.947295 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab"} err="failed to get container status \"1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab\": rpc error: code = NotFound desc = could not find container \"1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab\": container with ID starting with 1e8fb7e62222b313b4402b5f3e902a40c5f1fd41dddb18ed8798965d87b525ab not found: ID does not exist" Dec 03 09:16:45 crc kubenswrapper[4947]: I1203 09:16:45.097836 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" path="/var/lib/kubelet/pods/e9caf88f-fcc3-4420-8cdb-7d34c44e53da/volumes" Dec 03 09:16:49 crc kubenswrapper[4947]: I1203 09:16:49.048373 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hx4fk"] Dec 03 09:16:49 crc kubenswrapper[4947]: I1203 09:16:49.060028 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hx4fk"] Dec 03 09:16:49 crc kubenswrapper[4947]: I1203 09:16:49.097441 
4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="963b895d-9438-47d0-b4ca-3fd58d5e5bad" path="/var/lib/kubelet/pods/963b895d-9438-47d0-b4ca-3fd58d5e5bad/volumes" Dec 03 09:17:00 crc kubenswrapper[4947]: I1203 09:17:00.086884 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:17:00 crc kubenswrapper[4947]: I1203 09:17:00.087508 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:17:08 crc kubenswrapper[4947]: I1203 09:17:08.054224 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell3-conductor-db-sync-bn2f5"] Dec 03 09:17:08 crc kubenswrapper[4947]: I1203 09:17:08.069457 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell2-conductor-db-sync-p99zm"] Dec 03 09:17:08 crc kubenswrapper[4947]: I1203 09:17:08.081189 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dp5gk"] Dec 03 09:17:08 crc kubenswrapper[4947]: I1203 09:17:08.090915 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell3-conductor-db-sync-bn2f5"] Dec 03 09:17:08 crc kubenswrapper[4947]: I1203 09:17:08.101856 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell2-conductor-db-sync-p99zm"] Dec 03 09:17:08 crc kubenswrapper[4947]: I1203 09:17:08.111233 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dp5gk"] Dec 03 09:17:09 crc kubenswrapper[4947]: 
I1203 09:17:09.033830 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5zj7g"] Dec 03 09:17:09 crc kubenswrapper[4947]: I1203 09:17:09.044771 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5zj7g"] Dec 03 09:17:09 crc kubenswrapper[4947]: I1203 09:17:09.097894 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36ef7a0e-8716-49c8-8605-d4d09d31df62" path="/var/lib/kubelet/pods/36ef7a0e-8716-49c8-8605-d4d09d31df62/volumes" Dec 03 09:17:09 crc kubenswrapper[4947]: I1203 09:17:09.099067 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40791046-82ba-4013-932c-2ef81bb1c309" path="/var/lib/kubelet/pods/40791046-82ba-4013-932c-2ef81bb1c309/volumes" Dec 03 09:17:09 crc kubenswrapper[4947]: I1203 09:17:09.100470 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1442d5f-950b-4776-933d-0c75857b7249" path="/var/lib/kubelet/pods/b1442d5f-950b-4776-933d-0c75857b7249/volumes" Dec 03 09:17:09 crc kubenswrapper[4947]: I1203 09:17:09.101553 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb51151c-0951-48f3-a2cf-4f71d43b811e" path="/var/lib/kubelet/pods/fb51151c-0951-48f3-a2cf-4f71d43b811e/volumes" Dec 03 09:17:23 crc kubenswrapper[4947]: I1203 09:17:23.057873 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell3-cell-mapping-rl2pj"] Dec 03 09:17:23 crc kubenswrapper[4947]: I1203 09:17:23.071417 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-ztsvz"] Dec 03 09:17:23 crc kubenswrapper[4947]: I1203 09:17:23.098861 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-ztsvz"] Dec 03 09:17:23 crc kubenswrapper[4947]: I1203 09:17:23.100062 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell3-cell-mapping-rl2pj"] Dec 03 09:17:25 crc kubenswrapper[4947]: I1203 
09:17:25.104122 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1783d52-312d-4302-9a4b-6a15255d3518" path="/var/lib/kubelet/pods/e1783d52-312d-4302-9a4b-6a15255d3518/volumes" Dec 03 09:17:25 crc kubenswrapper[4947]: I1203 09:17:25.107464 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3d3e206-10b7-4bf2-bec3-b57694a2318f" path="/var/lib/kubelet/pods/f3d3e206-10b7-4bf2-bec3-b57694a2318f/volumes" Dec 03 09:17:28 crc kubenswrapper[4947]: I1203 09:17:28.036374 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell2-cell-mapping-8gkkb"] Dec 03 09:17:28 crc kubenswrapper[4947]: I1203 09:17:28.046285 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell2-cell-mapping-8gkkb"] Dec 03 09:17:28 crc kubenswrapper[4947]: I1203 09:17:28.976044 4947 scope.go:117] "RemoveContainer" containerID="93191f250b936d927d19cf8148da57f63627aa990a327cd79f5e2f2a82b60a09" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.017661 4947 scope.go:117] "RemoveContainer" containerID="4f06a320344f861a246db6b3ff759d3ec9a17028d3427b6b48b0015822d477aa" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.066869 4947 scope.go:117] "RemoveContainer" containerID="8a6a1e6c39050cbe44230e19bc65d642caa329dfb36e90560082928ecb3450f2" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.102519 4947 scope.go:117] "RemoveContainer" containerID="836e00d1e9b086a63305af21373c00b5622a5a4c3bb5e48477c55deec17acece" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.103043 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a03c107-424a-45ce-94fa-f65a738d62a1" path="/var/lib/kubelet/pods/9a03c107-424a-45ce-94fa-f65a738d62a1/volumes" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.137363 4947 scope.go:117] "RemoveContainer" containerID="d584ca348560fe0a998a55eeac8b2369be4083f05a7604408c9302f24cdb55a7" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.168765 4947 
scope.go:117] "RemoveContainer" containerID="3e7969ecf221536360979a4f956b25b4a56ae7c58eb131fa2748a2210d94fa97" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.243024 4947 scope.go:117] "RemoveContainer" containerID="c6f74581fa7df7cf0f8ffa53fcc435452df09d0f336ecae60a568f0afe891716" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.296508 4947 scope.go:117] "RemoveContainer" containerID="00f7cf22b160c936da8edff9be6af758151465eb729339529defcbf75c4e16f6" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.330146 4947 scope.go:117] "RemoveContainer" containerID="7e7d5f6bff7df8bcc0fc52da398887063f5c0c0f1d931604266d9fabe65a754b" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.371804 4947 scope.go:117] "RemoveContainer" containerID="dcc01c53bf83e1af85f67d8b7218dcf11c92256ccefd97fcead07bbed46da0b2" Dec 03 09:17:29 crc kubenswrapper[4947]: I1203 09:17:29.397880 4947 scope.go:117] "RemoveContainer" containerID="2ec848735a6de3c7dcc4f778e3e57d40c7ebec99e21c08721868c0c9bfe0b00e" Dec 03 09:17:30 crc kubenswrapper[4947]: I1203 09:17:30.088989 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:17:30 crc kubenswrapper[4947]: I1203 09:17:30.091607 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:18:00 crc kubenswrapper[4947]: I1203 09:18:00.086311 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:18:00 crc kubenswrapper[4947]: I1203 09:18:00.087193 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:18:00 crc kubenswrapper[4947]: I1203 09:18:00.087283 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 09:18:00 crc kubenswrapper[4947]: I1203 09:18:00.088283 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eba6635ac29e19bd98a8b26677bfa21b381fc8b93839c29c2b536e3d5e16b6c4"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:18:00 crc kubenswrapper[4947]: I1203 09:18:00.088412 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://eba6635ac29e19bd98a8b26677bfa21b381fc8b93839c29c2b536e3d5e16b6c4" gracePeriod=600 Dec 03 09:18:00 crc kubenswrapper[4947]: I1203 09:18:00.575178 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="eba6635ac29e19bd98a8b26677bfa21b381fc8b93839c29c2b536e3d5e16b6c4" exitCode=0 Dec 03 09:18:00 crc kubenswrapper[4947]: I1203 09:18:00.575243 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"eba6635ac29e19bd98a8b26677bfa21b381fc8b93839c29c2b536e3d5e16b6c4"} Dec 03 09:18:00 crc kubenswrapper[4947]: I1203 09:18:00.575648 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"} Dec 03 09:18:00 crc kubenswrapper[4947]: I1203 09:18:00.575673 4947 scope.go:117] "RemoveContainer" containerID="43c4db0d7953f93fa173b1b7252d7b0a7e66ce7b5cd78d8d5091cca30e6858ca" Dec 03 09:18:23 crc kubenswrapper[4947]: I1203 09:18:23.810358 4947 generic.go:334] "Generic (PLEG): container finished" podID="1033702d-9d8f-40b5-8435-2e4cb6850bed" containerID="f5193b8b2d98e395cd27eca03efa04c790996ccce3be376ae2464c7b92013d1a" exitCode=0 Dec 03 09:18:23 crc kubenswrapper[4947]: I1203 09:18:23.810444 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" event={"ID":"1033702d-9d8f-40b5-8435-2e4cb6850bed","Type":"ContainerDied","Data":"f5193b8b2d98e395cd27eca03efa04c790996ccce3be376ae2464c7b92013d1a"} Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.304378 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.363916 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-tripleo-cleanup-combined-ca-bundle\") pod \"1033702d-9d8f-40b5-8435-2e4cb6850bed\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.364417 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-inventory\") pod \"1033702d-9d8f-40b5-8435-2e4cb6850bed\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.364680 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp6sp\" (UniqueName: \"kubernetes.io/projected/1033702d-9d8f-40b5-8435-2e4cb6850bed-kube-api-access-cp6sp\") pod \"1033702d-9d8f-40b5-8435-2e4cb6850bed\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.364786 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-ssh-key\") pod \"1033702d-9d8f-40b5-8435-2e4cb6850bed\" (UID: \"1033702d-9d8f-40b5-8435-2e4cb6850bed\") " Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.374767 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "1033702d-9d8f-40b5-8435-2e4cb6850bed" (UID: "1033702d-9d8f-40b5-8435-2e4cb6850bed"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.374914 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1033702d-9d8f-40b5-8435-2e4cb6850bed-kube-api-access-cp6sp" (OuterVolumeSpecName: "kube-api-access-cp6sp") pod "1033702d-9d8f-40b5-8435-2e4cb6850bed" (UID: "1033702d-9d8f-40b5-8435-2e4cb6850bed"). InnerVolumeSpecName "kube-api-access-cp6sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.402397 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-inventory" (OuterVolumeSpecName: "inventory") pod "1033702d-9d8f-40b5-8435-2e4cb6850bed" (UID: "1033702d-9d8f-40b5-8435-2e4cb6850bed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.404403 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1033702d-9d8f-40b5-8435-2e4cb6850bed" (UID: "1033702d-9d8f-40b5-8435-2e4cb6850bed"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.467486 4947 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.467543 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.467555 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp6sp\" (UniqueName: \"kubernetes.io/projected/1033702d-9d8f-40b5-8435-2e4cb6850bed-kube-api-access-cp6sp\") on node \"crc\" DevicePath \"\"" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.467567 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1033702d-9d8f-40b5-8435-2e4cb6850bed-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.839926 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" event={"ID":"1033702d-9d8f-40b5-8435-2e4cb6850bed","Type":"ContainerDied","Data":"dcf6ef8e9923e9944a7c9c6acbfaddc055ce276599c1dd4d5ec750d77998a5a1"} Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.839976 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcf6ef8e9923e9944a7c9c6acbfaddc055ce276599c1dd4d5ec750d77998a5a1" Dec 03 09:18:25 crc kubenswrapper[4947]: I1203 09:18:25.839979 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l" Dec 03 09:20:00 crc kubenswrapper[4947]: I1203 09:20:00.086913 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:20:00 crc kubenswrapper[4947]: I1203 09:20:00.087697 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:20:07 crc kubenswrapper[4947]: I1203 09:20:07.038472 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-ac5a-account-create-update-8qldg"] Dec 03 09:20:07 crc kubenswrapper[4947]: I1203 09:20:07.049847 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-dptfn"] Dec 03 09:20:07 crc kubenswrapper[4947]: I1203 09:20:07.058378 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-ac5a-account-create-update-8qldg"] Dec 03 09:20:07 crc kubenswrapper[4947]: I1203 09:20:07.070720 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-dptfn"] Dec 03 09:20:07 crc kubenswrapper[4947]: I1203 09:20:07.093512 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="326dcb2c-2dda-4388-86f0-0fe7d911bd0a" path="/var/lib/kubelet/pods/326dcb2c-2dda-4388-86f0-0fe7d911bd0a/volumes" Dec 03 09:20:07 crc kubenswrapper[4947]: I1203 09:20:07.094083 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7c96fb-1caf-42ed-adb6-5b0a72276b2a" path="/var/lib/kubelet/pods/3e7c96fb-1caf-42ed-adb6-5b0a72276b2a/volumes" 
Dec 03 09:20:22 crc kubenswrapper[4947]: I1203 09:20:22.056839 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-blw2g"]
Dec 03 09:20:22 crc kubenswrapper[4947]: I1203 09:20:22.071482 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-blw2g"]
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.057305 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7ct85"]
Dec 03 09:20:23 crc kubenswrapper[4947]: E1203 09:20:23.058281 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerName="extract-content"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.058297 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerName="extract-content"
Dec 03 09:20:23 crc kubenswrapper[4947]: E1203 09:20:23.058336 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerName="extract-utilities"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.058345 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerName="extract-utilities"
Dec 03 09:20:23 crc kubenswrapper[4947]: E1203 09:20:23.058365 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerName="registry-server"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.058372 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerName="registry-server"
Dec 03 09:20:23 crc kubenswrapper[4947]: E1203 09:20:23.058397 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1033702d-9d8f-40b5-8435-2e4cb6850bed" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell2"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.058406 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1033702d-9d8f-40b5-8435-2e4cb6850bed" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell2"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.058645 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1033702d-9d8f-40b5-8435-2e4cb6850bed" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell2"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.058684 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9caf88f-fcc3-4420-8cdb-7d34c44e53da" containerName="registry-server"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.060267 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.076983 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7ct85"]
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.107603 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="941fd7b8-ace3-42f6-ac09-8784a0473417" path="/var/lib/kubelet/pods/941fd7b8-ace3-42f6-ac09-8784a0473417/volumes"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.198234 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-utilities\") pod \"redhat-operators-7ct85\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") " pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.198306 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76l5q\" (UniqueName: \"kubernetes.io/projected/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-kube-api-access-76l5q\") pod \"redhat-operators-7ct85\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") " pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.198516 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-catalog-content\") pod \"redhat-operators-7ct85\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") " pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.299923 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-catalog-content\") pod \"redhat-operators-7ct85\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") " pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.300153 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-utilities\") pod \"redhat-operators-7ct85\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") " pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.300204 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76l5q\" (UniqueName: \"kubernetes.io/projected/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-kube-api-access-76l5q\") pod \"redhat-operators-7ct85\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") " pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.300655 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-catalog-content\") pod \"redhat-operators-7ct85\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") " pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.300933 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-utilities\") pod \"redhat-operators-7ct85\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") " pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.321521 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76l5q\" (UniqueName: \"kubernetes.io/projected/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-kube-api-access-76l5q\") pod \"redhat-operators-7ct85\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") " pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.394021 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:23 crc kubenswrapper[4947]: I1203 09:20:23.863454 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7ct85"]
Dec 03 09:20:24 crc kubenswrapper[4947]: I1203 09:20:24.247401 4947 generic.go:334] "Generic (PLEG): container finished" podID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerID="5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4" exitCode=0
Dec 03 09:20:24 crc kubenswrapper[4947]: I1203 09:20:24.247532 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ct85" event={"ID":"ac3d87d3-7fa6-4adf-83a9-007ca45f0482","Type":"ContainerDied","Data":"5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4"}
Dec 03 09:20:24 crc kubenswrapper[4947]: I1203 09:20:24.247809 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ct85" event={"ID":"ac3d87d3-7fa6-4adf-83a9-007ca45f0482","Type":"ContainerStarted","Data":"d2e25286a1105e29a4196b9d9a8760c8b58ba4f62b58654588c17bdc9cc138c8"}
Dec 03 09:20:26 crc kubenswrapper[4947]: I1203 09:20:26.272586 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ct85" event={"ID":"ac3d87d3-7fa6-4adf-83a9-007ca45f0482","Type":"ContainerStarted","Data":"39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede"}
Dec 03 09:20:29 crc kubenswrapper[4947]: I1203 09:20:29.646586 4947 scope.go:117] "RemoveContainer" containerID="ad5de7e3db1cd83608a8e1fe6280478a26f10bc2c17a0f4d8c2cda071f917695"
Dec 03 09:20:29 crc kubenswrapper[4947]: I1203 09:20:29.693705 4947 scope.go:117] "RemoveContainer" containerID="0465f2f614221438c45e658faad0b2b3a37e9a98d406c4ce9450939ebe48018b"
Dec 03 09:20:29 crc kubenswrapper[4947]: I1203 09:20:29.757134 4947 scope.go:117] "RemoveContainer" containerID="31e99cb539d0d3666a31248644abf19f284cbb6a4bd64bb009245d18bf548498"
Dec 03 09:20:30 crc kubenswrapper[4947]: I1203 09:20:30.087082 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 09:20:30 crc kubenswrapper[4947]: I1203 09:20:30.087175 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 09:20:30 crc kubenswrapper[4947]: I1203 09:20:30.314782 4947 generic.go:334] "Generic (PLEG): container finished" podID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerID="39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede" exitCode=0
Dec 03 09:20:30 crc kubenswrapper[4947]: I1203 09:20:30.314925 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ct85" event={"ID":"ac3d87d3-7fa6-4adf-83a9-007ca45f0482","Type":"ContainerDied","Data":"39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede"}
Dec 03 09:20:31 crc kubenswrapper[4947]: I1203 09:20:31.329531 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ct85" event={"ID":"ac3d87d3-7fa6-4adf-83a9-007ca45f0482","Type":"ContainerStarted","Data":"91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13"}
Dec 03 09:20:31 crc kubenswrapper[4947]: I1203 09:20:31.354277 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7ct85" podStartSLOduration=1.840752247 podStartE2EDuration="8.354255994s" podCreationTimestamp="2025-12-03 09:20:23 +0000 UTC" firstStartedPulling="2025-12-03 09:20:24.249161689 +0000 UTC m=+9085.510116115" lastFinishedPulling="2025-12-03 09:20:30.762665436 +0000 UTC m=+9092.023619862" observedRunningTime="2025-12-03 09:20:31.345767405 +0000 UTC m=+9092.606721821" watchObservedRunningTime="2025-12-03 09:20:31.354255994 +0000 UTC m=+9092.615210420"
Dec 03 09:20:33 crc kubenswrapper[4947]: I1203 09:20:33.394408 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:33 crc kubenswrapper[4947]: I1203 09:20:33.394752 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:34 crc kubenswrapper[4947]: I1203 09:20:34.437620 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7ct85" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerName="registry-server" probeResult="failure" output=<
Dec 03 09:20:34 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s
Dec 03 09:20:34 crc kubenswrapper[4947]: >
Dec 03 09:20:43 crc kubenswrapper[4947]: I1203 09:20:43.440578 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:43 crc kubenswrapper[4947]: I1203 09:20:43.504334 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:43 crc kubenswrapper[4947]: I1203 09:20:43.692227 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7ct85"]
Dec 03 09:20:45 crc kubenswrapper[4947]: I1203 09:20:45.464063 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7ct85" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerName="registry-server" containerID="cri-o://91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13" gracePeriod=2
Dec 03 09:20:45 crc kubenswrapper[4947]: I1203 09:20:45.981237 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.095980 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-utilities\") pod \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") "
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.096057 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-catalog-content\") pod \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") "
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.096093 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76l5q\" (UniqueName: \"kubernetes.io/projected/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-kube-api-access-76l5q\") pod \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\" (UID: \"ac3d87d3-7fa6-4adf-83a9-007ca45f0482\") "
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.097358 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-utilities" (OuterVolumeSpecName: "utilities") pod "ac3d87d3-7fa6-4adf-83a9-007ca45f0482" (UID: "ac3d87d3-7fa6-4adf-83a9-007ca45f0482"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.103239 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-kube-api-access-76l5q" (OuterVolumeSpecName: "kube-api-access-76l5q") pod "ac3d87d3-7fa6-4adf-83a9-007ca45f0482" (UID: "ac3d87d3-7fa6-4adf-83a9-007ca45f0482"). InnerVolumeSpecName "kube-api-access-76l5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.198628 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.198906 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76l5q\" (UniqueName: \"kubernetes.io/projected/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-kube-api-access-76l5q\") on node \"crc\" DevicePath \"\""
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.211229 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac3d87d3-7fa6-4adf-83a9-007ca45f0482" (UID: "ac3d87d3-7fa6-4adf-83a9-007ca45f0482"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.300591 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3d87d3-7fa6-4adf-83a9-007ca45f0482-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.474820 4947 generic.go:334] "Generic (PLEG): container finished" podID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerID="91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13" exitCode=0
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.474876 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ct85" event={"ID":"ac3d87d3-7fa6-4adf-83a9-007ca45f0482","Type":"ContainerDied","Data":"91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13"}
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.474901 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7ct85" event={"ID":"ac3d87d3-7fa6-4adf-83a9-007ca45f0482","Type":"ContainerDied","Data":"d2e25286a1105e29a4196b9d9a8760c8b58ba4f62b58654588c17bdc9cc138c8"}
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.474917 4947 scope.go:117] "RemoveContainer" containerID="91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.475954 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7ct85"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.496060 4947 scope.go:117] "RemoveContainer" containerID="39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.509879 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7ct85"]
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.518058 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7ct85"]
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.538444 4947 scope.go:117] "RemoveContainer" containerID="5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.564117 4947 scope.go:117] "RemoveContainer" containerID="91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13"
Dec 03 09:20:46 crc kubenswrapper[4947]: E1203 09:20:46.565434 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13\": container with ID starting with 91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13 not found: ID does not exist" containerID="91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.565637 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13"} err="failed to get container status \"91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13\": rpc error: code = NotFound desc = could not find container \"91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13\": container with ID starting with 91ff6134c2199ebbc98ec077231dd6e9d360e29635c03234ad833861f03c2d13 not found: ID does not exist"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.565790 4947 scope.go:117] "RemoveContainer" containerID="39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede"
Dec 03 09:20:46 crc kubenswrapper[4947]: E1203 09:20:46.566238 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede\": container with ID starting with 39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede not found: ID does not exist" containerID="39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.566281 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede"} err="failed to get container status \"39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede\": rpc error: code = NotFound desc = could not find container \"39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede\": container with ID starting with 39d07afdf3b6c435c1cc8b3f8aa3a3eeba40faa2262970a0aec4bc897dcb5ede not found: ID does not exist"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.566307 4947 scope.go:117] "RemoveContainer" containerID="5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4"
Dec 03 09:20:46 crc kubenswrapper[4947]: E1203 09:20:46.566691 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4\": container with ID starting with 5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4 not found: ID does not exist" containerID="5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4"
Dec 03 09:20:46 crc kubenswrapper[4947]: I1203 09:20:46.566710 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4"} err="failed to get container status \"5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4\": rpc error: code = NotFound desc = could not find container \"5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4\": container with ID starting with 5daee3331eaacd09299e6106c06edef8bd8aebad05f27fc67977986c615461f4 not found: ID does not exist"
Dec 03 09:20:47 crc kubenswrapper[4947]: I1203 09:20:47.105395 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" path="/var/lib/kubelet/pods/ac3d87d3-7fa6-4adf-83a9-007ca45f0482/volumes"
Dec 03 09:21:00 crc kubenswrapper[4947]: I1203 09:21:00.086887 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 09:21:00 crc kubenswrapper[4947]: I1203 09:21:00.087519 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 09:21:00 crc kubenswrapper[4947]: I1203 09:21:00.087572 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj"
Dec 03 09:21:00 crc kubenswrapper[4947]: I1203 09:21:00.088366 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 09:21:00 crc kubenswrapper[4947]: I1203 09:21:00.088439 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" gracePeriod=600
Dec 03 09:21:00 crc kubenswrapper[4947]: E1203 09:21:00.381598 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:21:00 crc kubenswrapper[4947]: I1203 09:21:00.622413 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" exitCode=0
Dec 03 09:21:00 crc kubenswrapper[4947]: I1203 09:21:00.622463 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"}
Dec 03 09:21:00 crc kubenswrapper[4947]: I1203 09:21:00.622518 4947 scope.go:117] "RemoveContainer" containerID="eba6635ac29e19bd98a8b26677bfa21b381fc8b93839c29c2b536e3d5e16b6c4"
Dec 03 09:21:00 crc kubenswrapper[4947]: I1203 09:21:00.623272 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"
Dec 03 09:21:00 crc kubenswrapper[4947]: E1203 09:21:00.623570 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:21:14 crc kubenswrapper[4947]: I1203 09:21:14.082344 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"
Dec 03 09:21:14 crc kubenswrapper[4947]: E1203 09:21:14.083062 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:21:29 crc kubenswrapper[4947]: I1203 09:21:29.093179 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"
Dec 03 09:21:29 crc kubenswrapper[4947]: E1203 09:21:29.093992 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:21:42 crc kubenswrapper[4947]: I1203 09:21:42.083629 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"
Dec 03 09:21:42 crc kubenswrapper[4947]: E1203 09:21:42.084453 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:21:56 crc kubenswrapper[4947]: I1203 09:21:56.084767 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"
Dec 03 09:21:56 crc kubenswrapper[4947]: E1203 09:21:56.085825 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:22:09 crc kubenswrapper[4947]: I1203 09:22:09.091229 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"
Dec 03 09:22:09 crc kubenswrapper[4947]: E1203 09:22:09.092039 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:22:20 crc kubenswrapper[4947]: I1203 09:22:20.084292 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"
Dec 03 09:22:20 crc kubenswrapper[4947]: E1203 09:22:20.085582 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:22:34 crc kubenswrapper[4947]: I1203 09:22:34.525433 4947 generic.go:334] "Generic (PLEG): container finished" podID="0c3e5b3c-7650-4636-963d-a108c90aab4c" containerID="15ca286de8a14acc4c3cc2ec4141d68b21492ae1cb3d2644ff39871f40d08382" exitCode=0
Dec 03 09:22:34 crc kubenswrapper[4947]: I1203 09:22:34.526012 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" event={"ID":"0c3e5b3c-7650-4636-963d-a108c90aab4c","Type":"ContainerDied","Data":"15ca286de8a14acc4c3cc2ec4141d68b21492ae1cb3d2644ff39871f40d08382"}
Dec 03 09:22:35 crc kubenswrapper[4947]: I1203 09:22:35.083694 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548"
Dec 03 09:22:35 crc kubenswrapper[4947]: E1203 09:22:35.084260 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.238892 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr"
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.410027 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-ssh-key\") pod \"0c3e5b3c-7650-4636-963d-a108c90aab4c\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") "
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.410205 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-tripleo-cleanup-combined-ca-bundle\") pod \"0c3e5b3c-7650-4636-963d-a108c90aab4c\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") "
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.410248 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqdlc\" (UniqueName: \"kubernetes.io/projected/0c3e5b3c-7650-4636-963d-a108c90aab4c-kube-api-access-bqdlc\") pod \"0c3e5b3c-7650-4636-963d-a108c90aab4c\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") "
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.410282 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-inventory\") pod \"0c3e5b3c-7650-4636-963d-a108c90aab4c\" (UID: \"0c3e5b3c-7650-4636-963d-a108c90aab4c\") "
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.416487 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "0c3e5b3c-7650-4636-963d-a108c90aab4c" (UID: "0c3e5b3c-7650-4636-963d-a108c90aab4c"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.418874 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c3e5b3c-7650-4636-963d-a108c90aab4c-kube-api-access-bqdlc" (OuterVolumeSpecName: "kube-api-access-bqdlc") pod "0c3e5b3c-7650-4636-963d-a108c90aab4c" (UID: "0c3e5b3c-7650-4636-963d-a108c90aab4c"). InnerVolumeSpecName "kube-api-access-bqdlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.440444 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-inventory" (OuterVolumeSpecName: "inventory") pod "0c3e5b3c-7650-4636-963d-a108c90aab4c" (UID: "0c3e5b3c-7650-4636-963d-a108c90aab4c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.446016 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0c3e5b3c-7650-4636-963d-a108c90aab4c" (UID: "0c3e5b3c-7650-4636-963d-a108c90aab4c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.514230 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.514281 4947 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.514302 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqdlc\" (UniqueName: \"kubernetes.io/projected/0c3e5b3c-7650-4636-963d-a108c90aab4c-kube-api-access-bqdlc\") on node \"crc\" DevicePath \"\""
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.514318 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c3e5b3c-7650-4636-963d-a108c90aab4c-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.548649 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr" event={"ID":"0c3e5b3c-7650-4636-963d-a108c90aab4c","Type":"ContainerDied","Data":"a1d014babc355fd68364d2bd66e8d53f4d2f7d2e2b71567f68f2257ff96930aa"}
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.548698 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1d014babc355fd68364d2bd66e8d53f4d2f7d2e2b71567f68f2257ff96930aa"
Dec 03 09:22:36 crc kubenswrapper[4947]: I1203 09:22:36.548718 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.495780 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell2-d84ng"]
Dec 03 09:22:45 crc kubenswrapper[4947]: E1203 09:22:45.496916 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerName="extract-content"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.496935 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerName="extract-content"
Dec 03 09:22:45 crc kubenswrapper[4947]: E1203 09:22:45.496959 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerName="extract-utilities"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.496970 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerName="extract-utilities"
Dec 03 09:22:45 crc kubenswrapper[4947]: E1203 09:22:45.496987 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c3e5b3c-7650-4636-963d-a108c90aab4c" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.496996 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c3e5b3c-7650-4636-963d-a108c90aab4c" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Dec 03 09:22:45 crc kubenswrapper[4947]: E1203 09:22:45.497019 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerName="registry-server"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.497028 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerName="registry-server"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.497269 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3d87d3-7fa6-4adf-83a9-007ca45f0482" containerName="registry-server"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.497296 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c3e5b3c-7650-4636-963d-a108c90aab4c" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.498270 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell2-d84ng"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.501033 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.501485 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.501671 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.501746 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.517563 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-hv6px"]
Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.521626 4947 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.526161 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.526952 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.535526 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell2-d84ng"] Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.549439 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-hv6px"] Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.609985 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdpk5\" (UniqueName: \"kubernetes.io/projected/490f2f91-8ec4-42da-bee9-65ebd38a7492-kube-api-access-wdpk5\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.610328 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.610486 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxt9z\" (UniqueName: \"kubernetes.io/projected/fe124b64-001a-435a-8096-764e2a71097b-kube-api-access-hxt9z\") pod 
\"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.610701 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-ssh-key\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.610906 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-inventory\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.611099 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.611263 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-inventory\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.611458 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.713208 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.713361 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdpk5\" (UniqueName: \"kubernetes.io/projected/490f2f91-8ec4-42da-bee9-65ebd38a7492-kube-api-access-wdpk5\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.713413 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.713455 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxt9z\" (UniqueName: \"kubernetes.io/projected/fe124b64-001a-435a-8096-764e2a71097b-kube-api-access-hxt9z\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " 
pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.713519 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-ssh-key\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.713569 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-inventory\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.713619 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.713637 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-inventory\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.719254 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-ssh-key\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " 
pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.719262 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-inventory\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.721326 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.722342 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-inventory\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.726342 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.727932 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: 
\"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.735169 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdpk5\" (UniqueName: \"kubernetes.io/projected/490f2f91-8ec4-42da-bee9-65ebd38a7492-kube-api-access-wdpk5\") pod \"bootstrap-openstack-openstack-cell1-hv6px\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.737235 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxt9z\" (UniqueName: \"kubernetes.io/projected/fe124b64-001a-435a-8096-764e2a71097b-kube-api-access-hxt9z\") pod \"bootstrap-openstack-openstack-cell2-d84ng\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.842246 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:22:45 crc kubenswrapper[4947]: I1203 09:22:45.860891 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:22:46 crc kubenswrapper[4947]: I1203 09:22:46.082968 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:22:46 crc kubenswrapper[4947]: E1203 09:22:46.083453 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:22:46 crc kubenswrapper[4947]: I1203 09:22:46.394004 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-hv6px"] Dec 03 09:22:46 crc kubenswrapper[4947]: I1203 09:22:46.403261 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:22:46 crc kubenswrapper[4947]: I1203 09:22:46.492794 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell2-d84ng"] Dec 03 09:22:46 crc kubenswrapper[4947]: W1203 09:22:46.498564 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe124b64_001a_435a_8096_764e2a71097b.slice/crio-6559ab243be3128e23e046fcc846134779972298e97a5f89ae59cfff12b5c78a WatchSource:0}: Error finding container 6559ab243be3128e23e046fcc846134779972298e97a5f89ae59cfff12b5c78a: Status 404 returned error can't find the container with id 6559ab243be3128e23e046fcc846134779972298e97a5f89ae59cfff12b5c78a Dec 03 09:22:46 crc kubenswrapper[4947]: I1203 09:22:46.650933 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" 
event={"ID":"490f2f91-8ec4-42da-bee9-65ebd38a7492","Type":"ContainerStarted","Data":"5f62a3073ee56191b762b7ea3030c8227db57fa16df959c07a0be29014f0ab94"} Dec 03 09:22:46 crc kubenswrapper[4947]: I1203 09:22:46.652680 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" event={"ID":"fe124b64-001a-435a-8096-764e2a71097b","Type":"ContainerStarted","Data":"6559ab243be3128e23e046fcc846134779972298e97a5f89ae59cfff12b5c78a"} Dec 03 09:22:47 crc kubenswrapper[4947]: I1203 09:22:47.663310 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" event={"ID":"490f2f91-8ec4-42da-bee9-65ebd38a7492","Type":"ContainerStarted","Data":"ff69e80018e457079e8a43027c03bdfae4c26ba482b27eff6078d048d8908cda"} Dec 03 09:22:47 crc kubenswrapper[4947]: I1203 09:22:47.665230 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" event={"ID":"fe124b64-001a-435a-8096-764e2a71097b","Type":"ContainerStarted","Data":"7dba7649327b43eca70e692682d6a5171204b38e1db2eed0eb3ff68635bd3cc4"} Dec 03 09:22:47 crc kubenswrapper[4947]: I1203 09:22:47.683297 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" podStartSLOduration=2.129287078 podStartE2EDuration="2.683273969s" podCreationTimestamp="2025-12-03 09:22:45 +0000 UTC" firstStartedPulling="2025-12-03 09:22:46.403090711 +0000 UTC m=+9227.664045127" lastFinishedPulling="2025-12-03 09:22:46.957077572 +0000 UTC m=+9228.218032018" observedRunningTime="2025-12-03 09:22:47.680601617 +0000 UTC m=+9228.941556043" watchObservedRunningTime="2025-12-03 09:22:47.683273969 +0000 UTC m=+9228.944228395" Dec 03 09:22:47 crc kubenswrapper[4947]: I1203 09:22:47.704239 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" 
podStartSLOduration=2.208655411 podStartE2EDuration="2.704216275s" podCreationTimestamp="2025-12-03 09:22:45 +0000 UTC" firstStartedPulling="2025-12-03 09:22:46.501094469 +0000 UTC m=+9227.762048895" lastFinishedPulling="2025-12-03 09:22:46.996655313 +0000 UTC m=+9228.257609759" observedRunningTime="2025-12-03 09:22:47.698385648 +0000 UTC m=+9228.959340104" watchObservedRunningTime="2025-12-03 09:22:47.704216275 +0000 UTC m=+9228.965170691" Dec 03 09:22:55 crc kubenswrapper[4947]: I1203 09:22:55.044843 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-smxsk"] Dec 03 09:22:55 crc kubenswrapper[4947]: I1203 09:22:55.056417 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-475b-account-create-update-rkmdb"] Dec 03 09:22:55 crc kubenswrapper[4947]: I1203 09:22:55.068257 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-smxsk"] Dec 03 09:22:55 crc kubenswrapper[4947]: I1203 09:22:55.077532 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-475b-account-create-update-rkmdb"] Dec 03 09:22:55 crc kubenswrapper[4947]: I1203 09:22:55.096261 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc249da-1377-4963-9556-38b7a1effa79" path="/var/lib/kubelet/pods/1fc249da-1377-4963-9556-38b7a1effa79/volumes" Dec 03 09:22:55 crc kubenswrapper[4947]: I1203 09:22:55.097031 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c" path="/var/lib/kubelet/pods/3af6039a-d2cf-4aaa-87d6-8ba7a4e7633c/volumes" Dec 03 09:23:00 crc kubenswrapper[4947]: I1203 09:23:00.083839 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:23:00 crc kubenswrapper[4947]: E1203 09:23:00.084635 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:23:06 crc kubenswrapper[4947]: I1203 09:23:06.032931 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-4pqqh"] Dec 03 09:23:06 crc kubenswrapper[4947]: I1203 09:23:06.046923 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-4pqqh"] Dec 03 09:23:07 crc kubenswrapper[4947]: I1203 09:23:07.098423 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d3388b-b492-404e-bfbd-d9c84de28766" path="/var/lib/kubelet/pods/30d3388b-b492-404e-bfbd-d9c84de28766/volumes" Dec 03 09:23:12 crc kubenswrapper[4947]: I1203 09:23:12.084507 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:23:12 crc kubenswrapper[4947]: E1203 09:23:12.085245 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:23:26 crc kubenswrapper[4947]: I1203 09:23:26.083928 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:23:26 crc kubenswrapper[4947]: E1203 09:23:26.084694 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:23:30 crc kubenswrapper[4947]: I1203 09:23:30.016716 4947 scope.go:117] "RemoveContainer" containerID="9eabfb70ace61efc0edbf49be73a86ddcaae44937fc9895d7dc5a4a85521dec7" Dec 03 09:23:30 crc kubenswrapper[4947]: I1203 09:23:30.043613 4947 scope.go:117] "RemoveContainer" containerID="3ea4ae7a65fdba513c2c2b8e9656c853967c8f2d9cbeb0fe4168c7e93565ea01" Dec 03 09:23:30 crc kubenswrapper[4947]: I1203 09:23:30.099867 4947 scope.go:117] "RemoveContainer" containerID="8bb5b8db77cce67c639ee2ba505a16125c593cb60b66a7877e3404155ea3381b" Dec 03 09:23:39 crc kubenswrapper[4947]: I1203 09:23:39.092756 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:23:39 crc kubenswrapper[4947]: E1203 09:23:39.093463 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:23:54 crc kubenswrapper[4947]: I1203 09:23:54.082597 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:23:54 crc kubenswrapper[4947]: E1203 09:23:54.083571 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.115702 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rzm7s"] Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.119059 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.132628 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzm7s"] Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.179286 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqn8\" (UniqueName: \"kubernetes.io/projected/a6830677-adef-4ee7-8435-154242866f3c-kube-api-access-wnqn8\") pod \"community-operators-rzm7s\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.179788 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-catalog-content\") pod \"community-operators-rzm7s\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.179953 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-utilities\") pod \"community-operators-rzm7s\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.281943 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wnqn8\" (UniqueName: \"kubernetes.io/projected/a6830677-adef-4ee7-8435-154242866f3c-kube-api-access-wnqn8\") pod \"community-operators-rzm7s\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.282128 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-catalog-content\") pod \"community-operators-rzm7s\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.282188 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-utilities\") pod \"community-operators-rzm7s\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.282827 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-utilities\") pod \"community-operators-rzm7s\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.282959 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-catalog-content\") pod \"community-operators-rzm7s\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.307124 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wnqn8\" (UniqueName: \"kubernetes.io/projected/a6830677-adef-4ee7-8435-154242866f3c-kube-api-access-wnqn8\") pod \"community-operators-rzm7s\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:04 crc kubenswrapper[4947]: I1203 09:24:04.450538 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:05 crc kubenswrapper[4947]: I1203 09:24:05.083398 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:24:05 crc kubenswrapper[4947]: E1203 09:24:05.084065 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:24:05 crc kubenswrapper[4947]: I1203 09:24:05.196197 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rzm7s"] Dec 03 09:24:05 crc kubenswrapper[4947]: I1203 09:24:05.423612 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzm7s" event={"ID":"a6830677-adef-4ee7-8435-154242866f3c","Type":"ContainerStarted","Data":"83914a334510f208c442dc0ffa5bc63743af8f3864494ab539572db322346a95"} Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.433182 4947 generic.go:334] "Generic (PLEG): container finished" podID="a6830677-adef-4ee7-8435-154242866f3c" containerID="6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1" exitCode=0 Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.433288 4947 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-rzm7s" event={"ID":"a6830677-adef-4ee7-8435-154242866f3c","Type":"ContainerDied","Data":"6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1"} Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.511477 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-64bwj"] Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.514914 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.532898 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bwj"] Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.533053 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btjx4\" (UniqueName: \"kubernetes.io/projected/fd5359bf-6e44-4e12-a201-4a134f673957-kube-api-access-btjx4\") pod \"redhat-marketplace-64bwj\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.533127 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-catalog-content\") pod \"redhat-marketplace-64bwj\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.533168 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-utilities\") pod \"redhat-marketplace-64bwj\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 
crc kubenswrapper[4947]: I1203 09:24:06.635840 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-catalog-content\") pod \"redhat-marketplace-64bwj\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.635882 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-utilities\") pod \"redhat-marketplace-64bwj\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.636071 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btjx4\" (UniqueName: \"kubernetes.io/projected/fd5359bf-6e44-4e12-a201-4a134f673957-kube-api-access-btjx4\") pod \"redhat-marketplace-64bwj\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.636444 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-catalog-content\") pod \"redhat-marketplace-64bwj\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.636461 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-utilities\") pod \"redhat-marketplace-64bwj\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.656545 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btjx4\" (UniqueName: \"kubernetes.io/projected/fd5359bf-6e44-4e12-a201-4a134f673957-kube-api-access-btjx4\") pod \"redhat-marketplace-64bwj\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:06 crc kubenswrapper[4947]: I1203 09:24:06.845820 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:07 crc kubenswrapper[4947]: W1203 09:24:07.373675 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd5359bf_6e44_4e12_a201_4a134f673957.slice/crio-8a926b38de35eabdb6dbbc129a9ad8aee5872827c33da707bc887c131b9635d7 WatchSource:0}: Error finding container 8a926b38de35eabdb6dbbc129a9ad8aee5872827c33da707bc887c131b9635d7: Status 404 returned error can't find the container with id 8a926b38de35eabdb6dbbc129a9ad8aee5872827c33da707bc887c131b9635d7 Dec 03 09:24:07 crc kubenswrapper[4947]: I1203 09:24:07.377910 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bwj"] Dec 03 09:24:07 crc kubenswrapper[4947]: I1203 09:24:07.445132 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bwj" event={"ID":"fd5359bf-6e44-4e12-a201-4a134f673957","Type":"ContainerStarted","Data":"8a926b38de35eabdb6dbbc129a9ad8aee5872827c33da707bc887c131b9635d7"} Dec 03 09:24:08 crc kubenswrapper[4947]: I1203 09:24:08.459443 4947 generic.go:334] "Generic (PLEG): container finished" podID="fd5359bf-6e44-4e12-a201-4a134f673957" containerID="3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609" exitCode=0 Dec 03 09:24:08 crc kubenswrapper[4947]: I1203 09:24:08.459687 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bwj" 
event={"ID":"fd5359bf-6e44-4e12-a201-4a134f673957","Type":"ContainerDied","Data":"3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609"} Dec 03 09:24:08 crc kubenswrapper[4947]: I1203 09:24:08.463252 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzm7s" event={"ID":"a6830677-adef-4ee7-8435-154242866f3c","Type":"ContainerStarted","Data":"b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d"} Dec 03 09:24:09 crc kubenswrapper[4947]: I1203 09:24:09.493948 4947 generic.go:334] "Generic (PLEG): container finished" podID="a6830677-adef-4ee7-8435-154242866f3c" containerID="b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d" exitCode=0 Dec 03 09:24:09 crc kubenswrapper[4947]: I1203 09:24:09.494303 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzm7s" event={"ID":"a6830677-adef-4ee7-8435-154242866f3c","Type":"ContainerDied","Data":"b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d"} Dec 03 09:24:10 crc kubenswrapper[4947]: I1203 09:24:10.515602 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzm7s" event={"ID":"a6830677-adef-4ee7-8435-154242866f3c","Type":"ContainerStarted","Data":"6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c"} Dec 03 09:24:10 crc kubenswrapper[4947]: I1203 09:24:10.518885 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bwj" event={"ID":"fd5359bf-6e44-4e12-a201-4a134f673957","Type":"ContainerStarted","Data":"cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c"} Dec 03 09:24:10 crc kubenswrapper[4947]: I1203 09:24:10.548192 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rzm7s" podStartSLOduration=3.076887055 podStartE2EDuration="6.548170452s" podCreationTimestamp="2025-12-03 09:24:04 
+0000 UTC" firstStartedPulling="2025-12-03 09:24:06.435667215 +0000 UTC m=+9307.696621641" lastFinishedPulling="2025-12-03 09:24:09.906950612 +0000 UTC m=+9311.167905038" observedRunningTime="2025-12-03 09:24:10.535300684 +0000 UTC m=+9311.796255110" watchObservedRunningTime="2025-12-03 09:24:10.548170452 +0000 UTC m=+9311.809124868" Dec 03 09:24:11 crc kubenswrapper[4947]: I1203 09:24:11.533474 4947 generic.go:334] "Generic (PLEG): container finished" podID="fd5359bf-6e44-4e12-a201-4a134f673957" containerID="cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c" exitCode=0 Dec 03 09:24:11 crc kubenswrapper[4947]: I1203 09:24:11.533571 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bwj" event={"ID":"fd5359bf-6e44-4e12-a201-4a134f673957","Type":"ContainerDied","Data":"cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c"} Dec 03 09:24:14 crc kubenswrapper[4947]: I1203 09:24:14.451801 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:14 crc kubenswrapper[4947]: I1203 09:24:14.453603 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:14 crc kubenswrapper[4947]: I1203 09:24:14.503040 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:14 crc kubenswrapper[4947]: I1203 09:24:14.561112 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bwj" event={"ID":"fd5359bf-6e44-4e12-a201-4a134f673957","Type":"ContainerStarted","Data":"472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5"} Dec 03 09:24:14 crc kubenswrapper[4947]: I1203 09:24:14.586252 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-64bwj" 
podStartSLOduration=3.33674696 podStartE2EDuration="8.586232776s" podCreationTimestamp="2025-12-03 09:24:06 +0000 UTC" firstStartedPulling="2025-12-03 09:24:08.462452021 +0000 UTC m=+9309.723406457" lastFinishedPulling="2025-12-03 09:24:13.711937847 +0000 UTC m=+9314.972892273" observedRunningTime="2025-12-03 09:24:14.582000491 +0000 UTC m=+9315.842954927" watchObservedRunningTime="2025-12-03 09:24:14.586232776 +0000 UTC m=+9315.847187212" Dec 03 09:24:15 crc kubenswrapper[4947]: I1203 09:24:15.878795 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:16 crc kubenswrapper[4947]: I1203 09:24:16.847026 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:16 crc kubenswrapper[4947]: I1203 09:24:16.847387 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:16 crc kubenswrapper[4947]: I1203 09:24:16.894966 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzm7s"] Dec 03 09:24:16 crc kubenswrapper[4947]: I1203 09:24:16.905015 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:17 crc kubenswrapper[4947]: I1203 09:24:17.588553 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rzm7s" podUID="a6830677-adef-4ee7-8435-154242866f3c" containerName="registry-server" containerID="cri-o://6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c" gracePeriod=2 Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.191481 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.373965 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnqn8\" (UniqueName: \"kubernetes.io/projected/a6830677-adef-4ee7-8435-154242866f3c-kube-api-access-wnqn8\") pod \"a6830677-adef-4ee7-8435-154242866f3c\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.374009 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-utilities\") pod \"a6830677-adef-4ee7-8435-154242866f3c\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.374195 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-catalog-content\") pod \"a6830677-adef-4ee7-8435-154242866f3c\" (UID: \"a6830677-adef-4ee7-8435-154242866f3c\") " Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.375035 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-utilities" (OuterVolumeSpecName: "utilities") pod "a6830677-adef-4ee7-8435-154242866f3c" (UID: "a6830677-adef-4ee7-8435-154242866f3c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.379012 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6830677-adef-4ee7-8435-154242866f3c-kube-api-access-wnqn8" (OuterVolumeSpecName: "kube-api-access-wnqn8") pod "a6830677-adef-4ee7-8435-154242866f3c" (UID: "a6830677-adef-4ee7-8435-154242866f3c"). InnerVolumeSpecName "kube-api-access-wnqn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.434866 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6830677-adef-4ee7-8435-154242866f3c" (UID: "a6830677-adef-4ee7-8435-154242866f3c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.477369 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.477409 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnqn8\" (UniqueName: \"kubernetes.io/projected/a6830677-adef-4ee7-8435-154242866f3c-kube-api-access-wnqn8\") on node \"crc\" DevicePath \"\"" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.477426 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6830677-adef-4ee7-8435-154242866f3c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.602540 4947 generic.go:334] "Generic (PLEG): container finished" podID="a6830677-adef-4ee7-8435-154242866f3c" containerID="6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c" exitCode=0 Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.602618 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzm7s" event={"ID":"a6830677-adef-4ee7-8435-154242866f3c","Type":"ContainerDied","Data":"6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c"} Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.602638 4947 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-rzm7s" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.602695 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rzm7s" event={"ID":"a6830677-adef-4ee7-8435-154242866f3c","Type":"ContainerDied","Data":"83914a334510f208c442dc0ffa5bc63743af8f3864494ab539572db322346a95"} Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.602737 4947 scope.go:117] "RemoveContainer" containerID="6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.628736 4947 scope.go:117] "RemoveContainer" containerID="b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.650421 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rzm7s"] Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.663557 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rzm7s"] Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.665852 4947 scope.go:117] "RemoveContainer" containerID="6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.719308 4947 scope.go:117] "RemoveContainer" containerID="6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c" Dec 03 09:24:18 crc kubenswrapper[4947]: E1203 09:24:18.719821 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c\": container with ID starting with 6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c not found: ID does not exist" containerID="6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.719852 
4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c"} err="failed to get container status \"6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c\": rpc error: code = NotFound desc = could not find container \"6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c\": container with ID starting with 6a8099e56c395791a10cccafbe1e5be00034f06bd28ceb7ddbde8d92ee76482c not found: ID does not exist" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.719873 4947 scope.go:117] "RemoveContainer" containerID="b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d" Dec 03 09:24:18 crc kubenswrapper[4947]: E1203 09:24:18.720042 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d\": container with ID starting with b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d not found: ID does not exist" containerID="b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.720064 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d"} err="failed to get container status \"b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d\": rpc error: code = NotFound desc = could not find container \"b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d\": container with ID starting with b5be7946274c389e0add20bef100e8c8f2ed138b94d62806ac0d206a202b519d not found: ID does not exist" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.720079 4947 scope.go:117] "RemoveContainer" containerID="6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1" Dec 03 09:24:18 crc kubenswrapper[4947]: E1203 
09:24:18.720228 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1\": container with ID starting with 6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1 not found: ID does not exist" containerID="6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1" Dec 03 09:24:18 crc kubenswrapper[4947]: I1203 09:24:18.720247 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1"} err="failed to get container status \"6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1\": rpc error: code = NotFound desc = could not find container \"6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1\": container with ID starting with 6ae778afbe62edc58055993e03f3b85ef35c0877e0cd0624acb8618eca727bf1 not found: ID does not exist" Dec 03 09:24:19 crc kubenswrapper[4947]: I1203 09:24:19.090063 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:24:19 crc kubenswrapper[4947]: E1203 09:24:19.090373 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:24:19 crc kubenswrapper[4947]: I1203 09:24:19.094606 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6830677-adef-4ee7-8435-154242866f3c" path="/var/lib/kubelet/pods/a6830677-adef-4ee7-8435-154242866f3c/volumes" Dec 03 09:24:26 crc kubenswrapper[4947]: I1203 09:24:26.903203 
4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:26 crc kubenswrapper[4947]: I1203 09:24:26.973090 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bwj"] Dec 03 09:24:27 crc kubenswrapper[4947]: I1203 09:24:27.688602 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-64bwj" podUID="fd5359bf-6e44-4e12-a201-4a134f673957" containerName="registry-server" containerID="cri-o://472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5" gracePeriod=2 Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.316343 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.482548 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-utilities\") pod \"fd5359bf-6e44-4e12-a201-4a134f673957\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.482654 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-catalog-content\") pod \"fd5359bf-6e44-4e12-a201-4a134f673957\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.482816 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btjx4\" (UniqueName: \"kubernetes.io/projected/fd5359bf-6e44-4e12-a201-4a134f673957-kube-api-access-btjx4\") pod \"fd5359bf-6e44-4e12-a201-4a134f673957\" (UID: \"fd5359bf-6e44-4e12-a201-4a134f673957\") " Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.483229 
4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-utilities" (OuterVolumeSpecName: "utilities") pod "fd5359bf-6e44-4e12-a201-4a134f673957" (UID: "fd5359bf-6e44-4e12-a201-4a134f673957"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.483469 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.488518 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5359bf-6e44-4e12-a201-4a134f673957-kube-api-access-btjx4" (OuterVolumeSpecName: "kube-api-access-btjx4") pod "fd5359bf-6e44-4e12-a201-4a134f673957" (UID: "fd5359bf-6e44-4e12-a201-4a134f673957"). InnerVolumeSpecName "kube-api-access-btjx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.502214 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd5359bf-6e44-4e12-a201-4a134f673957" (UID: "fd5359bf-6e44-4e12-a201-4a134f673957"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.585290 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btjx4\" (UniqueName: \"kubernetes.io/projected/fd5359bf-6e44-4e12-a201-4a134f673957-kube-api-access-btjx4\") on node \"crc\" DevicePath \"\"" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.585329 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd5359bf-6e44-4e12-a201-4a134f673957-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.698927 4947 generic.go:334] "Generic (PLEG): container finished" podID="fd5359bf-6e44-4e12-a201-4a134f673957" containerID="472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5" exitCode=0 Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.698980 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bwj" event={"ID":"fd5359bf-6e44-4e12-a201-4a134f673957","Type":"ContainerDied","Data":"472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5"} Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.699018 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-64bwj" event={"ID":"fd5359bf-6e44-4e12-a201-4a134f673957","Type":"ContainerDied","Data":"8a926b38de35eabdb6dbbc129a9ad8aee5872827c33da707bc887c131b9635d7"} Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.699022 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-64bwj" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.699041 4947 scope.go:117] "RemoveContainer" containerID="472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.721090 4947 scope.go:117] "RemoveContainer" containerID="cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.742227 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bwj"] Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.749059 4947 scope.go:117] "RemoveContainer" containerID="3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.753105 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-64bwj"] Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.811235 4947 scope.go:117] "RemoveContainer" containerID="472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5" Dec 03 09:24:28 crc kubenswrapper[4947]: E1203 09:24:28.811772 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5\": container with ID starting with 472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5 not found: ID does not exist" containerID="472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.811812 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5"} err="failed to get container status \"472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5\": rpc error: code = NotFound desc = could not find container 
\"472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5\": container with ID starting with 472dcb0fafca5b13259c3dd22e9b19de640bddae8966f2809b4bcc431f4253f5 not found: ID does not exist" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.811839 4947 scope.go:117] "RemoveContainer" containerID="cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c" Dec 03 09:24:28 crc kubenswrapper[4947]: E1203 09:24:28.812305 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c\": container with ID starting with cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c not found: ID does not exist" containerID="cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.812336 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c"} err="failed to get container status \"cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c\": rpc error: code = NotFound desc = could not find container \"cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c\": container with ID starting with cec58a195b745f74315aad6edb3251af14f4ba5e98974d600b30e7f0b10a015c not found: ID does not exist" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.812359 4947 scope.go:117] "RemoveContainer" containerID="3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609" Dec 03 09:24:28 crc kubenswrapper[4947]: E1203 09:24:28.812917 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609\": container with ID starting with 3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609 not found: ID does not exist" 
containerID="3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609" Dec 03 09:24:28 crc kubenswrapper[4947]: I1203 09:24:28.812967 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609"} err="failed to get container status \"3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609\": rpc error: code = NotFound desc = could not find container \"3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609\": container with ID starting with 3db60f2ba6736a18d3801ec254ce0cfb1f2d731f150ab613950e44ccc5408609 not found: ID does not exist" Dec 03 09:24:29 crc kubenswrapper[4947]: I1203 09:24:29.097404 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5359bf-6e44-4e12-a201-4a134f673957" path="/var/lib/kubelet/pods/fd5359bf-6e44-4e12-a201-4a134f673957/volumes" Dec 03 09:24:33 crc kubenswrapper[4947]: I1203 09:24:33.085753 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:24:33 crc kubenswrapper[4947]: E1203 09:24:33.086891 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:24:46 crc kubenswrapper[4947]: I1203 09:24:46.083182 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:24:46 crc kubenswrapper[4947]: E1203 09:24:46.083839 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:24:58 crc kubenswrapper[4947]: I1203 09:24:58.083349 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:24:58 crc kubenswrapper[4947]: E1203 09:24:58.084145 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:25:09 crc kubenswrapper[4947]: I1203 09:25:09.092963 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:25:09 crc kubenswrapper[4947]: E1203 09:25:09.093753 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:25:23 crc kubenswrapper[4947]: I1203 09:25:23.083984 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:25:23 crc kubenswrapper[4947]: E1203 09:25:23.085533 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:25:36 crc kubenswrapper[4947]: I1203 09:25:36.084266 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:25:36 crc kubenswrapper[4947]: E1203 09:25:36.085046 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:25:39 crc kubenswrapper[4947]: I1203 09:25:39.435221 4947 generic.go:334] "Generic (PLEG): container finished" podID="fe124b64-001a-435a-8096-764e2a71097b" containerID="7dba7649327b43eca70e692682d6a5171204b38e1db2eed0eb3ff68635bd3cc4" exitCode=0 Dec 03 09:25:39 crc kubenswrapper[4947]: I1203 09:25:39.435338 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" event={"ID":"fe124b64-001a-435a-8096-764e2a71097b","Type":"ContainerDied","Data":"7dba7649327b43eca70e692682d6a5171204b38e1db2eed0eb3ff68635bd3cc4"} Dec 03 09:25:40 crc kubenswrapper[4947]: I1203 09:25:40.984942 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.020425 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxt9z\" (UniqueName: \"kubernetes.io/projected/fe124b64-001a-435a-8096-764e2a71097b-kube-api-access-hxt9z\") pod \"fe124b64-001a-435a-8096-764e2a71097b\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.020517 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-inventory\") pod \"fe124b64-001a-435a-8096-764e2a71097b\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.020707 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-bootstrap-combined-ca-bundle\") pod \"fe124b64-001a-435a-8096-764e2a71097b\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.020745 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-ssh-key\") pod \"fe124b64-001a-435a-8096-764e2a71097b\" (UID: \"fe124b64-001a-435a-8096-764e2a71097b\") " Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.029829 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe124b64-001a-435a-8096-764e2a71097b-kube-api-access-hxt9z" (OuterVolumeSpecName: "kube-api-access-hxt9z") pod "fe124b64-001a-435a-8096-764e2a71097b" (UID: "fe124b64-001a-435a-8096-764e2a71097b"). InnerVolumeSpecName "kube-api-access-hxt9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.031642 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fe124b64-001a-435a-8096-764e2a71097b" (UID: "fe124b64-001a-435a-8096-764e2a71097b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.065407 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fe124b64-001a-435a-8096-764e2a71097b" (UID: "fe124b64-001a-435a-8096-764e2a71097b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.066155 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-inventory" (OuterVolumeSpecName: "inventory") pod "fe124b64-001a-435a-8096-764e2a71097b" (UID: "fe124b64-001a-435a-8096-764e2a71097b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.124586 4947 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.124629 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.124650 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxt9z\" (UniqueName: \"kubernetes.io/projected/fe124b64-001a-435a-8096-764e2a71097b-kube-api-access-hxt9z\") on node \"crc\" DevicePath \"\"" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.124665 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe124b64-001a-435a-8096-764e2a71097b-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.471364 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" event={"ID":"fe124b64-001a-435a-8096-764e2a71097b","Type":"ContainerDied","Data":"6559ab243be3128e23e046fcc846134779972298e97a5f89ae59cfff12b5c78a"} Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.471405 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6559ab243be3128e23e046fcc846134779972298e97a5f89ae59cfff12b5c78a" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.471464 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell2-d84ng" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.580564 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell2-l4vbx"] Dec 03 09:25:41 crc kubenswrapper[4947]: E1203 09:25:41.581226 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5359bf-6e44-4e12-a201-4a134f673957" containerName="extract-utilities" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581249 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5359bf-6e44-4e12-a201-4a134f673957" containerName="extract-utilities" Dec 03 09:25:41 crc kubenswrapper[4947]: E1203 09:25:41.581277 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe124b64-001a-435a-8096-764e2a71097b" containerName="bootstrap-openstack-openstack-cell2" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581289 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe124b64-001a-435a-8096-764e2a71097b" containerName="bootstrap-openstack-openstack-cell2" Dec 03 09:25:41 crc kubenswrapper[4947]: E1203 09:25:41.581310 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6830677-adef-4ee7-8435-154242866f3c" containerName="registry-server" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581323 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6830677-adef-4ee7-8435-154242866f3c" containerName="registry-server" Dec 03 09:25:41 crc kubenswrapper[4947]: E1203 09:25:41.581387 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6830677-adef-4ee7-8435-154242866f3c" containerName="extract-content" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581400 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6830677-adef-4ee7-8435-154242866f3c" containerName="extract-content" Dec 03 09:25:41 crc kubenswrapper[4947]: E1203 09:25:41.581431 4947 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fd5359bf-6e44-4e12-a201-4a134f673957" containerName="extract-content" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581445 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5359bf-6e44-4e12-a201-4a134f673957" containerName="extract-content" Dec 03 09:25:41 crc kubenswrapper[4947]: E1203 09:25:41.581483 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5359bf-6e44-4e12-a201-4a134f673957" containerName="registry-server" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581526 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5359bf-6e44-4e12-a201-4a134f673957" containerName="registry-server" Dec 03 09:25:41 crc kubenswrapper[4947]: E1203 09:25:41.581556 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6830677-adef-4ee7-8435-154242866f3c" containerName="extract-utilities" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581568 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6830677-adef-4ee7-8435-154242866f3c" containerName="extract-utilities" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581915 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6830677-adef-4ee7-8435-154242866f3c" containerName="registry-server" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581952 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe124b64-001a-435a-8096-764e2a71097b" containerName="bootstrap-openstack-openstack-cell2" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.581968 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5359bf-6e44-4e12-a201-4a134f673957" containerName="registry-server" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.583053 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.588676 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.588936 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.603613 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell2-l4vbx"] Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.634827 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27b8h\" (UniqueName: \"kubernetes.io/projected/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-kube-api-access-27b8h\") pod \"download-cache-openstack-openstack-cell2-l4vbx\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.634963 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-inventory\") pod \"download-cache-openstack-openstack-cell2-l4vbx\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.635068 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-ssh-key\") pod \"download-cache-openstack-openstack-cell2-l4vbx\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 
09:25:41.737892 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-ssh-key\") pod \"download-cache-openstack-openstack-cell2-l4vbx\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.738129 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27b8h\" (UniqueName: \"kubernetes.io/projected/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-kube-api-access-27b8h\") pod \"download-cache-openstack-openstack-cell2-l4vbx\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.738239 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-inventory\") pod \"download-cache-openstack-openstack-cell2-l4vbx\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.761197 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-inventory\") pod \"download-cache-openstack-openstack-cell2-l4vbx\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.761515 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-ssh-key\") pod \"download-cache-openstack-openstack-cell2-l4vbx\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " 
pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.764135 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27b8h\" (UniqueName: \"kubernetes.io/projected/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-kube-api-access-27b8h\") pod \"download-cache-openstack-openstack-cell2-l4vbx\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:41 crc kubenswrapper[4947]: I1203 09:25:41.912918 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:25:42 crc kubenswrapper[4947]: I1203 09:25:42.482186 4947 generic.go:334] "Generic (PLEG): container finished" podID="490f2f91-8ec4-42da-bee9-65ebd38a7492" containerID="ff69e80018e457079e8a43027c03bdfae4c26ba482b27eff6078d048d8908cda" exitCode=0 Dec 03 09:25:42 crc kubenswrapper[4947]: I1203 09:25:42.482245 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" event={"ID":"490f2f91-8ec4-42da-bee9-65ebd38a7492","Type":"ContainerDied","Data":"ff69e80018e457079e8a43027c03bdfae4c26ba482b27eff6078d048d8908cda"} Dec 03 09:25:42 crc kubenswrapper[4947]: I1203 09:25:42.531097 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell2-l4vbx"] Dec 03 09:25:43 crc kubenswrapper[4947]: I1203 09:25:43.491624 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" event={"ID":"53b2f919-6f0d-475b-a5fe-c59abec0ccbb","Type":"ContainerStarted","Data":"333e79c8592733ef062e6a085148ec57e7a650b5bb5baf8ef5aa45b439964253"} Dec 03 09:25:43 crc kubenswrapper[4947]: I1203 09:25:43.929754 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:25:43 crc kubenswrapper[4947]: I1203 09:25:43.989575 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-bootstrap-combined-ca-bundle\") pod \"490f2f91-8ec4-42da-bee9-65ebd38a7492\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " Dec 03 09:25:43 crc kubenswrapper[4947]: I1203 09:25:43.989659 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdpk5\" (UniqueName: \"kubernetes.io/projected/490f2f91-8ec4-42da-bee9-65ebd38a7492-kube-api-access-wdpk5\") pod \"490f2f91-8ec4-42da-bee9-65ebd38a7492\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " Dec 03 09:25:43 crc kubenswrapper[4947]: I1203 09:25:43.989797 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-ssh-key\") pod \"490f2f91-8ec4-42da-bee9-65ebd38a7492\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " Dec 03 09:25:43 crc kubenswrapper[4947]: I1203 09:25:43.990024 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-inventory\") pod \"490f2f91-8ec4-42da-bee9-65ebd38a7492\" (UID: \"490f2f91-8ec4-42da-bee9-65ebd38a7492\") " Dec 03 09:25:43 crc kubenswrapper[4947]: I1203 09:25:43.994871 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "490f2f91-8ec4-42da-bee9-65ebd38a7492" (UID: "490f2f91-8ec4-42da-bee9-65ebd38a7492"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:25:43 crc kubenswrapper[4947]: I1203 09:25:43.995345 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490f2f91-8ec4-42da-bee9-65ebd38a7492-kube-api-access-wdpk5" (OuterVolumeSpecName: "kube-api-access-wdpk5") pod "490f2f91-8ec4-42da-bee9-65ebd38a7492" (UID: "490f2f91-8ec4-42da-bee9-65ebd38a7492"). InnerVolumeSpecName "kube-api-access-wdpk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.021197 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "490f2f91-8ec4-42da-bee9-65ebd38a7492" (UID: "490f2f91-8ec4-42da-bee9-65ebd38a7492"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.040749 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-inventory" (OuterVolumeSpecName: "inventory") pod "490f2f91-8ec4-42da-bee9-65ebd38a7492" (UID: "490f2f91-8ec4-42da-bee9-65ebd38a7492"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.098850 4947 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.098947 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdpk5\" (UniqueName: \"kubernetes.io/projected/490f2f91-8ec4-42da-bee9-65ebd38a7492-kube-api-access-wdpk5\") on node \"crc\" DevicePath \"\"" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.098966 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.098984 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/490f2f91-8ec4-42da-bee9-65ebd38a7492-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.503248 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" event={"ID":"490f2f91-8ec4-42da-bee9-65ebd38a7492","Type":"ContainerDied","Data":"5f62a3073ee56191b762b7ea3030c8227db57fa16df959c07a0be29014f0ab94"} Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.503569 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f62a3073ee56191b762b7ea3030c8227db57fa16df959c07a0be29014f0ab94" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.505656 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-hv6px" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.506229 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" event={"ID":"53b2f919-6f0d-475b-a5fe-c59abec0ccbb","Type":"ContainerStarted","Data":"223a2f78cd4532a70c4f28e96541c53c8eb0c1ad7221b25920947796e4d07fd5"} Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.533982 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" podStartSLOduration=1.8397148140000001 podStartE2EDuration="3.533962364s" podCreationTimestamp="2025-12-03 09:25:41 +0000 UTC" firstStartedPulling="2025-12-03 09:25:42.524029591 +0000 UTC m=+9403.784984017" lastFinishedPulling="2025-12-03 09:25:44.218277141 +0000 UTC m=+9405.479231567" observedRunningTime="2025-12-03 09:25:44.525253098 +0000 UTC m=+9405.786207544" watchObservedRunningTime="2025-12-03 09:25:44.533962364 +0000 UTC m=+9405.794916790" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.578641 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-4b7ns"] Dec 03 09:25:44 crc kubenswrapper[4947]: E1203 09:25:44.579114 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="490f2f91-8ec4-42da-bee9-65ebd38a7492" containerName="bootstrap-openstack-openstack-cell1" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.579130 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="490f2f91-8ec4-42da-bee9-65ebd38a7492" containerName="bootstrap-openstack-openstack-cell1" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.579309 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="490f2f91-8ec4-42da-bee9-65ebd38a7492" containerName="bootstrap-openstack-openstack-cell1" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.580112 4947 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.584356 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.584605 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.606932 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-inventory\") pod \"download-cache-openstack-openstack-cell1-4b7ns\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.606997 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwxs\" (UniqueName: \"kubernetes.io/projected/e168aa9d-0251-46ec-8237-596594385a28-kube-api-access-rbwxs\") pod \"download-cache-openstack-openstack-cell1-4b7ns\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.607036 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-ssh-key\") pod \"download-cache-openstack-openstack-cell1-4b7ns\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.611014 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-4b7ns"] Dec 03 09:25:44 crc 
kubenswrapper[4947]: I1203 09:25:44.708191 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-inventory\") pod \"download-cache-openstack-openstack-cell1-4b7ns\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.708245 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwxs\" (UniqueName: \"kubernetes.io/projected/e168aa9d-0251-46ec-8237-596594385a28-kube-api-access-rbwxs\") pod \"download-cache-openstack-openstack-cell1-4b7ns\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.708270 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-ssh-key\") pod \"download-cache-openstack-openstack-cell1-4b7ns\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.711821 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-inventory\") pod \"download-cache-openstack-openstack-cell1-4b7ns\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.711826 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-ssh-key\") pod \"download-cache-openstack-openstack-cell1-4b7ns\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " 
pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.723805 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwxs\" (UniqueName: \"kubernetes.io/projected/e168aa9d-0251-46ec-8237-596594385a28-kube-api-access-rbwxs\") pod \"download-cache-openstack-openstack-cell1-4b7ns\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:44 crc kubenswrapper[4947]: I1203 09:25:44.926089 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:25:45 crc kubenswrapper[4947]: I1203 09:25:45.549302 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-4b7ns"] Dec 03 09:25:45 crc kubenswrapper[4947]: W1203 09:25:45.553585 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode168aa9d_0251_46ec_8237_596594385a28.slice/crio-b6c86f7d52e0bd4d3b3bc2bdc27dabceb8d6ea2c9ca7129e4d6f909ba9d903b0 WatchSource:0}: Error finding container b6c86f7d52e0bd4d3b3bc2bdc27dabceb8d6ea2c9ca7129e4d6f909ba9d903b0: Status 404 returned error can't find the container with id b6c86f7d52e0bd4d3b3bc2bdc27dabceb8d6ea2c9ca7129e4d6f909ba9d903b0 Dec 03 09:25:46 crc kubenswrapper[4947]: I1203 09:25:46.536186 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" event={"ID":"e168aa9d-0251-46ec-8237-596594385a28","Type":"ContainerStarted","Data":"a95449ad4bd6e19079a4cb5227b1e4923673f36e4e90d98f935bc94031ef9bde"} Dec 03 09:25:46 crc kubenswrapper[4947]: I1203 09:25:46.536436 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" 
event={"ID":"e168aa9d-0251-46ec-8237-596594385a28","Type":"ContainerStarted","Data":"b6c86f7d52e0bd4d3b3bc2bdc27dabceb8d6ea2c9ca7129e4d6f909ba9d903b0"} Dec 03 09:25:46 crc kubenswrapper[4947]: I1203 09:25:46.564768 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" podStartSLOduration=2.140469512 podStartE2EDuration="2.564750509s" podCreationTimestamp="2025-12-03 09:25:44 +0000 UTC" firstStartedPulling="2025-12-03 09:25:45.557808104 +0000 UTC m=+9406.818762530" lastFinishedPulling="2025-12-03 09:25:45.982089081 +0000 UTC m=+9407.243043527" observedRunningTime="2025-12-03 09:25:46.556973659 +0000 UTC m=+9407.817928085" watchObservedRunningTime="2025-12-03 09:25:46.564750509 +0000 UTC m=+9407.825704935" Dec 03 09:25:50 crc kubenswrapper[4947]: I1203 09:25:50.082920 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:25:50 crc kubenswrapper[4947]: E1203 09:25:50.083683 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:26:05 crc kubenswrapper[4947]: I1203 09:26:05.083808 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:26:05 crc kubenswrapper[4947]: I1203 09:26:05.716203 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"4bb8680f19120bd6e546ee45532bd7955e35871304affa4095bcd96a25cdccb1"} Dec 03 
09:27:10 crc kubenswrapper[4947]: I1203 09:27:10.392366 4947 generic.go:334] "Generic (PLEG): container finished" podID="53b2f919-6f0d-475b-a5fe-c59abec0ccbb" containerID="223a2f78cd4532a70c4f28e96541c53c8eb0c1ad7221b25920947796e4d07fd5" exitCode=0 Dec 03 09:27:10 crc kubenswrapper[4947]: I1203 09:27:10.392484 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" event={"ID":"53b2f919-6f0d-475b-a5fe-c59abec0ccbb","Type":"ContainerDied","Data":"223a2f78cd4532a70c4f28e96541c53c8eb0c1ad7221b25920947796e4d07fd5"} Dec 03 09:27:11 crc kubenswrapper[4947]: I1203 09:27:11.405073 4947 generic.go:334] "Generic (PLEG): container finished" podID="e168aa9d-0251-46ec-8237-596594385a28" containerID="a95449ad4bd6e19079a4cb5227b1e4923673f36e4e90d98f935bc94031ef9bde" exitCode=0 Dec 03 09:27:11 crc kubenswrapper[4947]: I1203 09:27:11.405116 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" event={"ID":"e168aa9d-0251-46ec-8237-596594385a28","Type":"ContainerDied","Data":"a95449ad4bd6e19079a4cb5227b1e4923673f36e4e90d98f935bc94031ef9bde"} Dec 03 09:27:11 crc kubenswrapper[4947]: I1203 09:27:11.886486 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.007313 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-ssh-key\") pod \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.007401 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27b8h\" (UniqueName: \"kubernetes.io/projected/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-kube-api-access-27b8h\") pod \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.007461 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-inventory\") pod \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\" (UID: \"53b2f919-6f0d-475b-a5fe-c59abec0ccbb\") " Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.014272 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-kube-api-access-27b8h" (OuterVolumeSpecName: "kube-api-access-27b8h") pod "53b2f919-6f0d-475b-a5fe-c59abec0ccbb" (UID: "53b2f919-6f0d-475b-a5fe-c59abec0ccbb"). InnerVolumeSpecName "kube-api-access-27b8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.035173 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "53b2f919-6f0d-475b-a5fe-c59abec0ccbb" (UID: "53b2f919-6f0d-475b-a5fe-c59abec0ccbb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.037280 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-inventory" (OuterVolumeSpecName: "inventory") pod "53b2f919-6f0d-475b-a5fe-c59abec0ccbb" (UID: "53b2f919-6f0d-475b-a5fe-c59abec0ccbb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.110931 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.110968 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27b8h\" (UniqueName: \"kubernetes.io/projected/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-kube-api-access-27b8h\") on node \"crc\" DevicePath \"\"" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.110984 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53b2f919-6f0d-475b-a5fe-c59abec0ccbb-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.417019 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.417019 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell2-l4vbx" event={"ID":"53b2f919-6f0d-475b-a5fe-c59abec0ccbb","Type":"ContainerDied","Data":"333e79c8592733ef062e6a085148ec57e7a650b5bb5baf8ef5aa45b439964253"} Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.417476 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="333e79c8592733ef062e6a085148ec57e7a650b5bb5baf8ef5aa45b439964253" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.505650 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell2-nswm8"] Dec 03 09:27:12 crc kubenswrapper[4947]: E1203 09:27:12.506112 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b2f919-6f0d-475b-a5fe-c59abec0ccbb" containerName="download-cache-openstack-openstack-cell2" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.506128 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b2f919-6f0d-475b-a5fe-c59abec0ccbb" containerName="download-cache-openstack-openstack-cell2" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.506323 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b2f919-6f0d-475b-a5fe-c59abec0ccbb" containerName="download-cache-openstack-openstack-cell2" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.507095 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.508628 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.512641 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.516004 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-ssh-key\") pod \"configure-network-openstack-openstack-cell2-nswm8\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.516024 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell2-nswm8"] Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.516054 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p44kc\" (UniqueName: \"kubernetes.io/projected/6607a030-ff9e-4b09-b42b-11c78be5d094-kube-api-access-p44kc\") pod \"configure-network-openstack-openstack-cell2-nswm8\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.516197 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-inventory\") pod \"configure-network-openstack-openstack-cell2-nswm8\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:12 crc 
kubenswrapper[4947]: I1203 09:27:12.617999 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-ssh-key\") pod \"configure-network-openstack-openstack-cell2-nswm8\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.618067 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p44kc\" (UniqueName: \"kubernetes.io/projected/6607a030-ff9e-4b09-b42b-11c78be5d094-kube-api-access-p44kc\") pod \"configure-network-openstack-openstack-cell2-nswm8\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.618204 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-inventory\") pod \"configure-network-openstack-openstack-cell2-nswm8\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.625117 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-inventory\") pod \"configure-network-openstack-openstack-cell2-nswm8\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:12 crc kubenswrapper[4947]: I1203 09:27:12.629546 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-ssh-key\") pod \"configure-network-openstack-openstack-cell2-nswm8\" (UID: 
\"6607a030-ff9e-4b09-b42b-11c78be5d094\") " pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.240763 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p44kc\" (UniqueName: \"kubernetes.io/projected/6607a030-ff9e-4b09-b42b-11c78be5d094-kube-api-access-p44kc\") pod \"configure-network-openstack-openstack-cell2-nswm8\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.367155 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.422584 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.427208 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" event={"ID":"e168aa9d-0251-46ec-8237-596594385a28","Type":"ContainerDied","Data":"b6c86f7d52e0bd4d3b3bc2bdc27dabceb8d6ea2c9ca7129e4d6f909ba9d903b0"} Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.427242 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6c86f7d52e0bd4d3b3bc2bdc27dabceb8d6ea2c9ca7129e4d6f909ba9d903b0" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.427263 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4b7ns" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.507425 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vxc8z"] Dec 03 09:27:13 crc kubenswrapper[4947]: E1203 09:27:13.508279 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e168aa9d-0251-46ec-8237-596594385a28" containerName="download-cache-openstack-openstack-cell1" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.508301 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e168aa9d-0251-46ec-8237-596594385a28" containerName="download-cache-openstack-openstack-cell1" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.508595 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e168aa9d-0251-46ec-8237-596594385a28" containerName="download-cache-openstack-openstack-cell1" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.509553 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.519638 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vxc8z"] Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.536341 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-inventory\") pod \"e168aa9d-0251-46ec-8237-596594385a28\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.536996 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbwxs\" (UniqueName: \"kubernetes.io/projected/e168aa9d-0251-46ec-8237-596594385a28-kube-api-access-rbwxs\") pod \"e168aa9d-0251-46ec-8237-596594385a28\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.537094 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-ssh-key\") pod \"e168aa9d-0251-46ec-8237-596594385a28\" (UID: \"e168aa9d-0251-46ec-8237-596594385a28\") " Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.537444 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vxc8z\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.537626 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgrzn\" (UniqueName: 
\"kubernetes.io/projected/a6949c8f-209b-45db-9297-e7de78baa4ca-kube-api-access-qgrzn\") pod \"configure-network-openstack-openstack-cell1-vxc8z\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.538037 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-inventory\") pod \"configure-network-openstack-openstack-cell1-vxc8z\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.541250 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e168aa9d-0251-46ec-8237-596594385a28-kube-api-access-rbwxs" (OuterVolumeSpecName: "kube-api-access-rbwxs") pod "e168aa9d-0251-46ec-8237-596594385a28" (UID: "e168aa9d-0251-46ec-8237-596594385a28"). InnerVolumeSpecName "kube-api-access-rbwxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.566730 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-inventory" (OuterVolumeSpecName: "inventory") pod "e168aa9d-0251-46ec-8237-596594385a28" (UID: "e168aa9d-0251-46ec-8237-596594385a28"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.567574 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e168aa9d-0251-46ec-8237-596594385a28" (UID: "e168aa9d-0251-46ec-8237-596594385a28"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.646621 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-inventory\") pod \"configure-network-openstack-openstack-cell1-vxc8z\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.646785 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vxc8z\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.646840 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgrzn\" (UniqueName: \"kubernetes.io/projected/a6949c8f-209b-45db-9297-e7de78baa4ca-kube-api-access-qgrzn\") pod \"configure-network-openstack-openstack-cell1-vxc8z\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.646950 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.646965 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbwxs\" (UniqueName: \"kubernetes.io/projected/e168aa9d-0251-46ec-8237-596594385a28-kube-api-access-rbwxs\") on node \"crc\" DevicePath \"\"" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.646977 4947 reconciler_common.go:293] "Volume detached for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/e168aa9d-0251-46ec-8237-596594385a28-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.651198 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-inventory\") pod \"configure-network-openstack-openstack-cell1-vxc8z\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.656278 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vxc8z\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.665886 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgrzn\" (UniqueName: \"kubernetes.io/projected/a6949c8f-209b-45db-9297-e7de78baa4ca-kube-api-access-qgrzn\") pod \"configure-network-openstack-openstack-cell1-vxc8z\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:13 crc kubenswrapper[4947]: I1203 09:27:13.955667 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:27:14 crc kubenswrapper[4947]: I1203 09:27:14.036116 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell2-nswm8"] Dec 03 09:27:14 crc kubenswrapper[4947]: I1203 09:27:14.439075 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell2-nswm8" event={"ID":"6607a030-ff9e-4b09-b42b-11c78be5d094","Type":"ContainerStarted","Data":"6dcf5575e4ee016723e4baae4927758c3e76c2cfa839023e53dd79b998466876"} Dec 03 09:27:14 crc kubenswrapper[4947]: I1203 09:27:14.507121 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vxc8z"] Dec 03 09:27:15 crc kubenswrapper[4947]: I1203 09:27:15.451678 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" event={"ID":"a6949c8f-209b-45db-9297-e7de78baa4ca","Type":"ContainerStarted","Data":"2d67043618bd528d51300a10eac9a9895071b1a0e48f941e380de8b7814f9c8a"} Dec 03 09:27:15 crc kubenswrapper[4947]: I1203 09:27:15.453145 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell2-nswm8" event={"ID":"6607a030-ff9e-4b09-b42b-11c78be5d094","Type":"ContainerStarted","Data":"a933f40c340b7aaccf7b96955db7d5ebce885051295c491563da48d24237fbed"} Dec 03 09:27:15 crc kubenswrapper[4947]: I1203 09:27:15.478548 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell2-nswm8" podStartSLOduration=2.89044171 podStartE2EDuration="3.478525054s" podCreationTimestamp="2025-12-03 09:27:12 +0000 UTC" firstStartedPulling="2025-12-03 09:27:14.034013333 +0000 UTC m=+9495.294967769" lastFinishedPulling="2025-12-03 09:27:14.622096687 +0000 UTC m=+9495.883051113" observedRunningTime="2025-12-03 09:27:15.47101895 
+0000 UTC m=+9496.731973396" watchObservedRunningTime="2025-12-03 09:27:15.478525054 +0000 UTC m=+9496.739479480" Dec 03 09:27:16 crc kubenswrapper[4947]: I1203 09:27:16.466787 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" event={"ID":"a6949c8f-209b-45db-9297-e7de78baa4ca","Type":"ContainerStarted","Data":"0f60292177b43bd75c8bd106d0425bd03cd7b4c8f95ecb161e7301b2ca903615"} Dec 03 09:27:16 crc kubenswrapper[4947]: I1203 09:27:16.494277 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" podStartSLOduration=2.669418642 podStartE2EDuration="3.494262225s" podCreationTimestamp="2025-12-03 09:27:13 +0000 UTC" firstStartedPulling="2025-12-03 09:27:14.51267462 +0000 UTC m=+9495.773629066" lastFinishedPulling="2025-12-03 09:27:15.337518223 +0000 UTC m=+9496.598472649" observedRunningTime="2025-12-03 09:27:16.48811455 +0000 UTC m=+9497.749068976" watchObservedRunningTime="2025-12-03 09:27:16.494262225 +0000 UTC m=+9497.755216651" Dec 03 09:28:12 crc kubenswrapper[4947]: I1203 09:28:12.016800 4947 generic.go:334] "Generic (PLEG): container finished" podID="6607a030-ff9e-4b09-b42b-11c78be5d094" containerID="a933f40c340b7aaccf7b96955db7d5ebce885051295c491563da48d24237fbed" exitCode=0 Dec 03 09:28:12 crc kubenswrapper[4947]: I1203 09:28:12.016899 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell2-nswm8" event={"ID":"6607a030-ff9e-4b09-b42b-11c78be5d094","Type":"ContainerDied","Data":"a933f40c340b7aaccf7b96955db7d5ebce885051295c491563da48d24237fbed"} Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.485406 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.666296 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-ssh-key\") pod \"6607a030-ff9e-4b09-b42b-11c78be5d094\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.666422 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-inventory\") pod \"6607a030-ff9e-4b09-b42b-11c78be5d094\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.666467 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p44kc\" (UniqueName: \"kubernetes.io/projected/6607a030-ff9e-4b09-b42b-11c78be5d094-kube-api-access-p44kc\") pod \"6607a030-ff9e-4b09-b42b-11c78be5d094\" (UID: \"6607a030-ff9e-4b09-b42b-11c78be5d094\") " Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.671479 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6607a030-ff9e-4b09-b42b-11c78be5d094-kube-api-access-p44kc" (OuterVolumeSpecName: "kube-api-access-p44kc") pod "6607a030-ff9e-4b09-b42b-11c78be5d094" (UID: "6607a030-ff9e-4b09-b42b-11c78be5d094"). InnerVolumeSpecName "kube-api-access-p44kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.695579 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6607a030-ff9e-4b09-b42b-11c78be5d094" (UID: "6607a030-ff9e-4b09-b42b-11c78be5d094"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.697440 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-inventory" (OuterVolumeSpecName: "inventory") pod "6607a030-ff9e-4b09-b42b-11c78be5d094" (UID: "6607a030-ff9e-4b09-b42b-11c78be5d094"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.769525 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.769581 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p44kc\" (UniqueName: \"kubernetes.io/projected/6607a030-ff9e-4b09-b42b-11c78be5d094-kube-api-access-p44kc\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:13 crc kubenswrapper[4947]: I1203 09:28:13.769603 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6607a030-ff9e-4b09-b42b-11c78be5d094-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.039881 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell2-nswm8" event={"ID":"6607a030-ff9e-4b09-b42b-11c78be5d094","Type":"ContainerDied","Data":"6dcf5575e4ee016723e4baae4927758c3e76c2cfa839023e53dd79b998466876"} Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.039943 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dcf5575e4ee016723e4baae4927758c3e76c2cfa839023e53dd79b998466876" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.040349 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell2-nswm8" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.152437 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell2-h4hrv"] Dec 03 09:28:14 crc kubenswrapper[4947]: E1203 09:28:14.153953 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6607a030-ff9e-4b09-b42b-11c78be5d094" containerName="configure-network-openstack-openstack-cell2" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.153974 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6607a030-ff9e-4b09-b42b-11c78be5d094" containerName="configure-network-openstack-openstack-cell2" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.154232 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6607a030-ff9e-4b09-b42b-11c78be5d094" containerName="configure-network-openstack-openstack-cell2" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.155282 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.157966 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.163888 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.188780 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell2-h4hrv"] Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.279006 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj922\" (UniqueName: \"kubernetes.io/projected/76762737-e806-4f14-bde0-1a8daa074767-kube-api-access-nj922\") pod \"validate-network-openstack-openstack-cell2-h4hrv\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.279229 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-ssh-key\") pod \"validate-network-openstack-openstack-cell2-h4hrv\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.279261 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-inventory\") pod \"validate-network-openstack-openstack-cell2-h4hrv\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: 
I1203 09:28:14.380751 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-ssh-key\") pod \"validate-network-openstack-openstack-cell2-h4hrv\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.380971 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-inventory\") pod \"validate-network-openstack-openstack-cell2-h4hrv\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.381121 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj922\" (UniqueName: \"kubernetes.io/projected/76762737-e806-4f14-bde0-1a8daa074767-kube-api-access-nj922\") pod \"validate-network-openstack-openstack-cell2-h4hrv\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.384476 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-inventory\") pod \"validate-network-openstack-openstack-cell2-h4hrv\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.385090 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-ssh-key\") pod \"validate-network-openstack-openstack-cell2-h4hrv\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " 
pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.412481 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj922\" (UniqueName: \"kubernetes.io/projected/76762737-e806-4f14-bde0-1a8daa074767-kube-api-access-nj922\") pod \"validate-network-openstack-openstack-cell2-h4hrv\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:14 crc kubenswrapper[4947]: I1203 09:28:14.475539 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:15 crc kubenswrapper[4947]: I1203 09:28:15.056249 4947 generic.go:334] "Generic (PLEG): container finished" podID="a6949c8f-209b-45db-9297-e7de78baa4ca" containerID="0f60292177b43bd75c8bd106d0425bd03cd7b4c8f95ecb161e7301b2ca903615" exitCode=0 Dec 03 09:28:15 crc kubenswrapper[4947]: I1203 09:28:15.056315 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" event={"ID":"a6949c8f-209b-45db-9297-e7de78baa4ca","Type":"ContainerDied","Data":"0f60292177b43bd75c8bd106d0425bd03cd7b4c8f95ecb161e7301b2ca903615"} Dec 03 09:28:15 crc kubenswrapper[4947]: I1203 09:28:15.104395 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell2-h4hrv"] Dec 03 09:28:15 crc kubenswrapper[4947]: I1203 09:28:15.106674 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:28:16 crc kubenswrapper[4947]: I1203 09:28:16.071533 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" event={"ID":"76762737-e806-4f14-bde0-1a8daa074767","Type":"ContainerStarted","Data":"0421c355c3c18fb82a54e4a147c332998d71e5210fbc8c1fb69e59cff7391185"} Dec 03 09:28:16 crc 
kubenswrapper[4947]: I1203 09:28:16.071932 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" event={"ID":"76762737-e806-4f14-bde0-1a8daa074767","Type":"ContainerStarted","Data":"25d3ef8b37cb5cf353e2bd342532a1df83dd937844db301323214a202b282cdf"} Dec 03 09:28:16 crc kubenswrapper[4947]: I1203 09:28:16.106028 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" podStartSLOduration=1.532205329 podStartE2EDuration="2.105978816s" podCreationTimestamp="2025-12-03 09:28:14 +0000 UTC" firstStartedPulling="2025-12-03 09:28:15.10635309 +0000 UTC m=+9556.367307516" lastFinishedPulling="2025-12-03 09:28:15.680126587 +0000 UTC m=+9556.941081003" observedRunningTime="2025-12-03 09:28:16.093346005 +0000 UTC m=+9557.354300431" watchObservedRunningTime="2025-12-03 09:28:16.105978816 +0000 UTC m=+9557.366933252" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.501271 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.630396 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-ssh-key\") pod \"a6949c8f-209b-45db-9297-e7de78baa4ca\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.630555 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgrzn\" (UniqueName: \"kubernetes.io/projected/a6949c8f-209b-45db-9297-e7de78baa4ca-kube-api-access-qgrzn\") pod \"a6949c8f-209b-45db-9297-e7de78baa4ca\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.630668 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-inventory\") pod \"a6949c8f-209b-45db-9297-e7de78baa4ca\" (UID: \"a6949c8f-209b-45db-9297-e7de78baa4ca\") " Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.641240 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6949c8f-209b-45db-9297-e7de78baa4ca-kube-api-access-qgrzn" (OuterVolumeSpecName: "kube-api-access-qgrzn") pod "a6949c8f-209b-45db-9297-e7de78baa4ca" (UID: "a6949c8f-209b-45db-9297-e7de78baa4ca"). InnerVolumeSpecName "kube-api-access-qgrzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.677791 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-inventory" (OuterVolumeSpecName: "inventory") pod "a6949c8f-209b-45db-9297-e7de78baa4ca" (UID: "a6949c8f-209b-45db-9297-e7de78baa4ca"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.680895 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a6949c8f-209b-45db-9297-e7de78baa4ca" (UID: "a6949c8f-209b-45db-9297-e7de78baa4ca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.735483 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgrzn\" (UniqueName: \"kubernetes.io/projected/a6949c8f-209b-45db-9297-e7de78baa4ca-kube-api-access-qgrzn\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.735527 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:16.735536 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6949c8f-209b-45db-9297-e7de78baa4ca-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.121478 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.150735 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vxc8z" event={"ID":"a6949c8f-209b-45db-9297-e7de78baa4ca","Type":"ContainerDied","Data":"2d67043618bd528d51300a10eac9a9895071b1a0e48f941e380de8b7814f9c8a"} Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.150796 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d67043618bd528d51300a10eac9a9895071b1a0e48f941e380de8b7814f9c8a" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.174120 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-2r4v7"] Dec 03 09:28:17 crc kubenswrapper[4947]: E1203 09:28:17.174634 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6949c8f-209b-45db-9297-e7de78baa4ca" containerName="configure-network-openstack-openstack-cell1" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.174655 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6949c8f-209b-45db-9297-e7de78baa4ca" containerName="configure-network-openstack-openstack-cell1" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.174952 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6949c8f-209b-45db-9297-e7de78baa4ca" containerName="configure-network-openstack-openstack-cell1" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.175982 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.179925 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.180253 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.231546 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-2r4v7"] Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.247444 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xb2d\" (UniqueName: \"kubernetes.io/projected/5067a94c-5435-42d8-9e27-e5dfd3abe35f-kube-api-access-4xb2d\") pod \"validate-network-openstack-openstack-cell1-2r4v7\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.247592 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-inventory\") pod \"validate-network-openstack-openstack-cell1-2r4v7\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.247685 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-ssh-key\") pod \"validate-network-openstack-openstack-cell1-2r4v7\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: 
I1203 09:28:17.349695 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xb2d\" (UniqueName: \"kubernetes.io/projected/5067a94c-5435-42d8-9e27-e5dfd3abe35f-kube-api-access-4xb2d\") pod \"validate-network-openstack-openstack-cell1-2r4v7\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.349814 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-inventory\") pod \"validate-network-openstack-openstack-cell1-2r4v7\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.349871 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-ssh-key\") pod \"validate-network-openstack-openstack-cell1-2r4v7\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.354804 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-ssh-key\") pod \"validate-network-openstack-openstack-cell1-2r4v7\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.355226 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-inventory\") pod \"validate-network-openstack-openstack-cell1-2r4v7\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " 
pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.373148 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xb2d\" (UniqueName: \"kubernetes.io/projected/5067a94c-5435-42d8-9e27-e5dfd3abe35f-kube-api-access-4xb2d\") pod \"validate-network-openstack-openstack-cell1-2r4v7\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:17 crc kubenswrapper[4947]: I1203 09:28:17.529167 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:18 crc kubenswrapper[4947]: I1203 09:28:18.196808 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-2r4v7"] Dec 03 09:28:18 crc kubenswrapper[4947]: W1203 09:28:18.207260 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5067a94c_5435_42d8_9e27_e5dfd3abe35f.slice/crio-3d6ca6bae2e1c4a64ad2aab59ac82f36c0e6867305d600b07f5a455dadc503b0 WatchSource:0}: Error finding container 3d6ca6bae2e1c4a64ad2aab59ac82f36c0e6867305d600b07f5a455dadc503b0: Status 404 returned error can't find the container with id 3d6ca6bae2e1c4a64ad2aab59ac82f36c0e6867305d600b07f5a455dadc503b0 Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.151145 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" event={"ID":"5067a94c-5435-42d8-9e27-e5dfd3abe35f","Type":"ContainerStarted","Data":"7ac63c962262ed213bb1b7a75b538fc1c2ff00ca436bb2fc445ae44911610c1c"} Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.151416 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" 
event={"ID":"5067a94c-5435-42d8-9e27-e5dfd3abe35f","Type":"ContainerStarted","Data":"3d6ca6bae2e1c4a64ad2aab59ac82f36c0e6867305d600b07f5a455dadc503b0"} Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.181721 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" podStartSLOduration=1.64653033 podStartE2EDuration="2.181699273s" podCreationTimestamp="2025-12-03 09:28:17 +0000 UTC" firstStartedPulling="2025-12-03 09:28:18.210807944 +0000 UTC m=+9559.471762370" lastFinishedPulling="2025-12-03 09:28:18.745976877 +0000 UTC m=+9560.006931313" observedRunningTime="2025-12-03 09:28:19.171076026 +0000 UTC m=+9560.432030462" watchObservedRunningTime="2025-12-03 09:28:19.181699273 +0000 UTC m=+9560.442653709" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.262345 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mfpg7"] Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.265428 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.280966 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mfpg7"] Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.295655 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-catalog-content\") pod \"certified-operators-mfpg7\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.295952 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpjjg\" (UniqueName: \"kubernetes.io/projected/af4ee33c-879f-42de-af88-592510a53fb9-kube-api-access-dpjjg\") pod \"certified-operators-mfpg7\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.296051 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-utilities\") pod \"certified-operators-mfpg7\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.398515 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-catalog-content\") pod \"certified-operators-mfpg7\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.398811 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dpjjg\" (UniqueName: \"kubernetes.io/projected/af4ee33c-879f-42de-af88-592510a53fb9-kube-api-access-dpjjg\") pod \"certified-operators-mfpg7\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.398988 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-catalog-content\") pod \"certified-operators-mfpg7\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.399008 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-utilities\") pod \"certified-operators-mfpg7\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.399259 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-utilities\") pod \"certified-operators-mfpg7\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.425615 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpjjg\" (UniqueName: \"kubernetes.io/projected/af4ee33c-879f-42de-af88-592510a53fb9-kube-api-access-dpjjg\") pod \"certified-operators-mfpg7\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:19 crc kubenswrapper[4947]: I1203 09:28:19.595290 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:20 crc kubenswrapper[4947]: I1203 09:28:20.133299 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mfpg7"] Dec 03 09:28:20 crc kubenswrapper[4947]: W1203 09:28:20.135186 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf4ee33c_879f_42de_af88_592510a53fb9.slice/crio-1ce9e48a4a99b3d22c3e5cd4f0042afc9457a311bcf82b293afce42ed75b8a49 WatchSource:0}: Error finding container 1ce9e48a4a99b3d22c3e5cd4f0042afc9457a311bcf82b293afce42ed75b8a49: Status 404 returned error can't find the container with id 1ce9e48a4a99b3d22c3e5cd4f0042afc9457a311bcf82b293afce42ed75b8a49 Dec 03 09:28:20 crc kubenswrapper[4947]: I1203 09:28:20.160415 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfpg7" event={"ID":"af4ee33c-879f-42de-af88-592510a53fb9","Type":"ContainerStarted","Data":"1ce9e48a4a99b3d22c3e5cd4f0042afc9457a311bcf82b293afce42ed75b8a49"} Dec 03 09:28:21 crc kubenswrapper[4947]: I1203 09:28:21.172134 4947 generic.go:334] "Generic (PLEG): container finished" podID="af4ee33c-879f-42de-af88-592510a53fb9" containerID="167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac" exitCode=0 Dec 03 09:28:21 crc kubenswrapper[4947]: I1203 09:28:21.172221 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfpg7" event={"ID":"af4ee33c-879f-42de-af88-592510a53fb9","Type":"ContainerDied","Data":"167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac"} Dec 03 09:28:21 crc kubenswrapper[4947]: I1203 09:28:21.177928 4947 generic.go:334] "Generic (PLEG): container finished" podID="76762737-e806-4f14-bde0-1a8daa074767" containerID="0421c355c3c18fb82a54e4a147c332998d71e5210fbc8c1fb69e59cff7391185" exitCode=0 Dec 03 09:28:21 crc kubenswrapper[4947]: I1203 
09:28:21.178248 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" event={"ID":"76762737-e806-4f14-bde0-1a8daa074767","Type":"ContainerDied","Data":"0421c355c3c18fb82a54e4a147c332998d71e5210fbc8c1fb69e59cff7391185"} Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.745596 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.774210 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj922\" (UniqueName: \"kubernetes.io/projected/76762737-e806-4f14-bde0-1a8daa074767-kube-api-access-nj922\") pod \"76762737-e806-4f14-bde0-1a8daa074767\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.774381 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-inventory\") pod \"76762737-e806-4f14-bde0-1a8daa074767\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.774462 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-ssh-key\") pod \"76762737-e806-4f14-bde0-1a8daa074767\" (UID: \"76762737-e806-4f14-bde0-1a8daa074767\") " Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.780383 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76762737-e806-4f14-bde0-1a8daa074767-kube-api-access-nj922" (OuterVolumeSpecName: "kube-api-access-nj922") pod "76762737-e806-4f14-bde0-1a8daa074767" (UID: "76762737-e806-4f14-bde0-1a8daa074767"). InnerVolumeSpecName "kube-api-access-nj922". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.805351 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-inventory" (OuterVolumeSpecName: "inventory") pod "76762737-e806-4f14-bde0-1a8daa074767" (UID: "76762737-e806-4f14-bde0-1a8daa074767"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.823211 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "76762737-e806-4f14-bde0-1a8daa074767" (UID: "76762737-e806-4f14-bde0-1a8daa074767"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.877486 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.877541 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/76762737-e806-4f14-bde0-1a8daa074767-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:22 crc kubenswrapper[4947]: I1203 09:28:22.877555 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj922\" (UniqueName: \"kubernetes.io/projected/76762737-e806-4f14-bde0-1a8daa074767-kube-api-access-nj922\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.198052 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" 
event={"ID":"76762737-e806-4f14-bde0-1a8daa074767","Type":"ContainerDied","Data":"25d3ef8b37cb5cf353e2bd342532a1df83dd937844db301323214a202b282cdf"} Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.198086 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell2-h4hrv" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.198092 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25d3ef8b37cb5cf353e2bd342532a1df83dd937844db301323214a202b282cdf" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.200816 4947 generic.go:334] "Generic (PLEG): container finished" podID="af4ee33c-879f-42de-af88-592510a53fb9" containerID="e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d" exitCode=0 Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.200861 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfpg7" event={"ID":"af4ee33c-879f-42de-af88-592510a53fb9","Type":"ContainerDied","Data":"e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d"} Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.368354 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell2-5x8jx"] Dec 03 09:28:23 crc kubenswrapper[4947]: E1203 09:28:23.368810 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76762737-e806-4f14-bde0-1a8daa074767" containerName="validate-network-openstack-openstack-cell2" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.368833 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="76762737-e806-4f14-bde0-1a8daa074767" containerName="validate-network-openstack-openstack-cell2" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.369044 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="76762737-e806-4f14-bde0-1a8daa074767" 
containerName="validate-network-openstack-openstack-cell2" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.369883 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.380113 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell2-5x8jx"] Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.420904 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.421579 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.524123 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49fz2\" (UniqueName: \"kubernetes.io/projected/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-kube-api-access-49fz2\") pod \"install-os-openstack-openstack-cell2-5x8jx\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.524791 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-ssh-key\") pod \"install-os-openstack-openstack-cell2-5x8jx\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.524881 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-inventory\") pod \"install-os-openstack-openstack-cell2-5x8jx\" (UID: 
\"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.627076 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49fz2\" (UniqueName: \"kubernetes.io/projected/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-kube-api-access-49fz2\") pod \"install-os-openstack-openstack-cell2-5x8jx\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.627132 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-ssh-key\") pod \"install-os-openstack-openstack-cell2-5x8jx\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.627211 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-inventory\") pod \"install-os-openstack-openstack-cell2-5x8jx\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.633199 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-inventory\") pod \"install-os-openstack-openstack-cell2-5x8jx\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:23 crc kubenswrapper[4947]: I1203 09:28:23.633616 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-ssh-key\") pod 
\"install-os-openstack-openstack-cell2-5x8jx\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:24 crc kubenswrapper[4947]: I1203 09:28:24.436607 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49fz2\" (UniqueName: \"kubernetes.io/projected/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-kube-api-access-49fz2\") pod \"install-os-openstack-openstack-cell2-5x8jx\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:24 crc kubenswrapper[4947]: I1203 09:28:24.642645 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:28:25 crc kubenswrapper[4947]: I1203 09:28:25.221262 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfpg7" event={"ID":"af4ee33c-879f-42de-af88-592510a53fb9","Type":"ContainerStarted","Data":"792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743"} Dec 03 09:28:25 crc kubenswrapper[4947]: I1203 09:28:25.222788 4947 generic.go:334] "Generic (PLEG): container finished" podID="5067a94c-5435-42d8-9e27-e5dfd3abe35f" containerID="7ac63c962262ed213bb1b7a75b538fc1c2ff00ca436bb2fc445ae44911610c1c" exitCode=0 Dec 03 09:28:25 crc kubenswrapper[4947]: I1203 09:28:25.222816 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" event={"ID":"5067a94c-5435-42d8-9e27-e5dfd3abe35f","Type":"ContainerDied","Data":"7ac63c962262ed213bb1b7a75b538fc1c2ff00ca436bb2fc445ae44911610c1c"} Dec 03 09:28:25 crc kubenswrapper[4947]: I1203 09:28:25.251585 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell2-5x8jx"] Dec 03 09:28:25 crc kubenswrapper[4947]: I1203 09:28:25.259079 4947 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-mfpg7" podStartSLOduration=3.733773544 podStartE2EDuration="6.259059424s" podCreationTimestamp="2025-12-03 09:28:19 +0000 UTC" firstStartedPulling="2025-12-03 09:28:21.174127812 +0000 UTC m=+9562.435082258" lastFinishedPulling="2025-12-03 09:28:23.699413712 +0000 UTC m=+9564.960368138" observedRunningTime="2025-12-03 09:28:25.244991224 +0000 UTC m=+9566.505945650" watchObservedRunningTime="2025-12-03 09:28:25.259059424 +0000 UTC m=+9566.520013850" Dec 03 09:28:25 crc kubenswrapper[4947]: W1203 09:28:25.260656 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfce62e9e_4b15_4a9b_9ac9_da544976d2fe.slice/crio-46564d7b718164ca70dff14d8be59b960e6837f13bb386c379a0b288e9940815 WatchSource:0}: Error finding container 46564d7b718164ca70dff14d8be59b960e6837f13bb386c379a0b288e9940815: Status 404 returned error can't find the container with id 46564d7b718164ca70dff14d8be59b960e6837f13bb386c379a0b288e9940815 Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.254118 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell2-5x8jx" event={"ID":"fce62e9e-4b15-4a9b-9ac9-da544976d2fe","Type":"ContainerStarted","Data":"3c4dafb5fa5688e3a892057335687cce045448cc6b5866522bae3c333dd2f383"} Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.254971 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell2-5x8jx" event={"ID":"fce62e9e-4b15-4a9b-9ac9-da544976d2fe","Type":"ContainerStarted","Data":"46564d7b718164ca70dff14d8be59b960e6837f13bb386c379a0b288e9940815"} Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.283068 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell2-5x8jx" podStartSLOduration=2.829828729 podStartE2EDuration="3.283039658s" 
podCreationTimestamp="2025-12-03 09:28:23 +0000 UTC" firstStartedPulling="2025-12-03 09:28:25.264379437 +0000 UTC m=+9566.525333863" lastFinishedPulling="2025-12-03 09:28:25.717590366 +0000 UTC m=+9566.978544792" observedRunningTime="2025-12-03 09:28:26.279225725 +0000 UTC m=+9567.540180161" watchObservedRunningTime="2025-12-03 09:28:26.283039658 +0000 UTC m=+9567.543994104" Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.803051 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.895438 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-inventory\") pod \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.895491 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-ssh-key\") pod \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.895583 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xb2d\" (UniqueName: \"kubernetes.io/projected/5067a94c-5435-42d8-9e27-e5dfd3abe35f-kube-api-access-4xb2d\") pod \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\" (UID: \"5067a94c-5435-42d8-9e27-e5dfd3abe35f\") " Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.899892 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5067a94c-5435-42d8-9e27-e5dfd3abe35f-kube-api-access-4xb2d" (OuterVolumeSpecName: "kube-api-access-4xb2d") pod "5067a94c-5435-42d8-9e27-e5dfd3abe35f" (UID: "5067a94c-5435-42d8-9e27-e5dfd3abe35f"). 
InnerVolumeSpecName "kube-api-access-4xb2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.923679 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-inventory" (OuterVolumeSpecName: "inventory") pod "5067a94c-5435-42d8-9e27-e5dfd3abe35f" (UID: "5067a94c-5435-42d8-9e27-e5dfd3abe35f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.926737 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5067a94c-5435-42d8-9e27-e5dfd3abe35f" (UID: "5067a94c-5435-42d8-9e27-e5dfd3abe35f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.997306 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.997336 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5067a94c-5435-42d8-9e27-e5dfd3abe35f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:26 crc kubenswrapper[4947]: I1203 09:28:26.997346 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xb2d\" (UniqueName: \"kubernetes.io/projected/5067a94c-5435-42d8-9e27-e5dfd3abe35f-kube-api-access-4xb2d\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.265004 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" 
event={"ID":"5067a94c-5435-42d8-9e27-e5dfd3abe35f","Type":"ContainerDied","Data":"3d6ca6bae2e1c4a64ad2aab59ac82f36c0e6867305d600b07f5a455dadc503b0"} Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.265407 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d6ca6bae2e1c4a64ad2aab59ac82f36c0e6867305d600b07f5a455dadc503b0" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.265530 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-2r4v7" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.371188 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-9szzl"] Dec 03 09:28:27 crc kubenswrapper[4947]: E1203 09:28:27.371756 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5067a94c-5435-42d8-9e27-e5dfd3abe35f" containerName="validate-network-openstack-openstack-cell1" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.371780 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5067a94c-5435-42d8-9e27-e5dfd3abe35f" containerName="validate-network-openstack-openstack-cell1" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.372106 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5067a94c-5435-42d8-9e27-e5dfd3abe35f" containerName="validate-network-openstack-openstack-cell1" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.373109 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.375696 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.375947 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.385141 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-9szzl"] Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.507294 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-ssh-key\") pod \"install-os-openstack-openstack-cell1-9szzl\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.507373 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxrt\" (UniqueName: \"kubernetes.io/projected/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-kube-api-access-bhxrt\") pod \"install-os-openstack-openstack-cell1-9szzl\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.507420 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-inventory\") pod \"install-os-openstack-openstack-cell1-9szzl\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.610138 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-ssh-key\") pod \"install-os-openstack-openstack-cell1-9szzl\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.610220 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxrt\" (UniqueName: \"kubernetes.io/projected/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-kube-api-access-bhxrt\") pod \"install-os-openstack-openstack-cell1-9szzl\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.610271 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-inventory\") pod \"install-os-openstack-openstack-cell1-9szzl\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.615092 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-inventory\") pod \"install-os-openstack-openstack-cell1-9szzl\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.616054 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-ssh-key\") pod \"install-os-openstack-openstack-cell1-9szzl\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: 
I1203 09:28:27.628867 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxrt\" (UniqueName: \"kubernetes.io/projected/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-kube-api-access-bhxrt\") pod \"install-os-openstack-openstack-cell1-9szzl\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:27 crc kubenswrapper[4947]: I1203 09:28:27.692485 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:28:28 crc kubenswrapper[4947]: I1203 09:28:28.215411 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-9szzl"] Dec 03 09:28:28 crc kubenswrapper[4947]: W1203 09:28:28.221695 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bb31b8a_1ed3_421b_a9a7_65ae531c90dd.slice/crio-5bf110db278c69151f9f5fdc8a05f9447e21897dcb303d4dc8fee1fd0f127f18 WatchSource:0}: Error finding container 5bf110db278c69151f9f5fdc8a05f9447e21897dcb303d4dc8fee1fd0f127f18: Status 404 returned error can't find the container with id 5bf110db278c69151f9f5fdc8a05f9447e21897dcb303d4dc8fee1fd0f127f18 Dec 03 09:28:28 crc kubenswrapper[4947]: I1203 09:28:28.276749 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-9szzl" event={"ID":"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd","Type":"ContainerStarted","Data":"5bf110db278c69151f9f5fdc8a05f9447e21897dcb303d4dc8fee1fd0f127f18"} Dec 03 09:28:29 crc kubenswrapper[4947]: I1203 09:28:29.286658 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-9szzl" event={"ID":"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd","Type":"ContainerStarted","Data":"3be871508bcbfc93838ef9349c156b258df98e6ae50eda75e771251f15fb2f40"} Dec 03 09:28:29 crc kubenswrapper[4947]: 
I1203 09:28:29.310528 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-9szzl" podStartSLOduration=1.757242748 podStartE2EDuration="2.31050752s" podCreationTimestamp="2025-12-03 09:28:27 +0000 UTC" firstStartedPulling="2025-12-03 09:28:28.223197935 +0000 UTC m=+9569.484152351" lastFinishedPulling="2025-12-03 09:28:28.776462697 +0000 UTC m=+9570.037417123" observedRunningTime="2025-12-03 09:28:29.298990389 +0000 UTC m=+9570.559944825" watchObservedRunningTime="2025-12-03 09:28:29.31050752 +0000 UTC m=+9570.571461946" Dec 03 09:28:29 crc kubenswrapper[4947]: I1203 09:28:29.598120 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:29 crc kubenswrapper[4947]: I1203 09:28:29.598159 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:29 crc kubenswrapper[4947]: I1203 09:28:29.699338 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:30 crc kubenswrapper[4947]: I1203 09:28:30.086589 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:28:30 crc kubenswrapper[4947]: I1203 09:28:30.086662 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:28:30 crc kubenswrapper[4947]: I1203 09:28:30.342788 4947 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:31 crc kubenswrapper[4947]: I1203 09:28:31.431027 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mfpg7"] Dec 03 09:28:32 crc kubenswrapper[4947]: I1203 09:28:32.313403 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mfpg7" podUID="af4ee33c-879f-42de-af88-592510a53fb9" containerName="registry-server" containerID="cri-o://792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743" gracePeriod=2 Dec 03 09:28:32 crc kubenswrapper[4947]: I1203 09:28:32.862203 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.020384 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-catalog-content\") pod \"af4ee33c-879f-42de-af88-592510a53fb9\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.020813 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpjjg\" (UniqueName: \"kubernetes.io/projected/af4ee33c-879f-42de-af88-592510a53fb9-kube-api-access-dpjjg\") pod \"af4ee33c-879f-42de-af88-592510a53fb9\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.020852 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-utilities\") pod \"af4ee33c-879f-42de-af88-592510a53fb9\" (UID: \"af4ee33c-879f-42de-af88-592510a53fb9\") " Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.022088 4947 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-utilities" (OuterVolumeSpecName: "utilities") pod "af4ee33c-879f-42de-af88-592510a53fb9" (UID: "af4ee33c-879f-42de-af88-592510a53fb9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.030944 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af4ee33c-879f-42de-af88-592510a53fb9-kube-api-access-dpjjg" (OuterVolumeSpecName: "kube-api-access-dpjjg") pod "af4ee33c-879f-42de-af88-592510a53fb9" (UID: "af4ee33c-879f-42de-af88-592510a53fb9"). InnerVolumeSpecName "kube-api-access-dpjjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.092138 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af4ee33c-879f-42de-af88-592510a53fb9" (UID: "af4ee33c-879f-42de-af88-592510a53fb9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.123649 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.123807 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpjjg\" (UniqueName: \"kubernetes.io/projected/af4ee33c-879f-42de-af88-592510a53fb9-kube-api-access-dpjjg\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.123935 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4ee33c-879f-42de-af88-592510a53fb9-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.330414 4947 generic.go:334] "Generic (PLEG): container finished" podID="af4ee33c-879f-42de-af88-592510a53fb9" containerID="792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743" exitCode=0 Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.330525 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mfpg7" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.330588 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfpg7" event={"ID":"af4ee33c-879f-42de-af88-592510a53fb9","Type":"ContainerDied","Data":"792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743"} Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.331056 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mfpg7" event={"ID":"af4ee33c-879f-42de-af88-592510a53fb9","Type":"ContainerDied","Data":"1ce9e48a4a99b3d22c3e5cd4f0042afc9457a311bcf82b293afce42ed75b8a49"} Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.331081 4947 scope.go:117] "RemoveContainer" containerID="792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.357659 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mfpg7"] Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.363725 4947 scope.go:117] "RemoveContainer" containerID="e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.365867 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mfpg7"] Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.389315 4947 scope.go:117] "RemoveContainer" containerID="167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.437036 4947 scope.go:117] "RemoveContainer" containerID="792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743" Dec 03 09:28:33 crc kubenswrapper[4947]: E1203 09:28:33.437971 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743\": container with ID starting with 792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743 not found: ID does not exist" containerID="792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.438010 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743"} err="failed to get container status \"792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743\": rpc error: code = NotFound desc = could not find container \"792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743\": container with ID starting with 792dd6074f9a95c6de21facd3888d9a20888a0f43b396d91a8ae5672e2b68743 not found: ID does not exist" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.438038 4947 scope.go:117] "RemoveContainer" containerID="e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d" Dec 03 09:28:33 crc kubenswrapper[4947]: E1203 09:28:33.438530 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d\": container with ID starting with e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d not found: ID does not exist" containerID="e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.438580 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d"} err="failed to get container status \"e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d\": rpc error: code = NotFound desc = could not find container \"e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d\": container with ID 
starting with e189a94c859107144ecf4f9bc333a739ff07473c231e07cb5f541c6fbff10a1d not found: ID does not exist" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.438617 4947 scope.go:117] "RemoveContainer" containerID="167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac" Dec 03 09:28:33 crc kubenswrapper[4947]: E1203 09:28:33.439026 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac\": container with ID starting with 167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac not found: ID does not exist" containerID="167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac" Dec 03 09:28:33 crc kubenswrapper[4947]: I1203 09:28:33.439057 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac"} err="failed to get container status \"167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac\": rpc error: code = NotFound desc = could not find container \"167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac\": container with ID starting with 167c41bd5ae0d638d1f809fa9159d1fb824fb3dad03d2fbae214f0059810afac not found: ID does not exist" Dec 03 09:28:35 crc kubenswrapper[4947]: I1203 09:28:35.099960 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af4ee33c-879f-42de-af88-592510a53fb9" path="/var/lib/kubelet/pods/af4ee33c-879f-42de-af88-592510a53fb9/volumes" Dec 03 09:29:00 crc kubenswrapper[4947]: I1203 09:29:00.087038 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:29:00 crc kubenswrapper[4947]: I1203 
09:29:00.088650 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:29:10 crc kubenswrapper[4947]: I1203 09:29:10.718117 4947 generic.go:334] "Generic (PLEG): container finished" podID="fce62e9e-4b15-4a9b-9ac9-da544976d2fe" containerID="3c4dafb5fa5688e3a892057335687cce045448cc6b5866522bae3c333dd2f383" exitCode=0 Dec 03 09:29:10 crc kubenswrapper[4947]: I1203 09:29:10.718166 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell2-5x8jx" event={"ID":"fce62e9e-4b15-4a9b-9ac9-da544976d2fe","Type":"ContainerDied","Data":"3c4dafb5fa5688e3a892057335687cce045448cc6b5866522bae3c333dd2f383"} Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.217554 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.273457 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-ssh-key\") pod \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.273561 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-inventory\") pod \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.273862 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49fz2\" (UniqueName: \"kubernetes.io/projected/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-kube-api-access-49fz2\") pod \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\" (UID: \"fce62e9e-4b15-4a9b-9ac9-da544976d2fe\") " Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.284959 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-kube-api-access-49fz2" (OuterVolumeSpecName: "kube-api-access-49fz2") pod "fce62e9e-4b15-4a9b-9ac9-da544976d2fe" (UID: "fce62e9e-4b15-4a9b-9ac9-da544976d2fe"). InnerVolumeSpecName "kube-api-access-49fz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.309318 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fce62e9e-4b15-4a9b-9ac9-da544976d2fe" (UID: "fce62e9e-4b15-4a9b-9ac9-da544976d2fe"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.330685 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-inventory" (OuterVolumeSpecName: "inventory") pod "fce62e9e-4b15-4a9b-9ac9-da544976d2fe" (UID: "fce62e9e-4b15-4a9b-9ac9-da544976d2fe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.376909 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49fz2\" (UniqueName: \"kubernetes.io/projected/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-kube-api-access-49fz2\") on node \"crc\" DevicePath \"\"" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.376955 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.376972 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fce62e9e-4b15-4a9b-9ac9-da544976d2fe-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.743076 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell2-5x8jx" event={"ID":"fce62e9e-4b15-4a9b-9ac9-da544976d2fe","Type":"ContainerDied","Data":"46564d7b718164ca70dff14d8be59b960e6837f13bb386c379a0b288e9940815"} Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.743403 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46564d7b718164ca70dff14d8be59b960e6837f13bb386c379a0b288e9940815" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.743127 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell2-5x8jx" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.745797 4947 generic.go:334] "Generic (PLEG): container finished" podID="3bb31b8a-1ed3-421b-a9a7-65ae531c90dd" containerID="3be871508bcbfc93838ef9349c156b258df98e6ae50eda75e771251f15fb2f40" exitCode=0 Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.745875 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-9szzl" event={"ID":"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd","Type":"ContainerDied","Data":"3be871508bcbfc93838ef9349c156b258df98e6ae50eda75e771251f15fb2f40"} Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.873837 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell2-fmlct"] Dec 03 09:29:12 crc kubenswrapper[4947]: E1203 09:29:12.874318 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4ee33c-879f-42de-af88-592510a53fb9" containerName="registry-server" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.874337 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4ee33c-879f-42de-af88-592510a53fb9" containerName="registry-server" Dec 03 09:29:12 crc kubenswrapper[4947]: E1203 09:29:12.874355 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4ee33c-879f-42de-af88-592510a53fb9" containerName="extract-content" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.874363 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4ee33c-879f-42de-af88-592510a53fb9" containerName="extract-content" Dec 03 09:29:12 crc kubenswrapper[4947]: E1203 09:29:12.874406 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fce62e9e-4b15-4a9b-9ac9-da544976d2fe" containerName="install-os-openstack-openstack-cell2" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.874415 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fce62e9e-4b15-4a9b-9ac9-da544976d2fe" containerName="install-os-openstack-openstack-cell2" Dec 03 09:29:12 crc kubenswrapper[4947]: E1203 09:29:12.874429 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4ee33c-879f-42de-af88-592510a53fb9" containerName="extract-utilities" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.874437 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4ee33c-879f-42de-af88-592510a53fb9" containerName="extract-utilities" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.874881 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4ee33c-879f-42de-af88-592510a53fb9" containerName="registry-server" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.874917 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fce62e9e-4b15-4a9b-9ac9-da544976d2fe" containerName="install-os-openstack-openstack-cell2" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.875968 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.880075 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.880386 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.894530 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell2-fmlct"] Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.988800 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-ssh-key\") pod \"configure-os-openstack-openstack-cell2-fmlct\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.988913 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-inventory\") pod \"configure-os-openstack-openstack-cell2-fmlct\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:12 crc kubenswrapper[4947]: I1203 09:29:12.988961 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6tg6\" (UniqueName: \"kubernetes.io/projected/29fab9b2-d93c-432c-8563-f28b9af0e313-kube-api-access-r6tg6\") pod \"configure-os-openstack-openstack-cell2-fmlct\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:13 crc kubenswrapper[4947]: I1203 09:29:13.090898 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-inventory\") pod \"configure-os-openstack-openstack-cell2-fmlct\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:13 crc kubenswrapper[4947]: I1203 09:29:13.090940 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6tg6\" (UniqueName: \"kubernetes.io/projected/29fab9b2-d93c-432c-8563-f28b9af0e313-kube-api-access-r6tg6\") pod \"configure-os-openstack-openstack-cell2-fmlct\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:13 crc kubenswrapper[4947]: I1203 09:29:13.091069 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-ssh-key\") pod \"configure-os-openstack-openstack-cell2-fmlct\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:13 crc kubenswrapper[4947]: I1203 09:29:13.095951 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-inventory\") pod \"configure-os-openstack-openstack-cell2-fmlct\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:13 crc kubenswrapper[4947]: I1203 09:29:13.096062 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-ssh-key\") pod \"configure-os-openstack-openstack-cell2-fmlct\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:13 crc 
kubenswrapper[4947]: I1203 09:29:13.109010 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6tg6\" (UniqueName: \"kubernetes.io/projected/29fab9b2-d93c-432c-8563-f28b9af0e313-kube-api-access-r6tg6\") pod \"configure-os-openstack-openstack-cell2-fmlct\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:13 crc kubenswrapper[4947]: I1203 09:29:13.194418 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:29:13 crc kubenswrapper[4947]: I1203 09:29:13.796487 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell2-fmlct"] Dec 03 09:29:13 crc kubenswrapper[4947]: W1203 09:29:13.804406 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29fab9b2_d93c_432c_8563_f28b9af0e313.slice/crio-b5fb92bc5383f2a483ced8781a65ed2984d7cd7e78fb690ac602038823295ad1 WatchSource:0}: Error finding container b5fb92bc5383f2a483ced8781a65ed2984d7cd7e78fb690ac602038823295ad1: Status 404 returned error can't find the container with id b5fb92bc5383f2a483ced8781a65ed2984d7cd7e78fb690ac602038823295ad1 Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.272939 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.422162 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-ssh-key\") pod \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.422596 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhxrt\" (UniqueName: \"kubernetes.io/projected/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-kube-api-access-bhxrt\") pod \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.422693 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-inventory\") pod \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\" (UID: \"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd\") " Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.428006 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-kube-api-access-bhxrt" (OuterVolumeSpecName: "kube-api-access-bhxrt") pod "3bb31b8a-1ed3-421b-a9a7-65ae531c90dd" (UID: "3bb31b8a-1ed3-421b-a9a7-65ae531c90dd"). InnerVolumeSpecName "kube-api-access-bhxrt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.461064 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3bb31b8a-1ed3-421b-a9a7-65ae531c90dd" (UID: "3bb31b8a-1ed3-421b-a9a7-65ae531c90dd"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.462817 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-inventory" (OuterVolumeSpecName: "inventory") pod "3bb31b8a-1ed3-421b-a9a7-65ae531c90dd" (UID: "3bb31b8a-1ed3-421b-a9a7-65ae531c90dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.525646 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhxrt\" (UniqueName: \"kubernetes.io/projected/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-kube-api-access-bhxrt\") on node \"crc\" DevicePath \"\"" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.525802 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.525895 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3bb31b8a-1ed3-421b-a9a7-65ae531c90dd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.768684 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell2-fmlct" event={"ID":"29fab9b2-d93c-432c-8563-f28b9af0e313","Type":"ContainerStarted","Data":"d54419f4a06eabbe58b013e971192dfca69e5314a786c91b6b04581005246250"} Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.768723 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell2-fmlct" event={"ID":"29fab9b2-d93c-432c-8563-f28b9af0e313","Type":"ContainerStarted","Data":"b5fb92bc5383f2a483ced8781a65ed2984d7cd7e78fb690ac602038823295ad1"} Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.770769 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-9szzl" event={"ID":"3bb31b8a-1ed3-421b-a9a7-65ae531c90dd","Type":"ContainerDied","Data":"5bf110db278c69151f9f5fdc8a05f9447e21897dcb303d4dc8fee1fd0f127f18"} Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.770806 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bf110db278c69151f9f5fdc8a05f9447e21897dcb303d4dc8fee1fd0f127f18" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.770813 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-9szzl" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.807553 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell2-fmlct" podStartSLOduration=2.048856915 podStartE2EDuration="2.807529439s" podCreationTimestamp="2025-12-03 09:29:12 +0000 UTC" firstStartedPulling="2025-12-03 09:29:13.811003816 +0000 UTC m=+9615.071958252" lastFinishedPulling="2025-12-03 09:29:14.56967635 +0000 UTC m=+9615.830630776" observedRunningTime="2025-12-03 09:29:14.795333339 +0000 UTC m=+9616.056287765" watchObservedRunningTime="2025-12-03 09:29:14.807529439 +0000 UTC m=+9616.068483855" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.860015 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-j4sjz"] Dec 03 09:29:14 crc kubenswrapper[4947]: E1203 09:29:14.860570 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb31b8a-1ed3-421b-a9a7-65ae531c90dd" containerName="install-os-openstack-openstack-cell1" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.860594 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb31b8a-1ed3-421b-a9a7-65ae531c90dd" containerName="install-os-openstack-openstack-cell1" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.860862 
4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb31b8a-1ed3-421b-a9a7-65ae531c90dd" containerName="install-os-openstack-openstack-cell1" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.861808 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.865823 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.866039 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.874772 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-j4sjz"] Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.934272 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-ssh-key\") pod \"configure-os-openstack-openstack-cell1-j4sjz\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.934681 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-inventory\") pod \"configure-os-openstack-openstack-cell1-j4sjz\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:14 crc kubenswrapper[4947]: I1203 09:29:14.934856 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t558h\" (UniqueName: 
\"kubernetes.io/projected/a6d93b30-a643-4185-975a-a650351e70d5-kube-api-access-t558h\") pod \"configure-os-openstack-openstack-cell1-j4sjz\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:15 crc kubenswrapper[4947]: I1203 09:29:15.037289 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-ssh-key\") pod \"configure-os-openstack-openstack-cell1-j4sjz\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:15 crc kubenswrapper[4947]: I1203 09:29:15.037391 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-inventory\") pod \"configure-os-openstack-openstack-cell1-j4sjz\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:15 crc kubenswrapper[4947]: I1203 09:29:15.037463 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t558h\" (UniqueName: \"kubernetes.io/projected/a6d93b30-a643-4185-975a-a650351e70d5-kube-api-access-t558h\") pod \"configure-os-openstack-openstack-cell1-j4sjz\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:15 crc kubenswrapper[4947]: I1203 09:29:15.043628 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-inventory\") pod \"configure-os-openstack-openstack-cell1-j4sjz\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:15 crc kubenswrapper[4947]: I1203 09:29:15.043632 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-ssh-key\") pod \"configure-os-openstack-openstack-cell1-j4sjz\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:15 crc kubenswrapper[4947]: I1203 09:29:15.055573 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t558h\" (UniqueName: \"kubernetes.io/projected/a6d93b30-a643-4185-975a-a650351e70d5-kube-api-access-t558h\") pod \"configure-os-openstack-openstack-cell1-j4sjz\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:15 crc kubenswrapper[4947]: I1203 09:29:15.182088 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:29:15 crc kubenswrapper[4947]: I1203 09:29:15.748318 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-j4sjz"] Dec 03 09:29:15 crc kubenswrapper[4947]: I1203 09:29:15.788890 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" event={"ID":"a6d93b30-a643-4185-975a-a650351e70d5","Type":"ContainerStarted","Data":"c9d30f9d7a6d9426c4a4d449fd5b6fefd58c0f8756798d0eee9ea428cc019150"} Dec 03 09:29:16 crc kubenswrapper[4947]: I1203 09:29:16.798191 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" event={"ID":"a6d93b30-a643-4185-975a-a650351e70d5","Type":"ContainerStarted","Data":"8911f75560afc042b23967d01f75ff7d493b3cd22cd1084e7465a5219e1f46cf"} Dec 03 09:29:16 crc kubenswrapper[4947]: I1203 09:29:16.824043 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" podStartSLOduration=2.353199432 
podStartE2EDuration="2.824022297s" podCreationTimestamp="2025-12-03 09:29:14 +0000 UTC" firstStartedPulling="2025-12-03 09:29:15.751840069 +0000 UTC m=+9617.012794505" lastFinishedPulling="2025-12-03 09:29:16.222662904 +0000 UTC m=+9617.483617370" observedRunningTime="2025-12-03 09:29:16.812249859 +0000 UTC m=+9618.073204285" watchObservedRunningTime="2025-12-03 09:29:16.824022297 +0000 UTC m=+9618.084976723" Dec 03 09:29:30 crc kubenswrapper[4947]: I1203 09:29:30.086036 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:29:30 crc kubenswrapper[4947]: I1203 09:29:30.086635 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:29:30 crc kubenswrapper[4947]: I1203 09:29:30.086696 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 09:29:30 crc kubenswrapper[4947]: I1203 09:29:30.087560 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bb8680f19120bd6e546ee45532bd7955e35871304affa4095bcd96a25cdccb1"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:29:30 crc kubenswrapper[4947]: I1203 09:29:30.087680 4947 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://4bb8680f19120bd6e546ee45532bd7955e35871304affa4095bcd96a25cdccb1" gracePeriod=600 Dec 03 09:29:30 crc kubenswrapper[4947]: I1203 09:29:30.928464 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="4bb8680f19120bd6e546ee45532bd7955e35871304affa4095bcd96a25cdccb1" exitCode=0 Dec 03 09:29:30 crc kubenswrapper[4947]: I1203 09:29:30.928582 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"4bb8680f19120bd6e546ee45532bd7955e35871304affa4095bcd96a25cdccb1"} Dec 03 09:29:30 crc kubenswrapper[4947]: I1203 09:29:30.928995 4947 scope.go:117] "RemoveContainer" containerID="36832975f9f2a9e9905cc4d0900112f86eae651a7e985348ddbe79baf680e548" Dec 03 09:29:31 crc kubenswrapper[4947]: I1203 09:29:31.943569 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"} Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.143269 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff"] Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.145381 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.147652 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.148266 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.162551 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff"] Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.308187 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6ls4\" (UniqueName: \"kubernetes.io/projected/ef663b37-7b25-4a1d-81e4-e33b06d90a30-kube-api-access-n6ls4\") pod \"collect-profiles-29412570-fj6ff\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.308512 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef663b37-7b25-4a1d-81e4-e33b06d90a30-config-volume\") pod \"collect-profiles-29412570-fj6ff\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.308678 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef663b37-7b25-4a1d-81e4-e33b06d90a30-secret-volume\") pod \"collect-profiles-29412570-fj6ff\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.410373 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef663b37-7b25-4a1d-81e4-e33b06d90a30-secret-volume\") pod \"collect-profiles-29412570-fj6ff\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.410589 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6ls4\" (UniqueName: \"kubernetes.io/projected/ef663b37-7b25-4a1d-81e4-e33b06d90a30-kube-api-access-n6ls4\") pod \"collect-profiles-29412570-fj6ff\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.410643 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef663b37-7b25-4a1d-81e4-e33b06d90a30-config-volume\") pod \"collect-profiles-29412570-fj6ff\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.411740 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef663b37-7b25-4a1d-81e4-e33b06d90a30-config-volume\") pod \"collect-profiles-29412570-fj6ff\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.417357 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ef663b37-7b25-4a1d-81e4-e33b06d90a30-secret-volume\") pod \"collect-profiles-29412570-fj6ff\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.427247 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6ls4\" (UniqueName: \"kubernetes.io/projected/ef663b37-7b25-4a1d-81e4-e33b06d90a30-kube-api-access-n6ls4\") pod \"collect-profiles-29412570-fj6ff\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.471035 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:00 crc kubenswrapper[4947]: I1203 09:30:00.944970 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff"] Dec 03 09:30:00 crc kubenswrapper[4947]: W1203 09:30:00.954699 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef663b37_7b25_4a1d_81e4_e33b06d90a30.slice/crio-1df10672ec611b8d1f2deb4d523d553715e69981cbd923b2dfb6724b5a9e16a9 WatchSource:0}: Error finding container 1df10672ec611b8d1f2deb4d523d553715e69981cbd923b2dfb6724b5a9e16a9: Status 404 returned error can't find the container with id 1df10672ec611b8d1f2deb4d523d553715e69981cbd923b2dfb6724b5a9e16a9 Dec 03 09:30:01 crc kubenswrapper[4947]: I1203 09:30:01.256996 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" event={"ID":"ef663b37-7b25-4a1d-81e4-e33b06d90a30","Type":"ContainerStarted","Data":"65dcc688c64324e9dd2fadfdccb34b59f41935e7a00e4a414bce422cae9c40ae"} Dec 03 09:30:01 crc 
kubenswrapper[4947]: I1203 09:30:01.257342 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" event={"ID":"ef663b37-7b25-4a1d-81e4-e33b06d90a30","Type":"ContainerStarted","Data":"1df10672ec611b8d1f2deb4d523d553715e69981cbd923b2dfb6724b5a9e16a9"} Dec 03 09:30:01 crc kubenswrapper[4947]: I1203 09:30:01.300364 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" podStartSLOduration=1.300336301 podStartE2EDuration="1.300336301s" podCreationTimestamp="2025-12-03 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:30:01.281460031 +0000 UTC m=+9662.542414457" watchObservedRunningTime="2025-12-03 09:30:01.300336301 +0000 UTC m=+9662.561290727" Dec 03 09:30:02 crc kubenswrapper[4947]: I1203 09:30:02.269758 4947 generic.go:334] "Generic (PLEG): container finished" podID="ef663b37-7b25-4a1d-81e4-e33b06d90a30" containerID="65dcc688c64324e9dd2fadfdccb34b59f41935e7a00e4a414bce422cae9c40ae" exitCode=0 Dec 03 09:30:02 crc kubenswrapper[4947]: I1203 09:30:02.269813 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" event={"ID":"ef663b37-7b25-4a1d-81e4-e33b06d90a30","Type":"ContainerDied","Data":"65dcc688c64324e9dd2fadfdccb34b59f41935e7a00e4a414bce422cae9c40ae"} Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.155950 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.291996 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef663b37-7b25-4a1d-81e4-e33b06d90a30-secret-volume\") pod \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.292101 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6ls4\" (UniqueName: \"kubernetes.io/projected/ef663b37-7b25-4a1d-81e4-e33b06d90a30-kube-api-access-n6ls4\") pod \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.292169 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef663b37-7b25-4a1d-81e4-e33b06d90a30-config-volume\") pod \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\" (UID: \"ef663b37-7b25-4a1d-81e4-e33b06d90a30\") " Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.293439 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef663b37-7b25-4a1d-81e4-e33b06d90a30-config-volume" (OuterVolumeSpecName: "config-volume") pod "ef663b37-7b25-4a1d-81e4-e33b06d90a30" (UID: "ef663b37-7b25-4a1d-81e4-e33b06d90a30"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.293667 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef663b37-7b25-4a1d-81e4-e33b06d90a30-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.294214 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" event={"ID":"ef663b37-7b25-4a1d-81e4-e33b06d90a30","Type":"ContainerDied","Data":"1df10672ec611b8d1f2deb4d523d553715e69981cbd923b2dfb6724b5a9e16a9"} Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.294240 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1df10672ec611b8d1f2deb4d523d553715e69981cbd923b2dfb6724b5a9e16a9" Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.294297 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff" Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.334566 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef663b37-7b25-4a1d-81e4-e33b06d90a30-kube-api-access-n6ls4" (OuterVolumeSpecName: "kube-api-access-n6ls4") pod "ef663b37-7b25-4a1d-81e4-e33b06d90a30" (UID: "ef663b37-7b25-4a1d-81e4-e33b06d90a30"). InnerVolumeSpecName "kube-api-access-n6ls4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.334810 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef663b37-7b25-4a1d-81e4-e33b06d90a30-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ef663b37-7b25-4a1d-81e4-e33b06d90a30" (UID: "ef663b37-7b25-4a1d-81e4-e33b06d90a30"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.361008 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb"] Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.369949 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-lpltb"] Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.395705 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef663b37-7b25-4a1d-81e4-e33b06d90a30-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:04 crc kubenswrapper[4947]: I1203 09:30:04.395740 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6ls4\" (UniqueName: \"kubernetes.io/projected/ef663b37-7b25-4a1d-81e4-e33b06d90a30-kube-api-access-n6ls4\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:05 crc kubenswrapper[4947]: I1203 09:30:05.098818 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603e2092-218a-4357-9eba-a012f333e76f" path="/var/lib/kubelet/pods/603e2092-218a-4357-9eba-a012f333e76f/volumes" Dec 03 09:30:05 crc kubenswrapper[4947]: I1203 09:30:05.304963 4947 generic.go:334] "Generic (PLEG): container finished" podID="a6d93b30-a643-4185-975a-a650351e70d5" containerID="8911f75560afc042b23967d01f75ff7d493b3cd22cd1084e7465a5219e1f46cf" exitCode=0 Dec 03 09:30:05 crc kubenswrapper[4947]: I1203 09:30:05.305049 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" event={"ID":"a6d93b30-a643-4185-975a-a650351e70d5","Type":"ContainerDied","Data":"8911f75560afc042b23967d01f75ff7d493b3cd22cd1084e7465a5219e1f46cf"} Dec 03 09:30:05 crc kubenswrapper[4947]: I1203 09:30:05.309970 4947 generic.go:334] "Generic (PLEG): container finished" podID="29fab9b2-d93c-432c-8563-f28b9af0e313" 
containerID="d54419f4a06eabbe58b013e971192dfca69e5314a786c91b6b04581005246250" exitCode=0 Dec 03 09:30:05 crc kubenswrapper[4947]: I1203 09:30:05.310012 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell2-fmlct" event={"ID":"29fab9b2-d93c-432c-8563-f28b9af0e313","Type":"ContainerDied","Data":"d54419f4a06eabbe58b013e971192dfca69e5314a786c91b6b04581005246250"} Dec 03 09:30:06 crc kubenswrapper[4947]: I1203 09:30:06.786416 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:30:06 crc kubenswrapper[4947]: I1203 09:30:06.946992 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-ssh-key\") pod \"29fab9b2-d93c-432c-8563-f28b9af0e313\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " Dec 03 09:30:06 crc kubenswrapper[4947]: I1203 09:30:06.947115 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-inventory\") pod \"29fab9b2-d93c-432c-8563-f28b9af0e313\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " Dec 03 09:30:06 crc kubenswrapper[4947]: I1203 09:30:06.947320 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6tg6\" (UniqueName: \"kubernetes.io/projected/29fab9b2-d93c-432c-8563-f28b9af0e313-kube-api-access-r6tg6\") pod \"29fab9b2-d93c-432c-8563-f28b9af0e313\" (UID: \"29fab9b2-d93c-432c-8563-f28b9af0e313\") " Dec 03 09:30:06 crc kubenswrapper[4947]: I1203 09:30:06.953607 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29fab9b2-d93c-432c-8563-f28b9af0e313-kube-api-access-r6tg6" (OuterVolumeSpecName: "kube-api-access-r6tg6") pod "29fab9b2-d93c-432c-8563-f28b9af0e313" (UID: 
"29fab9b2-d93c-432c-8563-f28b9af0e313"). InnerVolumeSpecName "kube-api-access-r6tg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:30:06 crc kubenswrapper[4947]: I1203 09:30:06.998669 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-inventory" (OuterVolumeSpecName: "inventory") pod "29fab9b2-d93c-432c-8563-f28b9af0e313" (UID: "29fab9b2-d93c-432c-8563-f28b9af0e313"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.010790 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29fab9b2-d93c-432c-8563-f28b9af0e313" (UID: "29fab9b2-d93c-432c-8563-f28b9af0e313"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.050049 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6tg6\" (UniqueName: \"kubernetes.io/projected/29fab9b2-d93c-432c-8563-f28b9af0e313-kube-api-access-r6tg6\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.050082 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.050095 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29fab9b2-d93c-432c-8563-f28b9af0e313-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.334238 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell2-fmlct" 
event={"ID":"29fab9b2-d93c-432c-8563-f28b9af0e313","Type":"ContainerDied","Data":"b5fb92bc5383f2a483ced8781a65ed2984d7cd7e78fb690ac602038823295ad1"} Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.334572 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5fb92bc5383f2a483ced8781a65ed2984d7cd7e78fb690ac602038823295ad1" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.334414 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell2-fmlct" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.422971 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-qcps8"] Dec 03 09:30:07 crc kubenswrapper[4947]: E1203 09:30:07.423721 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29fab9b2-d93c-432c-8563-f28b9af0e313" containerName="configure-os-openstack-openstack-cell2" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.423742 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="29fab9b2-d93c-432c-8563-f28b9af0e313" containerName="configure-os-openstack-openstack-cell2" Dec 03 09:30:07 crc kubenswrapper[4947]: E1203 09:30:07.423755 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef663b37-7b25-4a1d-81e4-e33b06d90a30" containerName="collect-profiles" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.423763 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef663b37-7b25-4a1d-81e4-e33b06d90a30" containerName="collect-profiles" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.425139 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="29fab9b2-d93c-432c-8563-f28b9af0e313" containerName="configure-os-openstack-openstack-cell2" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.425178 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef663b37-7b25-4a1d-81e4-e33b06d90a30" 
containerName="collect-profiles" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.426030 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.428711 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.428911 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.465184 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-qcps8"] Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.511109 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.563997 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-1\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.564052 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell2\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell2\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.564082 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xmtz\" (UniqueName: 
\"kubernetes.io/projected/f1af51fb-9081-44b1-9408-7d267894140c-kube-api-access-4xmtz\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.564125 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-0\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.564167 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.665757 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-ssh-key\") pod \"a6d93b30-a643-4185-975a-a650351e70d5\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.666174 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t558h\" (UniqueName: \"kubernetes.io/projected/a6d93b30-a643-4185-975a-a650351e70d5-kube-api-access-t558h\") pod \"a6d93b30-a643-4185-975a-a650351e70d5\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.666362 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-inventory\") pod \"a6d93b30-a643-4185-975a-a650351e70d5\" (UID: \"a6d93b30-a643-4185-975a-a650351e70d5\") " Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.666794 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.667046 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-1\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.667159 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell2\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell2\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.667269 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xmtz\" (UniqueName: \"kubernetes.io/projected/f1af51fb-9081-44b1-9408-7d267894140c-kube-api-access-4xmtz\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.667407 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-0\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.670324 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.670449 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-1\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.670742 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d93b30-a643-4185-975a-a650351e70d5-kube-api-access-t558h" (OuterVolumeSpecName: "kube-api-access-t558h") pod "a6d93b30-a643-4185-975a-a650351e70d5" (UID: "a6d93b30-a643-4185-975a-a650351e70d5"). InnerVolumeSpecName "kube-api-access-t558h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.671471 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell2\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell2\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.672691 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-0\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.686476 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xmtz\" (UniqueName: \"kubernetes.io/projected/f1af51fb-9081-44b1-9408-7d267894140c-kube-api-access-4xmtz\") pod \"ssh-known-hosts-openstack-qcps8\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.693366 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a6d93b30-a643-4185-975a-a650351e70d5" (UID: "a6d93b30-a643-4185-975a-a650351e70d5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.695918 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-inventory" (OuterVolumeSpecName: "inventory") pod "a6d93b30-a643-4185-975a-a650351e70d5" (UID: "a6d93b30-a643-4185-975a-a650351e70d5"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.768875 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t558h\" (UniqueName: \"kubernetes.io/projected/a6d93b30-a643-4185-975a-a650351e70d5-kube-api-access-t558h\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.768910 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.768919 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a6d93b30-a643-4185-975a-a650351e70d5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:07 crc kubenswrapper[4947]: I1203 09:30:07.804707 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.347233 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" event={"ID":"a6d93b30-a643-4185-975a-a650351e70d5","Type":"ContainerDied","Data":"c9d30f9d7a6d9426c4a4d449fd5b6fefd58c0f8756798d0eee9ea428cc019150"} Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.347734 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9d30f9d7a6d9426c4a4d449fd5b6fefd58c0f8756798d0eee9ea428cc019150" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.347284 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-j4sjz" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.393758 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-qcps8"] Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.611859 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fxwdw"] Dec 03 09:30:08 crc kubenswrapper[4947]: E1203 09:30:08.612354 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d93b30-a643-4185-975a-a650351e70d5" containerName="configure-os-openstack-openstack-cell1" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.612376 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d93b30-a643-4185-975a-a650351e70d5" containerName="configure-os-openstack-openstack-cell1" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.612694 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d93b30-a643-4185-975a-a650351e70d5" containerName="configure-os-openstack-openstack-cell1" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.613672 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.622857 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.631984 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fxwdw"] Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.789986 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-ssh-key\") pod \"run-os-openstack-openstack-cell1-fxwdw\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.790103 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4j22\" (UniqueName: \"kubernetes.io/projected/6534752f-c509-4d12-9123-599ba4eb1da7-kube-api-access-z4j22\") pod \"run-os-openstack-openstack-cell1-fxwdw\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.790249 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-inventory\") pod \"run-os-openstack-openstack-cell1-fxwdw\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.891753 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-ssh-key\") pod \"run-os-openstack-openstack-cell1-fxwdw\" (UID: 
\"6534752f-c509-4d12-9123-599ba4eb1da7\") " pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.891964 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4j22\" (UniqueName: \"kubernetes.io/projected/6534752f-c509-4d12-9123-599ba4eb1da7-kube-api-access-z4j22\") pod \"run-os-openstack-openstack-cell1-fxwdw\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.892002 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-inventory\") pod \"run-os-openstack-openstack-cell1-fxwdw\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.899011 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-inventory\") pod \"run-os-openstack-openstack-cell1-fxwdw\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.903064 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-ssh-key\") pod \"run-os-openstack-openstack-cell1-fxwdw\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.912245 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4j22\" (UniqueName: \"kubernetes.io/projected/6534752f-c509-4d12-9123-599ba4eb1da7-kube-api-access-z4j22\") pod \"run-os-openstack-openstack-cell1-fxwdw\" 
(UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:08 crc kubenswrapper[4947]: I1203 09:30:08.944173 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:09 crc kubenswrapper[4947]: I1203 09:30:09.359971 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qcps8" event={"ID":"f1af51fb-9081-44b1-9408-7d267894140c","Type":"ContainerStarted","Data":"c357a21723cbb9fadff35ca25b4b7be2919a2f3521d4f14a6b5e6e8799013b44"} Dec 03 09:30:09 crc kubenswrapper[4947]: I1203 09:30:09.584586 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fxwdw"] Dec 03 09:30:09 crc kubenswrapper[4947]: W1203 09:30:09.592178 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6534752f_c509_4d12_9123_599ba4eb1da7.slice/crio-9a7834e6a868cb5f70d0183a0a1ea436b79c796c5ad52fe61a422392e0d33bfe WatchSource:0}: Error finding container 9a7834e6a868cb5f70d0183a0a1ea436b79c796c5ad52fe61a422392e0d33bfe: Status 404 returned error can't find the container with id 9a7834e6a868cb5f70d0183a0a1ea436b79c796c5ad52fe61a422392e0d33bfe Dec 03 09:30:10 crc kubenswrapper[4947]: I1203 09:30:10.372903 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fxwdw" event={"ID":"6534752f-c509-4d12-9123-599ba4eb1da7","Type":"ContainerStarted","Data":"1249792bb64b82783cdc5bfc6d2b9cf11986cccb52e93f3a28e22d462244545b"} Dec 03 09:30:10 crc kubenswrapper[4947]: I1203 09:30:10.373249 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fxwdw" event={"ID":"6534752f-c509-4d12-9123-599ba4eb1da7","Type":"ContainerStarted","Data":"9a7834e6a868cb5f70d0183a0a1ea436b79c796c5ad52fe61a422392e0d33bfe"} Dec 03 09:30:10 
crc kubenswrapper[4947]: I1203 09:30:10.375163 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qcps8" event={"ID":"f1af51fb-9081-44b1-9408-7d267894140c","Type":"ContainerStarted","Data":"17f25f75efe9d88082f9a2892b13db5f07be1229f32f67dfe1650ec2d24de8d6"} Dec 03 09:30:10 crc kubenswrapper[4947]: I1203 09:30:10.395102 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-fxwdw" podStartSLOduration=1.904216535 podStartE2EDuration="2.395083202s" podCreationTimestamp="2025-12-03 09:30:08 +0000 UTC" firstStartedPulling="2025-12-03 09:30:09.59435173 +0000 UTC m=+9670.855306156" lastFinishedPulling="2025-12-03 09:30:10.085218397 +0000 UTC m=+9671.346172823" observedRunningTime="2025-12-03 09:30:10.39167201 +0000 UTC m=+9671.652626446" watchObservedRunningTime="2025-12-03 09:30:10.395083202 +0000 UTC m=+9671.656037628" Dec 03 09:30:10 crc kubenswrapper[4947]: I1203 09:30:10.424163 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-qcps8" podStartSLOduration=2.8386426030000003 podStartE2EDuration="3.424131267s" podCreationTimestamp="2025-12-03 09:30:07 +0000 UTC" firstStartedPulling="2025-12-03 09:30:08.396666121 +0000 UTC m=+9669.657620547" lastFinishedPulling="2025-12-03 09:30:08.982154785 +0000 UTC m=+9670.243109211" observedRunningTime="2025-12-03 09:30:10.414394444 +0000 UTC m=+9671.675348890" watchObservedRunningTime="2025-12-03 09:30:10.424131267 +0000 UTC m=+9671.685085713" Dec 03 09:30:19 crc kubenswrapper[4947]: I1203 09:30:19.502846 4947 generic.go:334] "Generic (PLEG): container finished" podID="f1af51fb-9081-44b1-9408-7d267894140c" containerID="17f25f75efe9d88082f9a2892b13db5f07be1229f32f67dfe1650ec2d24de8d6" exitCode=0 Dec 03 09:30:19 crc kubenswrapper[4947]: I1203 09:30:19.502941 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qcps8" 
event={"ID":"f1af51fb-9081-44b1-9408-7d267894140c","Type":"ContainerDied","Data":"17f25f75efe9d88082f9a2892b13db5f07be1229f32f67dfe1650ec2d24de8d6"} Dec 03 09:30:19 crc kubenswrapper[4947]: I1203 09:30:19.505762 4947 generic.go:334] "Generic (PLEG): container finished" podID="6534752f-c509-4d12-9123-599ba4eb1da7" containerID="1249792bb64b82783cdc5bfc6d2b9cf11986cccb52e93f3a28e22d462244545b" exitCode=0 Dec 03 09:30:19 crc kubenswrapper[4947]: I1203 09:30:19.505804 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fxwdw" event={"ID":"6534752f-c509-4d12-9123-599ba4eb1da7","Type":"ContainerDied","Data":"1249792bb64b82783cdc5bfc6d2b9cf11986cccb52e93f3a28e22d462244545b"} Dec 03 09:30:20 crc kubenswrapper[4947]: I1203 09:30:20.979955 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.060523 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-inventory\") pod \"6534752f-c509-4d12-9123-599ba4eb1da7\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.060818 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4j22\" (UniqueName: \"kubernetes.io/projected/6534752f-c509-4d12-9123-599ba4eb1da7-kube-api-access-z4j22\") pod \"6534752f-c509-4d12-9123-599ba4eb1da7\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.060925 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-ssh-key\") pod \"6534752f-c509-4d12-9123-599ba4eb1da7\" (UID: \"6534752f-c509-4d12-9123-599ba4eb1da7\") " Dec 03 09:30:21 crc 
kubenswrapper[4947]: I1203 09:30:21.066345 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6534752f-c509-4d12-9123-599ba4eb1da7-kube-api-access-z4j22" (OuterVolumeSpecName: "kube-api-access-z4j22") pod "6534752f-c509-4d12-9123-599ba4eb1da7" (UID: "6534752f-c509-4d12-9123-599ba4eb1da7"). InnerVolumeSpecName "kube-api-access-z4j22". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.097250 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6534752f-c509-4d12-9123-599ba4eb1da7" (UID: "6534752f-c509-4d12-9123-599ba4eb1da7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.107788 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-inventory" (OuterVolumeSpecName: "inventory") pod "6534752f-c509-4d12-9123-599ba4eb1da7" (UID: "6534752f-c509-4d12-9123-599ba4eb1da7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.163921 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.163983 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6534752f-c509-4d12-9123-599ba4eb1da7-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.164004 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4j22\" (UniqueName: \"kubernetes.io/projected/6534752f-c509-4d12-9123-599ba4eb1da7-kube-api-access-z4j22\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.526748 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fxwdw" event={"ID":"6534752f-c509-4d12-9123-599ba4eb1da7","Type":"ContainerDied","Data":"9a7834e6a868cb5f70d0183a0a1ea436b79c796c5ad52fe61a422392e0d33bfe"} Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.527081 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a7834e6a868cb5f70d0183a0a1ea436b79c796c5ad52fe61a422392e0d33bfe" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.526806 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fxwdw" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.649878 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-2t7q6"] Dec 03 09:30:21 crc kubenswrapper[4947]: E1203 09:30:21.651071 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6534752f-c509-4d12-9123-599ba4eb1da7" containerName="run-os-openstack-openstack-cell1" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.651108 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6534752f-c509-4d12-9123-599ba4eb1da7" containerName="run-os-openstack-openstack-cell1" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.651854 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6534752f-c509-4d12-9123-599ba4eb1da7" containerName="run-os-openstack-openstack-cell1" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.652397 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.653925 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.656207 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.695055 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-2t7q6"] Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.781532 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xmtz\" (UniqueName: \"kubernetes.io/projected/f1af51fb-9081-44b1-9408-7d267894140c-kube-api-access-4xmtz\") pod \"f1af51fb-9081-44b1-9408-7d267894140c\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.781655 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-1\") pod \"f1af51fb-9081-44b1-9408-7d267894140c\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.781708 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell2\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell2\") pod \"f1af51fb-9081-44b1-9408-7d267894140c\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.781772 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-0\") pod \"f1af51fb-9081-44b1-9408-7d267894140c\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.781864 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell1\") pod \"f1af51fb-9081-44b1-9408-7d267894140c\" (UID: \"f1af51fb-9081-44b1-9408-7d267894140c\") " Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.782300 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-inventory\") pod \"reboot-os-openstack-openstack-cell1-2t7q6\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.782630 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-2t7q6\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.782755 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t6dw\" (UniqueName: \"kubernetes.io/projected/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-kube-api-access-8t6dw\") pod \"reboot-os-openstack-openstack-cell1-2t7q6\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.787886 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1af51fb-9081-44b1-9408-7d267894140c-kube-api-access-4xmtz" (OuterVolumeSpecName: "kube-api-access-4xmtz") pod "f1af51fb-9081-44b1-9408-7d267894140c" (UID: "f1af51fb-9081-44b1-9408-7d267894140c"). InnerVolumeSpecName "kube-api-access-4xmtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.812309 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell2" (OuterVolumeSpecName: "ssh-key-openstack-cell2") pod "f1af51fb-9081-44b1-9408-7d267894140c" (UID: "f1af51fb-9081-44b1-9408-7d267894140c"). InnerVolumeSpecName "ssh-key-openstack-cell2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.812863 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f1af51fb-9081-44b1-9408-7d267894140c" (UID: "f1af51fb-9081-44b1-9408-7d267894140c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.813235 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f1af51fb-9081-44b1-9408-7d267894140c" (UID: "f1af51fb-9081-44b1-9408-7d267894140c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.819558 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "f1af51fb-9081-44b1-9408-7d267894140c" (UID: "f1af51fb-9081-44b1-9408-7d267894140c"). InnerVolumeSpecName "inventory-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.884975 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-2t7q6\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.885464 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t6dw\" (UniqueName: \"kubernetes.io/projected/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-kube-api-access-8t6dw\") pod \"reboot-os-openstack-openstack-cell1-2t7q6\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.885613 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-inventory\") pod \"reboot-os-openstack-openstack-cell1-2t7q6\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.885690 4947 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.885714 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell2\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell2\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.885726 4947 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.885737 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f1af51fb-9081-44b1-9408-7d267894140c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.885748 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xmtz\" (UniqueName: \"kubernetes.io/projected/f1af51fb-9081-44b1-9408-7d267894140c-kube-api-access-4xmtz\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.888821 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-2t7q6\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.889992 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-inventory\") pod \"reboot-os-openstack-openstack-cell1-2t7q6\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.900617 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t6dw\" (UniqueName: \"kubernetes.io/projected/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-kube-api-access-8t6dw\") pod \"reboot-os-openstack-openstack-cell1-2t7q6\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:21 crc kubenswrapper[4947]: I1203 09:30:21.974464 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.538615 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-qcps8" event={"ID":"f1af51fb-9081-44b1-9408-7d267894140c","Type":"ContainerDied","Data":"c357a21723cbb9fadff35ca25b4b7be2919a2f3521d4f14a6b5e6e8799013b44"} Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.539248 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c357a21723cbb9fadff35ca25b4b7be2919a2f3521d4f14a6b5e6e8799013b44" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.539356 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-qcps8" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.635070 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-2t7q6"] Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.745442 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell2-qtsdm"] Dec 03 09:30:22 crc kubenswrapper[4947]: E1203 09:30:22.746085 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1af51fb-9081-44b1-9408-7d267894140c" containerName="ssh-known-hosts-openstack" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.746113 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1af51fb-9081-44b1-9408-7d267894140c" containerName="ssh-known-hosts-openstack" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.746382 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1af51fb-9081-44b1-9408-7d267894140c" containerName="ssh-known-hosts-openstack" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.747386 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.751446 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.751561 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.760898 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell2-qtsdm"] Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.812177 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-inventory\") pod \"run-os-openstack-openstack-cell2-qtsdm\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.812461 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jp2\" (UniqueName: \"kubernetes.io/projected/1aed45c4-3852-4919-b581-d82a139c3f03-kube-api-access-87jp2\") pod \"run-os-openstack-openstack-cell2-qtsdm\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.812555 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-ssh-key\") pod \"run-os-openstack-openstack-cell2-qtsdm\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.914408 4947 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-87jp2\" (UniqueName: \"kubernetes.io/projected/1aed45c4-3852-4919-b581-d82a139c3f03-kube-api-access-87jp2\") pod \"run-os-openstack-openstack-cell2-qtsdm\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.914800 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-ssh-key\") pod \"run-os-openstack-openstack-cell2-qtsdm\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.914842 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-inventory\") pod \"run-os-openstack-openstack-cell2-qtsdm\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.923723 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-ssh-key\") pod \"run-os-openstack-openstack-cell2-qtsdm\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.924475 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-inventory\") pod \"run-os-openstack-openstack-cell2-qtsdm\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:22 crc kubenswrapper[4947]: I1203 09:30:22.947599 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-87jp2\" (UniqueName: \"kubernetes.io/projected/1aed45c4-3852-4919-b581-d82a139c3f03-kube-api-access-87jp2\") pod \"run-os-openstack-openstack-cell2-qtsdm\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:23 crc kubenswrapper[4947]: I1203 09:30:23.068712 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:23 crc kubenswrapper[4947]: I1203 09:30:23.548264 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" event={"ID":"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11","Type":"ContainerStarted","Data":"23321bcd4b997ba832b72ae9dfd149b2b0cd926b9cd1be3ca4d875269e5219c6"} Dec 03 09:30:23 crc kubenswrapper[4947]: I1203 09:30:23.549732 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" event={"ID":"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11","Type":"ContainerStarted","Data":"29d02aa74f2cdc0017362762a7f32903ae37a77a2f2359bbe31d09b01015bb24"} Dec 03 09:30:23 crc kubenswrapper[4947]: I1203 09:30:23.576222 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" podStartSLOduration=2.052632483 podStartE2EDuration="2.576204283s" podCreationTimestamp="2025-12-03 09:30:21 +0000 UTC" firstStartedPulling="2025-12-03 09:30:22.642420786 +0000 UTC m=+9683.903375222" lastFinishedPulling="2025-12-03 09:30:23.165992586 +0000 UTC m=+9684.426947022" observedRunningTime="2025-12-03 09:30:23.560886229 +0000 UTC m=+9684.821840665" watchObservedRunningTime="2025-12-03 09:30:23.576204283 +0000 UTC m=+9684.837158709" Dec 03 09:30:23 crc kubenswrapper[4947]: I1203 09:30:23.701586 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell2-qtsdm"] Dec 03 09:30:23 crc kubenswrapper[4947]: W1203 
09:30:23.701942 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aed45c4_3852_4919_b581_d82a139c3f03.slice/crio-e1ab215626d37874343732f6c912fb148b5a8ad8b93d2b4cf33bde2a5e7637e4 WatchSource:0}: Error finding container e1ab215626d37874343732f6c912fb148b5a8ad8b93d2b4cf33bde2a5e7637e4: Status 404 returned error can't find the container with id e1ab215626d37874343732f6c912fb148b5a8ad8b93d2b4cf33bde2a5e7637e4 Dec 03 09:30:24 crc kubenswrapper[4947]: I1203 09:30:24.577152 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell2-qtsdm" event={"ID":"1aed45c4-3852-4919-b581-d82a139c3f03","Type":"ContainerStarted","Data":"e1ab215626d37874343732f6c912fb148b5a8ad8b93d2b4cf33bde2a5e7637e4"} Dec 03 09:30:25 crc kubenswrapper[4947]: I1203 09:30:25.614249 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell2-qtsdm" event={"ID":"1aed45c4-3852-4919-b581-d82a139c3f03","Type":"ContainerStarted","Data":"f1b36cfa53d2f54fcef233d7dd03273327535b8988de15db49b98805728422f7"} Dec 03 09:30:25 crc kubenswrapper[4947]: I1203 09:30:25.641681 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell2-qtsdm" podStartSLOduration=3.189376052 podStartE2EDuration="3.641662625s" podCreationTimestamp="2025-12-03 09:30:22 +0000 UTC" firstStartedPulling="2025-12-03 09:30:23.708378856 +0000 UTC m=+9684.969333292" lastFinishedPulling="2025-12-03 09:30:24.160665449 +0000 UTC m=+9685.421619865" observedRunningTime="2025-12-03 09:30:25.633905056 +0000 UTC m=+9686.894859492" watchObservedRunningTime="2025-12-03 09:30:25.641662625 +0000 UTC m=+9686.902617041" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.298962 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dlhv6"] Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 
09:30:26.301957 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.316620 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dlhv6"] Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.401462 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-utilities\") pod \"redhat-operators-dlhv6\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.401546 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/9f90fdea-619e-441e-9470-70418f94a82d-kube-api-access-8rd84\") pod \"redhat-operators-dlhv6\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.401650 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-catalog-content\") pod \"redhat-operators-dlhv6\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.503425 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-utilities\") pod \"redhat-operators-dlhv6\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.503488 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/9f90fdea-619e-441e-9470-70418f94a82d-kube-api-access-8rd84\") pod \"redhat-operators-dlhv6\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.503536 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-catalog-content\") pod \"redhat-operators-dlhv6\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.504084 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-utilities\") pod \"redhat-operators-dlhv6\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.504093 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-catalog-content\") pod \"redhat-operators-dlhv6\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.525004 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/9f90fdea-619e-441e-9470-70418f94a82d-kube-api-access-8rd84\") pod \"redhat-operators-dlhv6\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:26 crc kubenswrapper[4947]: I1203 09:30:26.630281 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:27 crc kubenswrapper[4947]: I1203 09:30:27.163609 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dlhv6"] Dec 03 09:30:27 crc kubenswrapper[4947]: W1203 09:30:27.164931 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f90fdea_619e_441e_9470_70418f94a82d.slice/crio-b0d57538cf70d96a2c730706beb1f79eced84280635baa9dd32ca6771cbd143c WatchSource:0}: Error finding container b0d57538cf70d96a2c730706beb1f79eced84280635baa9dd32ca6771cbd143c: Status 404 returned error can't find the container with id b0d57538cf70d96a2c730706beb1f79eced84280635baa9dd32ca6771cbd143c Dec 03 09:30:27 crc kubenswrapper[4947]: I1203 09:30:27.632678 4947 generic.go:334] "Generic (PLEG): container finished" podID="9f90fdea-619e-441e-9470-70418f94a82d" containerID="0cc1c3cc105ebcd79cc87292a28cd4096f1aa82461844d1e2b9dc902893f0e0f" exitCode=0 Dec 03 09:30:27 crc kubenswrapper[4947]: I1203 09:30:27.632736 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhv6" event={"ID":"9f90fdea-619e-441e-9470-70418f94a82d","Type":"ContainerDied","Data":"0cc1c3cc105ebcd79cc87292a28cd4096f1aa82461844d1e2b9dc902893f0e0f"} Dec 03 09:30:27 crc kubenswrapper[4947]: I1203 09:30:27.632935 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhv6" event={"ID":"9f90fdea-619e-441e-9470-70418f94a82d","Type":"ContainerStarted","Data":"b0d57538cf70d96a2c730706beb1f79eced84280635baa9dd32ca6771cbd143c"} Dec 03 09:30:28 crc kubenswrapper[4947]: I1203 09:30:28.644886 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhv6" 
event={"ID":"9f90fdea-619e-441e-9470-70418f94a82d","Type":"ContainerStarted","Data":"14a0dea8d7ff7e0fa024c804aa93c0627486c1d424fb9d5c263f551031e08e3b"} Dec 03 09:30:30 crc kubenswrapper[4947]: I1203 09:30:30.402113 4947 scope.go:117] "RemoveContainer" containerID="36a6286e97b68a60233985e746e6f8392c373cb68563221b51e6f7e4d79ab9bb" Dec 03 09:30:32 crc kubenswrapper[4947]: I1203 09:30:32.683226 4947 generic.go:334] "Generic (PLEG): container finished" podID="9f90fdea-619e-441e-9470-70418f94a82d" containerID="14a0dea8d7ff7e0fa024c804aa93c0627486c1d424fb9d5c263f551031e08e3b" exitCode=0 Dec 03 09:30:32 crc kubenswrapper[4947]: I1203 09:30:32.683312 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhv6" event={"ID":"9f90fdea-619e-441e-9470-70418f94a82d","Type":"ContainerDied","Data":"14a0dea8d7ff7e0fa024c804aa93c0627486c1d424fb9d5c263f551031e08e3b"} Dec 03 09:30:33 crc kubenswrapper[4947]: I1203 09:30:33.694551 4947 generic.go:334] "Generic (PLEG): container finished" podID="1aed45c4-3852-4919-b581-d82a139c3f03" containerID="f1b36cfa53d2f54fcef233d7dd03273327535b8988de15db49b98805728422f7" exitCode=0 Dec 03 09:30:33 crc kubenswrapper[4947]: I1203 09:30:33.694617 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell2-qtsdm" event={"ID":"1aed45c4-3852-4919-b581-d82a139c3f03","Type":"ContainerDied","Data":"f1b36cfa53d2f54fcef233d7dd03273327535b8988de15db49b98805728422f7"} Dec 03 09:30:33 crc kubenswrapper[4947]: I1203 09:30:33.698584 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhv6" event={"ID":"9f90fdea-619e-441e-9470-70418f94a82d","Type":"ContainerStarted","Data":"8eb09eb3b03aa32e275187341fdc1ff9f5ede994be2303c608ef3194eedd89f1"} Dec 03 09:30:33 crc kubenswrapper[4947]: I1203 09:30:33.750412 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dlhv6" 
podStartSLOduration=2.250519013 podStartE2EDuration="7.750395485s" podCreationTimestamp="2025-12-03 09:30:26 +0000 UTC" firstStartedPulling="2025-12-03 09:30:27.634229947 +0000 UTC m=+9688.895184383" lastFinishedPulling="2025-12-03 09:30:33.134106389 +0000 UTC m=+9694.395060855" observedRunningTime="2025-12-03 09:30:33.743437337 +0000 UTC m=+9695.004391803" watchObservedRunningTime="2025-12-03 09:30:33.750395485 +0000 UTC m=+9695.011349911" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.219017 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.303457 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-ssh-key\") pod \"1aed45c4-3852-4919-b581-d82a139c3f03\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.303547 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87jp2\" (UniqueName: \"kubernetes.io/projected/1aed45c4-3852-4919-b581-d82a139c3f03-kube-api-access-87jp2\") pod \"1aed45c4-3852-4919-b581-d82a139c3f03\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.303600 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-inventory\") pod \"1aed45c4-3852-4919-b581-d82a139c3f03\" (UID: \"1aed45c4-3852-4919-b581-d82a139c3f03\") " Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.311283 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aed45c4-3852-4919-b581-d82a139c3f03-kube-api-access-87jp2" (OuterVolumeSpecName: "kube-api-access-87jp2") pod 
"1aed45c4-3852-4919-b581-d82a139c3f03" (UID: "1aed45c4-3852-4919-b581-d82a139c3f03"). InnerVolumeSpecName "kube-api-access-87jp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.337638 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1aed45c4-3852-4919-b581-d82a139c3f03" (UID: "1aed45c4-3852-4919-b581-d82a139c3f03"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.344966 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-inventory" (OuterVolumeSpecName: "inventory") pod "1aed45c4-3852-4919-b581-d82a139c3f03" (UID: "1aed45c4-3852-4919-b581-d82a139c3f03"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.406233 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.406274 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87jp2\" (UniqueName: \"kubernetes.io/projected/1aed45c4-3852-4919-b581-d82a139c3f03-kube-api-access-87jp2\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.406289 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1aed45c4-3852-4919-b581-d82a139c3f03-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.720062 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell2-qtsdm" 
event={"ID":"1aed45c4-3852-4919-b581-d82a139c3f03","Type":"ContainerDied","Data":"e1ab215626d37874343732f6c912fb148b5a8ad8b93d2b4cf33bde2a5e7637e4"} Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.720108 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell2-qtsdm" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.720113 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ab215626d37874343732f6c912fb148b5a8ad8b93d2b4cf33bde2a5e7637e4" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.799275 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell2-pqmhv"] Dec 03 09:30:35 crc kubenswrapper[4947]: E1203 09:30:35.800603 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aed45c4-3852-4919-b581-d82a139c3f03" containerName="run-os-openstack-openstack-cell2" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.800624 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aed45c4-3852-4919-b581-d82a139c3f03" containerName="run-os-openstack-openstack-cell2" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.800857 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aed45c4-3852-4919-b581-d82a139c3f03" containerName="run-os-openstack-openstack-cell2" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.801974 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.804196 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.804261 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.810957 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell2-pqmhv"] Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.914355 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-inventory\") pod \"reboot-os-openstack-openstack-cell2-pqmhv\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.914425 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-ssh-key\") pod \"reboot-os-openstack-openstack-cell2-pqmhv\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:35 crc kubenswrapper[4947]: I1203 09:30:35.914444 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s49nx\" (UniqueName: \"kubernetes.io/projected/44b8db5a-143f-459e-be84-316ae4a51d47-kube-api-access-s49nx\") pod \"reboot-os-openstack-openstack-cell2-pqmhv\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.016559 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-inventory\") pod \"reboot-os-openstack-openstack-cell2-pqmhv\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.016948 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-ssh-key\") pod \"reboot-os-openstack-openstack-cell2-pqmhv\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.017073 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s49nx\" (UniqueName: \"kubernetes.io/projected/44b8db5a-143f-459e-be84-316ae4a51d47-kube-api-access-s49nx\") pod \"reboot-os-openstack-openstack-cell2-pqmhv\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.019939 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-inventory\") pod \"reboot-os-openstack-openstack-cell2-pqmhv\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.019968 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-ssh-key\") pod \"reboot-os-openstack-openstack-cell2-pqmhv\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.038360 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s49nx\" (UniqueName: \"kubernetes.io/projected/44b8db5a-143f-459e-be84-316ae4a51d47-kube-api-access-s49nx\") pod \"reboot-os-openstack-openstack-cell2-pqmhv\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.121918 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.630551 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.631883 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.671800 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell2-pqmhv"] Dec 03 09:30:36 crc kubenswrapper[4947]: I1203 09:30:36.738087 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" event={"ID":"44b8db5a-143f-459e-be84-316ae4a51d47","Type":"ContainerStarted","Data":"ba788c0c9ec209623f38358313b822edc07a367f9cce220c95c40910f78b3587"} Dec 03 09:30:37 crc kubenswrapper[4947]: I1203 09:30:37.694431 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dlhv6" podUID="9f90fdea-619e-441e-9470-70418f94a82d" containerName="registry-server" probeResult="failure" output=< Dec 03 09:30:37 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 09:30:37 crc kubenswrapper[4947]: > Dec 03 09:30:37 crc kubenswrapper[4947]: I1203 09:30:37.748246 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" event={"ID":"44b8db5a-143f-459e-be84-316ae4a51d47","Type":"ContainerStarted","Data":"6f768b6ea5d7b096423962fbcb42e5c04247b76e726c19a67a17745598dfd525"} Dec 03 09:30:37 crc kubenswrapper[4947]: I1203 09:30:37.764125 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" podStartSLOduration=2.353545045 podStartE2EDuration="2.764103241s" podCreationTimestamp="2025-12-03 09:30:35 +0000 UTC" firstStartedPulling="2025-12-03 09:30:36.682296474 +0000 UTC m=+9697.943250900" lastFinishedPulling="2025-12-03 09:30:37.09285467 +0000 UTC m=+9698.353809096" observedRunningTime="2025-12-03 09:30:37.761047389 +0000 UTC m=+9699.022001815" watchObservedRunningTime="2025-12-03 09:30:37.764103241 +0000 UTC m=+9699.025057667" Dec 03 09:30:39 crc kubenswrapper[4947]: I1203 09:30:39.775371 4947 generic.go:334] "Generic (PLEG): container finished" podID="295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11" containerID="23321bcd4b997ba832b72ae9dfd149b2b0cd926b9cd1be3ca4d875269e5219c6" exitCode=0 Dec 03 09:30:39 crc kubenswrapper[4947]: I1203 09:30:39.775453 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" event={"ID":"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11","Type":"ContainerDied","Data":"23321bcd4b997ba832b72ae9dfd149b2b0cd926b9cd1be3ca4d875269e5219c6"} Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.332238 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.443144 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-ssh-key\") pod \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.443305 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t6dw\" (UniqueName: \"kubernetes.io/projected/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-kube-api-access-8t6dw\") pod \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.443344 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-inventory\") pod \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\" (UID: \"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11\") " Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.451770 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-kube-api-access-8t6dw" (OuterVolumeSpecName: "kube-api-access-8t6dw") pod "295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11" (UID: "295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11"). InnerVolumeSpecName "kube-api-access-8t6dw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.477911 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11" (UID: "295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.499631 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-inventory" (OuterVolumeSpecName: "inventory") pod "295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11" (UID: "295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.545434 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.545472 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t6dw\" (UniqueName: \"kubernetes.io/projected/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-kube-api-access-8t6dw\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.545512 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.795821 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" event={"ID":"295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11","Type":"ContainerDied","Data":"29d02aa74f2cdc0017362762a7f32903ae37a77a2f2359bbe31d09b01015bb24"} Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.795866 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d02aa74f2cdc0017362762a7f32903ae37a77a2f2359bbe31d09b01015bb24" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.795845 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-2t7q6" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.896673 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-qkqhk"] Dec 03 09:30:41 crc kubenswrapper[4947]: E1203 09:30:41.897129 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11" containerName="reboot-os-openstack-openstack-cell1" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.897145 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11" containerName="reboot-os-openstack-openstack-cell1" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.897416 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11" containerName="reboot-os-openstack-openstack-cell1" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.898229 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.900268 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.902539 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:30:41 crc kubenswrapper[4947]: I1203 09:30:41.912592 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-qkqhk"] Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.055189 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.055258 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ssh-key\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.055295 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc 
kubenswrapper[4947]: I1203 09:30:42.055356 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.055470 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.055578 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.055731 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.055903 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-inventory\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.055946 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.056058 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb5j7\" (UniqueName: \"kubernetes.io/projected/9079c043-f2ef-421d-b8cd-81dea5769f02-kube-api-access-rb5j7\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.056110 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158068 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-inventory\") pod 
\"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158127 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158188 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb5j7\" (UniqueName: \"kubernetes.io/projected/9079c043-f2ef-421d-b8cd-81dea5769f02-kube-api-access-rb5j7\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158222 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158278 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 
09:30:42.158340 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ssh-key\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158386 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158464 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158513 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158552 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.158692 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.163934 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ssh-key\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.164431 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.164562 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 
09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.164884 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.164878 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.165373 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.166328 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.170286 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.174207 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.177708 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-inventory\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.191063 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb5j7\" (UniqueName: \"kubernetes.io/projected/9079c043-f2ef-421d-b8cd-81dea5769f02-kube-api-access-rb5j7\") pod \"install-certs-openstack-openstack-cell1-qkqhk\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.213968 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:30:42 crc kubenswrapper[4947]: I1203 09:30:42.816765 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-qkqhk"] Dec 03 09:30:43 crc kubenswrapper[4947]: I1203 09:30:43.823179 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" event={"ID":"9079c043-f2ef-421d-b8cd-81dea5769f02","Type":"ContainerStarted","Data":"0b2f00b686eae8c448bcb5583e1d71ab6343f8de5fb1daa962bbd0e4785d9f1c"} Dec 03 09:30:43 crc kubenswrapper[4947]: I1203 09:30:43.823753 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" event={"ID":"9079c043-f2ef-421d-b8cd-81dea5769f02","Type":"ContainerStarted","Data":"2a79e345e4d3926d841b6400ee2e587f08c379dd3e987466292764bd751aac7d"} Dec 03 09:30:43 crc kubenswrapper[4947]: I1203 09:30:43.850799 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" podStartSLOduration=2.437181724 podStartE2EDuration="2.850782483s" podCreationTimestamp="2025-12-03 09:30:41 +0000 UTC" firstStartedPulling="2025-12-03 09:30:42.822701038 +0000 UTC m=+9704.083655454" lastFinishedPulling="2025-12-03 09:30:43.236301777 +0000 UTC m=+9704.497256213" observedRunningTime="2025-12-03 09:30:43.842806048 +0000 UTC m=+9705.103760464" watchObservedRunningTime="2025-12-03 09:30:43.850782483 +0000 UTC m=+9705.111736909" Dec 03 09:30:46 crc kubenswrapper[4947]: I1203 09:30:46.686552 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:46 crc kubenswrapper[4947]: I1203 09:30:46.767279 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:46 crc kubenswrapper[4947]: I1203 
09:30:46.926540 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dlhv6"] Dec 03 09:30:47 crc kubenswrapper[4947]: I1203 09:30:47.866897 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dlhv6" podUID="9f90fdea-619e-441e-9470-70418f94a82d" containerName="registry-server" containerID="cri-o://8eb09eb3b03aa32e275187341fdc1ff9f5ede994be2303c608ef3194eedd89f1" gracePeriod=2 Dec 03 09:30:48 crc kubenswrapper[4947]: I1203 09:30:48.880877 4947 generic.go:334] "Generic (PLEG): container finished" podID="9f90fdea-619e-441e-9470-70418f94a82d" containerID="8eb09eb3b03aa32e275187341fdc1ff9f5ede994be2303c608ef3194eedd89f1" exitCode=0 Dec 03 09:30:48 crc kubenswrapper[4947]: I1203 09:30:48.880918 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhv6" event={"ID":"9f90fdea-619e-441e-9470-70418f94a82d","Type":"ContainerDied","Data":"8eb09eb3b03aa32e275187341fdc1ff9f5ede994be2303c608ef3194eedd89f1"} Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.246308 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.308560 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/9f90fdea-619e-441e-9470-70418f94a82d-kube-api-access-8rd84\") pod \"9f90fdea-619e-441e-9470-70418f94a82d\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.308919 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-catalog-content\") pod \"9f90fdea-619e-441e-9470-70418f94a82d\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.308976 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-utilities\") pod \"9f90fdea-619e-441e-9470-70418f94a82d\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.309805 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-utilities" (OuterVolumeSpecName: "utilities") pod "9f90fdea-619e-441e-9470-70418f94a82d" (UID: "9f90fdea-619e-441e-9470-70418f94a82d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.313979 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f90fdea-619e-441e-9470-70418f94a82d-kube-api-access-8rd84" (OuterVolumeSpecName: "kube-api-access-8rd84") pod "9f90fdea-619e-441e-9470-70418f94a82d" (UID: "9f90fdea-619e-441e-9470-70418f94a82d"). InnerVolumeSpecName "kube-api-access-8rd84". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.410403 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f90fdea-619e-441e-9470-70418f94a82d" (UID: "9f90fdea-619e-441e-9470-70418f94a82d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.411106 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-catalog-content\") pod \"9f90fdea-619e-441e-9470-70418f94a82d\" (UID: \"9f90fdea-619e-441e-9470-70418f94a82d\") " Dec 03 09:30:49 crc kubenswrapper[4947]: W1203 09:30:49.411268 4947 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/9f90fdea-619e-441e-9470-70418f94a82d/volumes/kubernetes.io~empty-dir/catalog-content Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.411284 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f90fdea-619e-441e-9470-70418f94a82d" (UID: "9f90fdea-619e-441e-9470-70418f94a82d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.411883 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rd84\" (UniqueName: \"kubernetes.io/projected/9f90fdea-619e-441e-9470-70418f94a82d-kube-api-access-8rd84\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.411949 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.412014 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f90fdea-619e-441e-9470-70418f94a82d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.893706 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhv6" event={"ID":"9f90fdea-619e-441e-9470-70418f94a82d","Type":"ContainerDied","Data":"b0d57538cf70d96a2c730706beb1f79eced84280635baa9dd32ca6771cbd143c"} Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.893762 4947 scope.go:117] "RemoveContainer" containerID="8eb09eb3b03aa32e275187341fdc1ff9f5ede994be2303c608ef3194eedd89f1" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.893794 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dlhv6" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.927595 4947 scope.go:117] "RemoveContainer" containerID="14a0dea8d7ff7e0fa024c804aa93c0627486c1d424fb9d5c263f551031e08e3b" Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.934973 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dlhv6"] Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.944260 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dlhv6"] Dec 03 09:30:49 crc kubenswrapper[4947]: I1203 09:30:49.953126 4947 scope.go:117] "RemoveContainer" containerID="0cc1c3cc105ebcd79cc87292a28cd4096f1aa82461844d1e2b9dc902893f0e0f" Dec 03 09:30:51 crc kubenswrapper[4947]: I1203 09:30:51.094776 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f90fdea-619e-441e-9470-70418f94a82d" path="/var/lib/kubelet/pods/9f90fdea-619e-441e-9470-70418f94a82d/volumes" Dec 03 09:30:52 crc kubenswrapper[4947]: I1203 09:30:52.926511 4947 generic.go:334] "Generic (PLEG): container finished" podID="44b8db5a-143f-459e-be84-316ae4a51d47" containerID="6f768b6ea5d7b096423962fbcb42e5c04247b76e726c19a67a17745598dfd525" exitCode=0 Dec 03 09:30:52 crc kubenswrapper[4947]: I1203 09:30:52.926609 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" event={"ID":"44b8db5a-143f-459e-be84-316ae4a51d47","Type":"ContainerDied","Data":"6f768b6ea5d7b096423962fbcb42e5c04247b76e726c19a67a17745598dfd525"} Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.474002 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.532344 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-ssh-key\") pod \"44b8db5a-143f-459e-be84-316ae4a51d47\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.532410 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-inventory\") pod \"44b8db5a-143f-459e-be84-316ae4a51d47\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.532563 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s49nx\" (UniqueName: \"kubernetes.io/projected/44b8db5a-143f-459e-be84-316ae4a51d47-kube-api-access-s49nx\") pod \"44b8db5a-143f-459e-be84-316ae4a51d47\" (UID: \"44b8db5a-143f-459e-be84-316ae4a51d47\") " Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.539842 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b8db5a-143f-459e-be84-316ae4a51d47-kube-api-access-s49nx" (OuterVolumeSpecName: "kube-api-access-s49nx") pod "44b8db5a-143f-459e-be84-316ae4a51d47" (UID: "44b8db5a-143f-459e-be84-316ae4a51d47"). InnerVolumeSpecName "kube-api-access-s49nx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.575953 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-inventory" (OuterVolumeSpecName: "inventory") pod "44b8db5a-143f-459e-be84-316ae4a51d47" (UID: "44b8db5a-143f-459e-be84-316ae4a51d47"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.597770 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "44b8db5a-143f-459e-be84-316ae4a51d47" (UID: "44b8db5a-143f-459e-be84-316ae4a51d47"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.636986 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s49nx\" (UniqueName: \"kubernetes.io/projected/44b8db5a-143f-459e-be84-316ae4a51d47-kube-api-access-s49nx\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.637043 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.637067 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b8db5a-143f-459e-be84-316ae4a51d47-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.948585 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" event={"ID":"44b8db5a-143f-459e-be84-316ae4a51d47","Type":"ContainerDied","Data":"ba788c0c9ec209623f38358313b822edc07a367f9cce220c95c40910f78b3587"} Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.948849 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba788c0c9ec209623f38358313b822edc07a367f9cce220c95c40910f78b3587" Dec 03 09:30:54 crc kubenswrapper[4947]: I1203 09:30:54.948653 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell2-pqmhv" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.046916 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell2-42v76"] Dec 03 09:30:55 crc kubenswrapper[4947]: E1203 09:30:55.047425 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b8db5a-143f-459e-be84-316ae4a51d47" containerName="reboot-os-openstack-openstack-cell2" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.047446 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b8db5a-143f-459e-be84-316ae4a51d47" containerName="reboot-os-openstack-openstack-cell2" Dec 03 09:30:55 crc kubenswrapper[4947]: E1203 09:30:55.047480 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f90fdea-619e-441e-9470-70418f94a82d" containerName="registry-server" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.047489 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f90fdea-619e-441e-9470-70418f94a82d" containerName="registry-server" Dec 03 09:30:55 crc kubenswrapper[4947]: E1203 09:30:55.047539 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f90fdea-619e-441e-9470-70418f94a82d" containerName="extract-content" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.047546 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f90fdea-619e-441e-9470-70418f94a82d" containerName="extract-content" Dec 03 09:30:55 crc kubenswrapper[4947]: E1203 09:30:55.047572 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f90fdea-619e-441e-9470-70418f94a82d" containerName="extract-utilities" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.047580 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f90fdea-619e-441e-9470-70418f94a82d" containerName="extract-utilities" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.047846 4947 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="44b8db5a-143f-459e-be84-316ae4a51d47" containerName="reboot-os-openstack-openstack-cell2" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.047864 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f90fdea-619e-441e-9470-70418f94a82d" containerName="registry-server" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.048806 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.052437 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.057338 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.064668 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell2-42v76"] Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.146794 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.146855 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " 
pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.146898 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-inventory\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.146927 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.146991 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhpgk\" (UniqueName: \"kubernetes.io/projected/4c493b7d-12ef-47ed-825b-8de463ffb17b-kube-api-access-hhpgk\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.147031 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.147073 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.147090 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.147133 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.147162 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ssh-key\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.147191 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249019 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249175 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhpgk\" (UniqueName: \"kubernetes.io/projected/4c493b7d-12ef-47ed-825b-8de463ffb17b-kube-api-access-hhpgk\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249256 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249330 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " 
pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249371 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249478 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249606 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ssh-key\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249672 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249855 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.249928 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.250003 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-inventory\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.255965 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-inventory\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.256541 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: 
I1203 09:30:55.256732 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.257772 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ssh-key\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.258193 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.259272 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.259882 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" 
(UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.261053 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.261188 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.261720 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.268787 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhpgk\" (UniqueName: \"kubernetes.io/projected/4c493b7d-12ef-47ed-825b-8de463ffb17b-kube-api-access-hhpgk\") pod \"install-certs-openstack-openstack-cell2-42v76\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.366257 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.930783 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell2-42v76"] Dec 03 09:30:55 crc kubenswrapper[4947]: I1203 09:30:55.959856 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell2-42v76" event={"ID":"4c493b7d-12ef-47ed-825b-8de463ffb17b","Type":"ContainerStarted","Data":"5bae0a4c20fd146648ac66b89216d6e7bab891cc3c610cef5cea6f7d740e0637"} Dec 03 09:30:56 crc kubenswrapper[4947]: I1203 09:30:56.999082 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell2-42v76" event={"ID":"4c493b7d-12ef-47ed-825b-8de463ffb17b","Type":"ContainerStarted","Data":"eafefb7e054474eb01aa4a754a2b7e91716a0b4a7286c196fb8727cd0ea7a943"} Dec 03 09:30:57 crc kubenswrapper[4947]: I1203 09:30:57.022190 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell2-42v76" podStartSLOduration=1.62617393 podStartE2EDuration="2.022166172s" podCreationTimestamp="2025-12-03 09:30:55 +0000 UTC" firstStartedPulling="2025-12-03 09:30:55.921893535 +0000 UTC m=+9717.182847961" lastFinishedPulling="2025-12-03 09:30:56.317885767 +0000 UTC m=+9717.578840203" observedRunningTime="2025-12-03 09:30:57.019716615 +0000 UTC m=+9718.280671171" watchObservedRunningTime="2025-12-03 09:30:57.022166172 +0000 UTC m=+9718.283120598" Dec 03 09:31:05 crc kubenswrapper[4947]: I1203 09:31:05.086240 4947 generic.go:334] "Generic (PLEG): container finished" podID="9079c043-f2ef-421d-b8cd-81dea5769f02" containerID="0b2f00b686eae8c448bcb5583e1d71ab6343f8de5fb1daa962bbd0e4785d9f1c" exitCode=0 Dec 03 09:31:05 crc kubenswrapper[4947]: I1203 09:31:05.097547 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" event={"ID":"9079c043-f2ef-421d-b8cd-81dea5769f02","Type":"ContainerDied","Data":"0b2f00b686eae8c448bcb5583e1d71ab6343f8de5fb1daa962bbd0e4785d9f1c"} Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.701637 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842028 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-metadata-combined-ca-bundle\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842478 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-telemetry-combined-ca-bundle\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842560 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb5j7\" (UniqueName: \"kubernetes.io/projected/9079c043-f2ef-421d-b8cd-81dea5769f02-kube-api-access-rb5j7\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842619 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-dhcp-combined-ca-bundle\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 
09:31:06.842659 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ovn-combined-ca-bundle\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842688 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-nova-combined-ca-bundle\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842761 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-sriov-combined-ca-bundle\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842795 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-libvirt-combined-ca-bundle\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842827 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ssh-key\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842852 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-inventory\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.842872 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-bootstrap-combined-ca-bundle\") pod \"9079c043-f2ef-421d-b8cd-81dea5769f02\" (UID: \"9079c043-f2ef-421d-b8cd-81dea5769f02\") " Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.849049 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.850285 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.850568 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.851242 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.852116 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.852465 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.852546 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9079c043-f2ef-421d-b8cd-81dea5769f02-kube-api-access-rb5j7" (OuterVolumeSpecName: "kube-api-access-rb5j7") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "kube-api-access-rb5j7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.854256 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.854615 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.877681 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-inventory" (OuterVolumeSpecName: "inventory") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.893780 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9079c043-f2ef-421d-b8cd-81dea5769f02" (UID: "9079c043-f2ef-421d-b8cd-81dea5769f02"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947426 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947470 4947 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947484 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947499 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947533 4947 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947546 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947559 4947 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947571 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb5j7\" (UniqueName: \"kubernetes.io/projected/9079c043-f2ef-421d-b8cd-81dea5769f02-kube-api-access-rb5j7\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947582 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947595 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:06 crc kubenswrapper[4947]: I1203 09:31:06.947606 4947 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9079c043-f2ef-421d-b8cd-81dea5769f02-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.108594 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" event={"ID":"9079c043-f2ef-421d-b8cd-81dea5769f02","Type":"ContainerDied","Data":"2a79e345e4d3926d841b6400ee2e587f08c379dd3e987466292764bd751aac7d"} Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.108633 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a79e345e4d3926d841b6400ee2e587f08c379dd3e987466292764bd751aac7d" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.108684 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-qkqhk" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.210436 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-zmkh5"] Dec 03 09:31:07 crc kubenswrapper[4947]: E1203 09:31:07.211005 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9079c043-f2ef-421d-b8cd-81dea5769f02" containerName="install-certs-openstack-openstack-cell1" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.211027 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9079c043-f2ef-421d-b8cd-81dea5769f02" containerName="install-certs-openstack-openstack-cell1" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.211353 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9079c043-f2ef-421d-b8cd-81dea5769f02" containerName="install-certs-openstack-openstack-cell1" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.212343 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.215148 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.215258 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.215582 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.225361 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-zmkh5"] Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.354997 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6fvt\" (UniqueName: \"kubernetes.io/projected/45bf01ea-193c-4b97-85a1-135a04051991-kube-api-access-n6fvt\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.355366 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.355636 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45bf01ea-193c-4b97-85a1-135a04051991-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: 
\"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.355694 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-inventory\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.355837 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ssh-key\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.458144 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.458240 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45bf01ea-193c-4b97-85a1-135a04051991-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.458265 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-inventory\") pod 
\"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.458308 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ssh-key\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.458377 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6fvt\" (UniqueName: \"kubernetes.io/projected/45bf01ea-193c-4b97-85a1-135a04051991-kube-api-access-n6fvt\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.459325 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45bf01ea-193c-4b97-85a1-135a04051991-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.462843 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ssh-key\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.464001 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ovn-combined-ca-bundle\") pod 
\"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.464724 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-inventory\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.474772 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6fvt\" (UniqueName: \"kubernetes.io/projected/45bf01ea-193c-4b97-85a1-135a04051991-kube-api-access-n6fvt\") pod \"ovn-openstack-openstack-cell1-zmkh5\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:07 crc kubenswrapper[4947]: I1203 09:31:07.532716 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:31:08 crc kubenswrapper[4947]: W1203 09:31:08.068856 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45bf01ea_193c_4b97_85a1_135a04051991.slice/crio-a0cb972138aae4e94e92163ced72425b8ec48118a560d6ec2d5ca6a37942cb16 WatchSource:0}: Error finding container a0cb972138aae4e94e92163ced72425b8ec48118a560d6ec2d5ca6a37942cb16: Status 404 returned error can't find the container with id a0cb972138aae4e94e92163ced72425b8ec48118a560d6ec2d5ca6a37942cb16 Dec 03 09:31:08 crc kubenswrapper[4947]: I1203 09:31:08.070120 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-zmkh5"] Dec 03 09:31:08 crc kubenswrapper[4947]: I1203 09:31:08.120593 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-zmkh5" event={"ID":"45bf01ea-193c-4b97-85a1-135a04051991","Type":"ContainerStarted","Data":"a0cb972138aae4e94e92163ced72425b8ec48118a560d6ec2d5ca6a37942cb16"} Dec 03 09:31:09 crc kubenswrapper[4947]: I1203 09:31:09.133778 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-zmkh5" event={"ID":"45bf01ea-193c-4b97-85a1-135a04051991","Type":"ContainerStarted","Data":"af4e640320658f7eab609c958a80b0aaf761700931c9bed615b83bbe242b5bf0"} Dec 03 09:31:09 crc kubenswrapper[4947]: I1203 09:31:09.161276 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-zmkh5" podStartSLOduration=1.749836682 podStartE2EDuration="2.161253031s" podCreationTimestamp="2025-12-03 09:31:07 +0000 UTC" firstStartedPulling="2025-12-03 09:31:08.071467957 +0000 UTC m=+9729.332422383" lastFinishedPulling="2025-12-03 09:31:08.482884276 +0000 UTC m=+9729.743838732" observedRunningTime="2025-12-03 09:31:09.153769728 +0000 UTC m=+9730.414724264" 
watchObservedRunningTime="2025-12-03 09:31:09.161253031 +0000 UTC m=+9730.422207467" Dec 03 09:31:18 crc kubenswrapper[4947]: I1203 09:31:18.240595 4947 generic.go:334] "Generic (PLEG): container finished" podID="4c493b7d-12ef-47ed-825b-8de463ffb17b" containerID="eafefb7e054474eb01aa4a754a2b7e91716a0b4a7286c196fb8727cd0ea7a943" exitCode=0 Dec 03 09:31:18 crc kubenswrapper[4947]: I1203 09:31:18.241137 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell2-42v76" event={"ID":"4c493b7d-12ef-47ed-825b-8de463ffb17b","Type":"ContainerDied","Data":"eafefb7e054474eb01aa4a754a2b7e91716a0b4a7286c196fb8727cd0ea7a943"} Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.731142 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.826862 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhpgk\" (UniqueName: \"kubernetes.io/projected/4c493b7d-12ef-47ed-825b-8de463ffb17b-kube-api-access-hhpgk\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827216 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ovn-combined-ca-bundle\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827290 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-metadata-combined-ca-bundle\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: 
\"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827346 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ssh-key\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827462 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-libvirt-combined-ca-bundle\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827525 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-sriov-combined-ca-bundle\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827550 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-bootstrap-combined-ca-bundle\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827618 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-telemetry-combined-ca-bundle\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827747 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-nova-combined-ca-bundle\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827921 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-inventory\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.827950 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-dhcp-combined-ca-bundle\") pod \"4c493b7d-12ef-47ed-825b-8de463ffb17b\" (UID: \"4c493b7d-12ef-47ed-825b-8de463ffb17b\") " Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.833439 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.833519 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.834026 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c493b7d-12ef-47ed-825b-8de463ffb17b-kube-api-access-hhpgk" (OuterVolumeSpecName: "kube-api-access-hhpgk") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "kube-api-access-hhpgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.834155 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.834321 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.835794 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.835902 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.836330 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.837235 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.858975 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-inventory" (OuterVolumeSpecName: "inventory") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.861508 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4c493b7d-12ef-47ed-825b-8de463ffb17b" (UID: "4c493b7d-12ef-47ed-825b-8de463ffb17b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.930892 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.930924 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.930936 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhpgk\" (UniqueName: \"kubernetes.io/projected/4c493b7d-12ef-47ed-825b-8de463ffb17b-kube-api-access-hhpgk\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.930947 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.930956 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.930965 4947 reconciler_common.go:293] "Volume 
detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.930975 4947 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.930983 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.930991 4947 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.931000 4947 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:19 crc kubenswrapper[4947]: I1203 09:31:19.931008 4947 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c493b7d-12ef-47ed-825b-8de463ffb17b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.265413 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell2-42v76" event={"ID":"4c493b7d-12ef-47ed-825b-8de463ffb17b","Type":"ContainerDied","Data":"5bae0a4c20fd146648ac66b89216d6e7bab891cc3c610cef5cea6f7d740e0637"} Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.265470 
4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bae0a4c20fd146648ac66b89216d6e7bab891cc3c610cef5cea6f7d740e0637" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.265542 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell2-42v76" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.393876 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell2-hk42j"] Dec 03 09:31:20 crc kubenswrapper[4947]: E1203 09:31:20.394390 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c493b7d-12ef-47ed-825b-8de463ffb17b" containerName="install-certs-openstack-openstack-cell2" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.394411 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c493b7d-12ef-47ed-825b-8de463ffb17b" containerName="install-certs-openstack-openstack-cell2" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.394794 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c493b7d-12ef-47ed-825b-8de463ffb17b" containerName="install-certs-openstack-openstack-cell2" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.395836 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.397875 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.397974 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.409436 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell2-hk42j"] Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.440839 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ssh-key\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.440936 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.440976 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.441003 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lts9\" (UniqueName: \"kubernetes.io/projected/793dd67d-fca6-4985-86d1-bd542f7e84e3-kube-api-access-6lts9\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.441091 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-inventory\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.543214 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-inventory\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.543377 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ssh-key\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.543523 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 
09:31:20.543569 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.543608 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lts9\" (UniqueName: \"kubernetes.io/projected/793dd67d-fca6-4985-86d1-bd542f7e84e3-kube-api-access-6lts9\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.544283 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.548991 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ssh-key\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.549142 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-inventory\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: 
I1203 09:31:20.549339 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.558945 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lts9\" (UniqueName: \"kubernetes.io/projected/793dd67d-fca6-4985-86d1-bd542f7e84e3-kube-api-access-6lts9\") pod \"ovn-openstack-openstack-cell2-hk42j\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:20 crc kubenswrapper[4947]: I1203 09:31:20.718755 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:31:21 crc kubenswrapper[4947]: I1203 09:31:21.326074 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell2-hk42j"] Dec 03 09:31:21 crc kubenswrapper[4947]: W1203 09:31:21.328294 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod793dd67d_fca6_4985_86d1_bd542f7e84e3.slice/crio-2ecb4ef14993c10334701e318c0a02cb2ac2b56a60dad24f55daadcb7d1c17eb WatchSource:0}: Error finding container 2ecb4ef14993c10334701e318c0a02cb2ac2b56a60dad24f55daadcb7d1c17eb: Status 404 returned error can't find the container with id 2ecb4ef14993c10334701e318c0a02cb2ac2b56a60dad24f55daadcb7d1c17eb Dec 03 09:31:22 crc kubenswrapper[4947]: I1203 09:31:22.290258 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell2-hk42j" 
event={"ID":"793dd67d-fca6-4985-86d1-bd542f7e84e3","Type":"ContainerStarted","Data":"2ecb4ef14993c10334701e318c0a02cb2ac2b56a60dad24f55daadcb7d1c17eb"} Dec 03 09:31:23 crc kubenswrapper[4947]: I1203 09:31:23.304873 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell2-hk42j" event={"ID":"793dd67d-fca6-4985-86d1-bd542f7e84e3","Type":"ContainerStarted","Data":"1224f459036b607aa85a00c0dafd619566986fa72a20b533575c8dd87f8a6e3f"} Dec 03 09:31:23 crc kubenswrapper[4947]: I1203 09:31:23.328703 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell2-hk42j" podStartSLOduration=2.876556587 podStartE2EDuration="3.328685557s" podCreationTimestamp="2025-12-03 09:31:20 +0000 UTC" firstStartedPulling="2025-12-03 09:31:21.330253737 +0000 UTC m=+9742.591208163" lastFinishedPulling="2025-12-03 09:31:21.782382697 +0000 UTC m=+9743.043337133" observedRunningTime="2025-12-03 09:31:23.325984094 +0000 UTC m=+9744.586938520" watchObservedRunningTime="2025-12-03 09:31:23.328685557 +0000 UTC m=+9744.589639993" Dec 03 09:32:00 crc kubenswrapper[4947]: I1203 09:32:00.086293 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:32:00 crc kubenswrapper[4947]: I1203 09:32:00.087044 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:32:16 crc kubenswrapper[4947]: I1203 09:32:16.877223 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="45bf01ea-193c-4b97-85a1-135a04051991" containerID="af4e640320658f7eab609c958a80b0aaf761700931c9bed615b83bbe242b5bf0" exitCode=0 Dec 03 09:32:16 crc kubenswrapper[4947]: I1203 09:32:16.877308 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-zmkh5" event={"ID":"45bf01ea-193c-4b97-85a1-135a04051991","Type":"ContainerDied","Data":"af4e640320658f7eab609c958a80b0aaf761700931c9bed615b83bbe242b5bf0"} Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.544462 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.654570 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45bf01ea-193c-4b97-85a1-135a04051991-ovncontroller-config-0\") pod \"45bf01ea-193c-4b97-85a1-135a04051991\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.654681 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6fvt\" (UniqueName: \"kubernetes.io/projected/45bf01ea-193c-4b97-85a1-135a04051991-kube-api-access-n6fvt\") pod \"45bf01ea-193c-4b97-85a1-135a04051991\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.654753 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ssh-key\") pod \"45bf01ea-193c-4b97-85a1-135a04051991\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.654787 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-inventory\") pod 
\"45bf01ea-193c-4b97-85a1-135a04051991\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.654908 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ovn-combined-ca-bundle\") pod \"45bf01ea-193c-4b97-85a1-135a04051991\" (UID: \"45bf01ea-193c-4b97-85a1-135a04051991\") " Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.682807 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45bf01ea-193c-4b97-85a1-135a04051991-kube-api-access-n6fvt" (OuterVolumeSpecName: "kube-api-access-n6fvt") pod "45bf01ea-193c-4b97-85a1-135a04051991" (UID: "45bf01ea-193c-4b97-85a1-135a04051991"). InnerVolumeSpecName "kube-api-access-n6fvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.687789 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "45bf01ea-193c-4b97-85a1-135a04051991" (UID: "45bf01ea-193c-4b97-85a1-135a04051991"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.710862 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-inventory" (OuterVolumeSpecName: "inventory") pod "45bf01ea-193c-4b97-85a1-135a04051991" (UID: "45bf01ea-193c-4b97-85a1-135a04051991"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.717108 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45bf01ea-193c-4b97-85a1-135a04051991-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "45bf01ea-193c-4b97-85a1-135a04051991" (UID: "45bf01ea-193c-4b97-85a1-135a04051991"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.737636 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "45bf01ea-193c-4b97-85a1-135a04051991" (UID: "45bf01ea-193c-4b97-85a1-135a04051991"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.757477 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.757553 4947 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/45bf01ea-193c-4b97-85a1-135a04051991-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.757566 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6fvt\" (UniqueName: \"kubernetes.io/projected/45bf01ea-193c-4b97-85a1-135a04051991-kube-api-access-n6fvt\") on node \"crc\" DevicePath \"\"" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.757577 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-ssh-key\") 
on node \"crc\" DevicePath \"\"" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.757651 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/45bf01ea-193c-4b97-85a1-135a04051991-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.905526 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-zmkh5" event={"ID":"45bf01ea-193c-4b97-85a1-135a04051991","Type":"ContainerDied","Data":"a0cb972138aae4e94e92163ced72425b8ec48118a560d6ec2d5ca6a37942cb16"} Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.905573 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0cb972138aae4e94e92163ced72425b8ec48118a560d6ec2d5ca6a37942cb16" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.905636 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-zmkh5" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.994779 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5pxn4"] Dec 03 09:32:18 crc kubenswrapper[4947]: E1203 09:32:18.995349 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45bf01ea-193c-4b97-85a1-135a04051991" containerName="ovn-openstack-openstack-cell1" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.995372 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="45bf01ea-193c-4b97-85a1-135a04051991" containerName="ovn-openstack-openstack-cell1" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.995708 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="45bf01ea-193c-4b97-85a1-135a04051991" containerName="ovn-openstack-openstack-cell1" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.996744 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.998646 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.998906 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:32:18 crc kubenswrapper[4947]: I1203 09:32:18.999704 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.000990 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.005138 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5pxn4"] Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.171214 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.171315 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xtbj\" (UniqueName: \"kubernetes.io/projected/a4f58536-7db6-4b2a-89c7-a043893e4543-kube-api-access-8xtbj\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.171383 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.171964 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.172084 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.172238 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.275221 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.275672 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xtbj\" (UniqueName: \"kubernetes.io/projected/a4f58536-7db6-4b2a-89c7-a043893e4543-kube-api-access-8xtbj\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.275744 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.275823 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.275859 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.276003 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.280098 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.280669 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.282387 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.282796 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.289175 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.296733 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xtbj\" (UniqueName: \"kubernetes.io/projected/a4f58536-7db6-4b2a-89c7-a043893e4543-kube-api-access-8xtbj\") pod \"neutron-metadata-openstack-openstack-cell1-5pxn4\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.315828 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:32:19 crc kubenswrapper[4947]: I1203 09:32:19.940060 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-5pxn4"] Dec 03 09:32:20 crc kubenswrapper[4947]: I1203 09:32:20.935313 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" event={"ID":"a4f58536-7db6-4b2a-89c7-a043893e4543","Type":"ContainerStarted","Data":"054da341957ce27479fe4b96e9c2b726610cb12cdb93cd08f11f1a29576eb306"} Dec 03 09:32:20 crc kubenswrapper[4947]: I1203 09:32:20.936177 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" event={"ID":"a4f58536-7db6-4b2a-89c7-a043893e4543","Type":"ContainerStarted","Data":"c8a307fd15d8c0cac4d2f15111c4a02081837c0cb99c51ebcb009c3564a7eaaa"} Dec 03 09:32:20 crc kubenswrapper[4947]: I1203 09:32:20.965197 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" podStartSLOduration=2.562410443 podStartE2EDuration="2.965176138s" podCreationTimestamp="2025-12-03 09:32:18 +0000 UTC" firstStartedPulling="2025-12-03 09:32:19.942809291 +0000 UTC m=+9801.203763717" lastFinishedPulling="2025-12-03 09:32:20.345574986 +0000 UTC m=+9801.606529412" observedRunningTime="2025-12-03 09:32:20.959168515 +0000 UTC m=+9802.220122951" watchObservedRunningTime="2025-12-03 09:32:20.965176138 +0000 UTC m=+9802.226130564" Dec 03 09:32:29 crc kubenswrapper[4947]: I1203 09:32:29.021710 4947 generic.go:334] "Generic (PLEG): container finished" podID="793dd67d-fca6-4985-86d1-bd542f7e84e3" containerID="1224f459036b607aa85a00c0dafd619566986fa72a20b533575c8dd87f8a6e3f" exitCode=0 Dec 03 09:32:29 crc kubenswrapper[4947]: I1203 09:32:29.021792 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-openstack-openstack-cell2-hk42j" event={"ID":"793dd67d-fca6-4985-86d1-bd542f7e84e3","Type":"ContainerDied","Data":"1224f459036b607aa85a00c0dafd619566986fa72a20b533575c8dd87f8a6e3f"} Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.086353 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.086440 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.543351 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.640566 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lts9\" (UniqueName: \"kubernetes.io/projected/793dd67d-fca6-4985-86d1-bd542f7e84e3-kube-api-access-6lts9\") pod \"793dd67d-fca6-4985-86d1-bd542f7e84e3\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.640674 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-inventory\") pod \"793dd67d-fca6-4985-86d1-bd542f7e84e3\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.640902 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovn-combined-ca-bundle\") pod \"793dd67d-fca6-4985-86d1-bd542f7e84e3\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.641038 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovncontroller-config-0\") pod \"793dd67d-fca6-4985-86d1-bd542f7e84e3\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.641081 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ssh-key\") pod \"793dd67d-fca6-4985-86d1-bd542f7e84e3\" (UID: \"793dd67d-fca6-4985-86d1-bd542f7e84e3\") " Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.652786 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "793dd67d-fca6-4985-86d1-bd542f7e84e3" (UID: "793dd67d-fca6-4985-86d1-bd542f7e84e3"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.654487 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/793dd67d-fca6-4985-86d1-bd542f7e84e3-kube-api-access-6lts9" (OuterVolumeSpecName: "kube-api-access-6lts9") pod "793dd67d-fca6-4985-86d1-bd542f7e84e3" (UID: "793dd67d-fca6-4985-86d1-bd542f7e84e3"). InnerVolumeSpecName "kube-api-access-6lts9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.672117 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "793dd67d-fca6-4985-86d1-bd542f7e84e3" (UID: "793dd67d-fca6-4985-86d1-bd542f7e84e3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.685056 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-inventory" (OuterVolumeSpecName: "inventory") pod "793dd67d-fca6-4985-86d1-bd542f7e84e3" (UID: "793dd67d-fca6-4985-86d1-bd542f7e84e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.694888 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "793dd67d-fca6-4985-86d1-bd542f7e84e3" (UID: "793dd67d-fca6-4985-86d1-bd542f7e84e3"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.743818 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.743854 4947 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/793dd67d-fca6-4985-86d1-bd542f7e84e3-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.743863 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.743873 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lts9\" (UniqueName: \"kubernetes.io/projected/793dd67d-fca6-4985-86d1-bd542f7e84e3-kube-api-access-6lts9\") on node \"crc\" DevicePath \"\"" Dec 03 09:32:30 crc kubenswrapper[4947]: I1203 09:32:30.743882 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/793dd67d-fca6-4985-86d1-bd542f7e84e3-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.058394 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell2-hk42j" event={"ID":"793dd67d-fca6-4985-86d1-bd542f7e84e3","Type":"ContainerDied","Data":"2ecb4ef14993c10334701e318c0a02cb2ac2b56a60dad24f55daadcb7d1c17eb"} Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.058466 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ecb4ef14993c10334701e318c0a02cb2ac2b56a60dad24f55daadcb7d1c17eb" Dec 03 
09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.058520 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell2-hk42j" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.181548 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell2-ksjwl"] Dec 03 09:32:31 crc kubenswrapper[4947]: E1203 09:32:31.182161 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="793dd67d-fca6-4985-86d1-bd542f7e84e3" containerName="ovn-openstack-openstack-cell2" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.182177 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="793dd67d-fca6-4985-86d1-bd542f7e84e3" containerName="ovn-openstack-openstack-cell2" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.182417 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="793dd67d-fca6-4985-86d1-bd542f7e84e3" containerName="ovn-openstack-openstack-cell2" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.183370 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.186132 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.187001 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.201538 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell2-ksjwl"] Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.257932 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.258401 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.258620 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbplg\" (UniqueName: \"kubernetes.io/projected/231eb955-7b76-4bc9-b221-df73a0d8aae2-kube-api-access-jbplg\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: 
\"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.258856 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-inventory\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.259003 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.259082 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.361550 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 
09:32:31.361691 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.361784 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbplg\" (UniqueName: \"kubernetes.io/projected/231eb955-7b76-4bc9-b221-df73a0d8aae2-kube-api-access-jbplg\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.361849 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-inventory\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.361906 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.361952 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-ssh-key\") pod 
\"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.365982 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.366120 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.366725 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-inventory\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.366750 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc 
kubenswrapper[4947]: I1203 09:32:31.370119 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.380368 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbplg\" (UniqueName: \"kubernetes.io/projected/231eb955-7b76-4bc9-b221-df73a0d8aae2-kube-api-access-jbplg\") pod \"neutron-metadata-openstack-openstack-cell2-ksjwl\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:31 crc kubenswrapper[4947]: I1203 09:32:31.509168 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:32:32 crc kubenswrapper[4947]: I1203 09:32:32.039400 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell2-ksjwl"] Dec 03 09:32:32 crc kubenswrapper[4947]: W1203 09:32:32.040675 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod231eb955_7b76_4bc9_b221_df73a0d8aae2.slice/crio-4a518277b0e10b064be7c7b518ba2c9eab1c1fe78e47e11db00a2d48c1a02590 WatchSource:0}: Error finding container 4a518277b0e10b064be7c7b518ba2c9eab1c1fe78e47e11db00a2d48c1a02590: Status 404 returned error can't find the container with id 4a518277b0e10b064be7c7b518ba2c9eab1c1fe78e47e11db00a2d48c1a02590 Dec 03 09:32:32 crc kubenswrapper[4947]: I1203 09:32:32.069428 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" 
event={"ID":"231eb955-7b76-4bc9-b221-df73a0d8aae2","Type":"ContainerStarted","Data":"4a518277b0e10b064be7c7b518ba2c9eab1c1fe78e47e11db00a2d48c1a02590"} Dec 03 09:32:33 crc kubenswrapper[4947]: I1203 09:32:33.110891 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" podStartSLOduration=1.400052118 podStartE2EDuration="2.110870113s" podCreationTimestamp="2025-12-03 09:32:31 +0000 UTC" firstStartedPulling="2025-12-03 09:32:32.042545815 +0000 UTC m=+9813.303500241" lastFinishedPulling="2025-12-03 09:32:32.75336377 +0000 UTC m=+9814.014318236" observedRunningTime="2025-12-03 09:32:33.105036916 +0000 UTC m=+9814.365991352" watchObservedRunningTime="2025-12-03 09:32:33.110870113 +0000 UTC m=+9814.371824549" Dec 03 09:32:33 crc kubenswrapper[4947]: I1203 09:32:33.112648 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" event={"ID":"231eb955-7b76-4bc9-b221-df73a0d8aae2","Type":"ContainerStarted","Data":"0e261d36e36d1948260a61bc4690a369c9caba3b8dcbf06e5c0340202d1912be"} Dec 03 09:33:00 crc kubenswrapper[4947]: I1203 09:33:00.086117 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:33:00 crc kubenswrapper[4947]: I1203 09:33:00.086756 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:33:00 crc kubenswrapper[4947]: I1203 09:33:00.086797 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 09:33:00 crc kubenswrapper[4947]: I1203 09:33:00.087291 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:33:00 crc kubenswrapper[4947]: I1203 09:33:00.087340 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" gracePeriod=600 Dec 03 09:33:00 crc kubenswrapper[4947]: E1203 09:33:00.238599 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:33:00 crc kubenswrapper[4947]: I1203 09:33:00.382137 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" exitCode=0 Dec 03 09:33:00 crc kubenswrapper[4947]: I1203 09:33:00.382180 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"} 
Dec 03 09:33:00 crc kubenswrapper[4947]: I1203 09:33:00.382217 4947 scope.go:117] "RemoveContainer" containerID="4bb8680f19120bd6e546ee45532bd7955e35871304affa4095bcd96a25cdccb1" Dec 03 09:33:00 crc kubenswrapper[4947]: I1203 09:33:00.382617 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" Dec 03 09:33:00 crc kubenswrapper[4947]: E1203 09:33:00.382981 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:33:14 crc kubenswrapper[4947]: I1203 09:33:14.083393 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" Dec 03 09:33:14 crc kubenswrapper[4947]: E1203 09:33:14.084317 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:33:15 crc kubenswrapper[4947]: I1203 09:33:15.556558 4947 generic.go:334] "Generic (PLEG): container finished" podID="a4f58536-7db6-4b2a-89c7-a043893e4543" containerID="054da341957ce27479fe4b96e9c2b726610cb12cdb93cd08f11f1a29576eb306" exitCode=0 Dec 03 09:33:15 crc kubenswrapper[4947]: I1203 09:33:15.556657 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" 
event={"ID":"a4f58536-7db6-4b2a-89c7-a043893e4543","Type":"ContainerDied","Data":"054da341957ce27479fe4b96e9c2b726610cb12cdb93cd08f11f1a29576eb306"} Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.056816 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.142954 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-inventory\") pod \"a4f58536-7db6-4b2a-89c7-a043893e4543\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.143069 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a4f58536-7db6-4b2a-89c7-a043893e4543\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.143096 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-metadata-combined-ca-bundle\") pod \"a4f58536-7db6-4b2a-89c7-a043893e4543\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.143162 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-ssh-key\") pod \"a4f58536-7db6-4b2a-89c7-a043893e4543\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.143264 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-nova-metadata-neutron-config-0\") pod \"a4f58536-7db6-4b2a-89c7-a043893e4543\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.143352 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xtbj\" (UniqueName: \"kubernetes.io/projected/a4f58536-7db6-4b2a-89c7-a043893e4543-kube-api-access-8xtbj\") pod \"a4f58536-7db6-4b2a-89c7-a043893e4543\" (UID: \"a4f58536-7db6-4b2a-89c7-a043893e4543\") " Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.149110 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a4f58536-7db6-4b2a-89c7-a043893e4543" (UID: "a4f58536-7db6-4b2a-89c7-a043893e4543"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.149908 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4f58536-7db6-4b2a-89c7-a043893e4543-kube-api-access-8xtbj" (OuterVolumeSpecName: "kube-api-access-8xtbj") pod "a4f58536-7db6-4b2a-89c7-a043893e4543" (UID: "a4f58536-7db6-4b2a-89c7-a043893e4543"). InnerVolumeSpecName "kube-api-access-8xtbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.172822 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a4f58536-7db6-4b2a-89c7-a043893e4543" (UID: "a4f58536-7db6-4b2a-89c7-a043893e4543"). 
InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.175419 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a4f58536-7db6-4b2a-89c7-a043893e4543" (UID: "a4f58536-7db6-4b2a-89c7-a043893e4543"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.176798 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4f58536-7db6-4b2a-89c7-a043893e4543" (UID: "a4f58536-7db6-4b2a-89c7-a043893e4543"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.176950 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-inventory" (OuterVolumeSpecName: "inventory") pod "a4f58536-7db6-4b2a-89c7-a043893e4543" (UID: "a4f58536-7db6-4b2a-89c7-a043893e4543"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.245856 4947 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.246222 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xtbj\" (UniqueName: \"kubernetes.io/projected/a4f58536-7db6-4b2a-89c7-a043893e4543-kube-api-access-8xtbj\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.246240 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.246254 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.246300 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.246316 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4f58536-7db6-4b2a-89c7-a043893e4543-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.609927 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" 
event={"ID":"a4f58536-7db6-4b2a-89c7-a043893e4543","Type":"ContainerDied","Data":"c8a307fd15d8c0cac4d2f15111c4a02081837c0cb99c51ebcb009c3564a7eaaa"} Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.609965 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a307fd15d8c0cac4d2f15111c4a02081837c0cb99c51ebcb009c3564a7eaaa" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.610024 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-5pxn4" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.689017 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-6dmhf"] Dec 03 09:33:17 crc kubenswrapper[4947]: E1203 09:33:17.689583 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4f58536-7db6-4b2a-89c7-a043893e4543" containerName="neutron-metadata-openstack-openstack-cell1" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.689608 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4f58536-7db6-4b2a-89c7-a043893e4543" containerName="neutron-metadata-openstack-openstack-cell1" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.689885 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4f58536-7db6-4b2a-89c7-a043893e4543" containerName="neutron-metadata-openstack-openstack-cell1" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.690856 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.702341 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-6dmhf"] Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.719091 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.719397 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.719415 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.759311 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvr4z\" (UniqueName: \"kubernetes.io/projected/1207aa81-9a4e-4189-b19c-65b393c24b4c-kube-api-access-mvr4z\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.759653 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-ssh-key\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.759959 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: 
\"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.759998 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.760101 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-inventory\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.861465 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.861543 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.861587 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-inventory\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.861619 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvr4z\" (UniqueName: \"kubernetes.io/projected/1207aa81-9a4e-4189-b19c-65b393c24b4c-kube-api-access-mvr4z\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.861660 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-ssh-key\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.866382 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-ssh-key\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.866427 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.867291 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.870939 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-inventory\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:17 crc kubenswrapper[4947]: I1203 09:33:17.881145 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvr4z\" (UniqueName: \"kubernetes.io/projected/1207aa81-9a4e-4189-b19c-65b393c24b4c-kube-api-access-mvr4z\") pod \"libvirt-openstack-openstack-cell1-6dmhf\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:18 crc kubenswrapper[4947]: I1203 09:33:18.030556 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:33:18 crc kubenswrapper[4947]: I1203 09:33:18.611178 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-6dmhf"] Dec 03 09:33:18 crc kubenswrapper[4947]: I1203 09:33:18.621162 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:33:19 crc kubenswrapper[4947]: I1203 09:33:19.634021 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" event={"ID":"1207aa81-9a4e-4189-b19c-65b393c24b4c","Type":"ContainerStarted","Data":"5d03461612e71eec556eda29f681339bc87240f8c078785fe71e77779aab90c1"} Dec 03 09:33:20 crc kubenswrapper[4947]: I1203 09:33:20.645165 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" event={"ID":"1207aa81-9a4e-4189-b19c-65b393c24b4c","Type":"ContainerStarted","Data":"3b4f891f6f665f461eb62eb356b391d3823dc0cf4aad0f84492f1e8d91a9f2fe"} Dec 03 09:33:20 crc kubenswrapper[4947]: I1203 09:33:20.666325 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" podStartSLOduration=2.824419175 podStartE2EDuration="3.666305198s" podCreationTimestamp="2025-12-03 09:33:17 +0000 UTC" firstStartedPulling="2025-12-03 09:33:18.620926376 +0000 UTC m=+9859.881880802" lastFinishedPulling="2025-12-03 09:33:19.462812399 +0000 UTC m=+9860.723766825" observedRunningTime="2025-12-03 09:33:20.662074514 +0000 UTC m=+9861.923028960" watchObservedRunningTime="2025-12-03 09:33:20.666305198 +0000 UTC m=+9861.927259624" Dec 03 09:33:26 crc kubenswrapper[4947]: I1203 09:33:26.083510 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" Dec 03 09:33:26 crc kubenswrapper[4947]: E1203 09:33:26.084391 4947 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:33:27 crc kubenswrapper[4947]: I1203 09:33:27.710335 4947 generic.go:334] "Generic (PLEG): container finished" podID="231eb955-7b76-4bc9-b221-df73a0d8aae2" containerID="0e261d36e36d1948260a61bc4690a369c9caba3b8dcbf06e5c0340202d1912be" exitCode=0 Dec 03 09:33:27 crc kubenswrapper[4947]: I1203 09:33:27.710380 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" event={"ID":"231eb955-7b76-4bc9-b221-df73a0d8aae2","Type":"ContainerDied","Data":"0e261d36e36d1948260a61bc4690a369c9caba3b8dcbf06e5c0340202d1912be"} Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.181509 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.296995 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-metadata-combined-ca-bundle\") pod \"231eb955-7b76-4bc9-b221-df73a0d8aae2\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.297072 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-nova-metadata-neutron-config-0\") pod \"231eb955-7b76-4bc9-b221-df73a0d8aae2\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.297141 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbplg\" (UniqueName: \"kubernetes.io/projected/231eb955-7b76-4bc9-b221-df73a0d8aae2-kube-api-access-jbplg\") pod \"231eb955-7b76-4bc9-b221-df73a0d8aae2\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.297188 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-inventory\") pod \"231eb955-7b76-4bc9-b221-df73a0d8aae2\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.297408 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"231eb955-7b76-4bc9-b221-df73a0d8aae2\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " Dec 03 09:33:29 
crc kubenswrapper[4947]: I1203 09:33:29.297463 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-ssh-key\") pod \"231eb955-7b76-4bc9-b221-df73a0d8aae2\" (UID: \"231eb955-7b76-4bc9-b221-df73a0d8aae2\") " Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.303020 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/231eb955-7b76-4bc9-b221-df73a0d8aae2-kube-api-access-jbplg" (OuterVolumeSpecName: "kube-api-access-jbplg") pod "231eb955-7b76-4bc9-b221-df73a0d8aae2" (UID: "231eb955-7b76-4bc9-b221-df73a0d8aae2"). InnerVolumeSpecName "kube-api-access-jbplg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.307024 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "231eb955-7b76-4bc9-b221-df73a0d8aae2" (UID: "231eb955-7b76-4bc9-b221-df73a0d8aae2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.327441 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "231eb955-7b76-4bc9-b221-df73a0d8aae2" (UID: "231eb955-7b76-4bc9-b221-df73a0d8aae2"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.328198 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-inventory" (OuterVolumeSpecName: "inventory") pod "231eb955-7b76-4bc9-b221-df73a0d8aae2" (UID: "231eb955-7b76-4bc9-b221-df73a0d8aae2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.330121 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "231eb955-7b76-4bc9-b221-df73a0d8aae2" (UID: "231eb955-7b76-4bc9-b221-df73a0d8aae2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.339626 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "231eb955-7b76-4bc9-b221-df73a0d8aae2" (UID: "231eb955-7b76-4bc9-b221-df73a0d8aae2"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.400436 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.400483 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.400516 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.400528 4947 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.400540 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbplg\" (UniqueName: \"kubernetes.io/projected/231eb955-7b76-4bc9-b221-df73a0d8aae2-kube-api-access-jbplg\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.400550 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/231eb955-7b76-4bc9-b221-df73a0d8aae2-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.732310 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" 
event={"ID":"231eb955-7b76-4bc9-b221-df73a0d8aae2","Type":"ContainerDied","Data":"4a518277b0e10b064be7c7b518ba2c9eab1c1fe78e47e11db00a2d48c1a02590"} Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.732578 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a518277b0e10b064be7c7b518ba2c9eab1c1fe78e47e11db00a2d48c1a02590" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.732354 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell2-ksjwl" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.852471 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell2-knn2l"] Dec 03 09:33:29 crc kubenswrapper[4947]: E1203 09:33:29.854697 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="231eb955-7b76-4bc9-b221-df73a0d8aae2" containerName="neutron-metadata-openstack-openstack-cell2" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.854730 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="231eb955-7b76-4bc9-b221-df73a0d8aae2" containerName="neutron-metadata-openstack-openstack-cell2" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.856440 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="231eb955-7b76-4bc9-b221-df73a0d8aae2" containerName="neutron-metadata-openstack-openstack-cell2" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.902314 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.922881 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.923224 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:33:29 crc kubenswrapper[4947]: I1203 09:33:29.985100 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell2-knn2l"] Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.017723 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sc8v\" (UniqueName: \"kubernetes.io/projected/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-kube-api-access-9sc8v\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.017822 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-ssh-key\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.017843 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.017909 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-inventory\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.017930 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.120016 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-ssh-key\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.120062 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.120157 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-inventory\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc 
kubenswrapper[4947]: I1203 09:33:30.120178 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.120273 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sc8v\" (UniqueName: \"kubernetes.io/projected/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-kube-api-access-9sc8v\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.124694 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-ssh-key\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.124982 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.129196 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " 
pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.136798 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-inventory\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.137174 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sc8v\" (UniqueName: \"kubernetes.io/projected/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-kube-api-access-9sc8v\") pod \"libvirt-openstack-openstack-cell2-knn2l\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.308878 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:33:30 crc kubenswrapper[4947]: W1203 09:33:30.869230 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ea8f5d8_aea6_44e9_995f_537fd9e2f655.slice/crio-c91bfdbd7caed7f79b19687d7b273117bc42c9afa04fa8476b9afbcef95654dc WatchSource:0}: Error finding container c91bfdbd7caed7f79b19687d7b273117bc42c9afa04fa8476b9afbcef95654dc: Status 404 returned error can't find the container with id c91bfdbd7caed7f79b19687d7b273117bc42c9afa04fa8476b9afbcef95654dc Dec 03 09:33:30 crc kubenswrapper[4947]: I1203 09:33:30.871241 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell2-knn2l"] Dec 03 09:33:31 crc kubenswrapper[4947]: I1203 09:33:31.758551 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell2-knn2l" 
event={"ID":"9ea8f5d8-aea6-44e9-995f-537fd9e2f655","Type":"ContainerStarted","Data":"a867e1c5ba54752a1d411bea8ca4cfa157a66ab87ebb7c8ff6a28de3fe751f5e"} Dec 03 09:33:31 crc kubenswrapper[4947]: I1203 09:33:31.759848 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell2-knn2l" event={"ID":"9ea8f5d8-aea6-44e9-995f-537fd9e2f655","Type":"ContainerStarted","Data":"c91bfdbd7caed7f79b19687d7b273117bc42c9afa04fa8476b9afbcef95654dc"} Dec 03 09:33:31 crc kubenswrapper[4947]: I1203 09:33:31.776271 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell2-knn2l" podStartSLOduration=2.310583281 podStartE2EDuration="2.776251496s" podCreationTimestamp="2025-12-03 09:33:29 +0000 UTC" firstStartedPulling="2025-12-03 09:33:30.871919466 +0000 UTC m=+9872.132873892" lastFinishedPulling="2025-12-03 09:33:31.337587661 +0000 UTC m=+9872.598542107" observedRunningTime="2025-12-03 09:33:31.773523382 +0000 UTC m=+9873.034477838" watchObservedRunningTime="2025-12-03 09:33:31.776251496 +0000 UTC m=+9873.037205932" Dec 03 09:33:39 crc kubenswrapper[4947]: I1203 09:33:39.090995 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" Dec 03 09:33:39 crc kubenswrapper[4947]: E1203 09:33:39.091748 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:33:54 crc kubenswrapper[4947]: I1203 09:33:54.082912 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" Dec 03 09:33:54 crc 
kubenswrapper[4947]: E1203 09:33:54.084478 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:34:06 crc kubenswrapper[4947]: I1203 09:34:06.083147 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" Dec 03 09:34:06 crc kubenswrapper[4947]: E1203 09:34:06.083965 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.645474 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p2fhq"] Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.648172 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.659918 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p2fhq"] Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.738677 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2q55\" (UniqueName: \"kubernetes.io/projected/f7696a3f-6f14-4701-8e1e-ab42cd24e472-kube-api-access-r2q55\") pod \"community-operators-p2fhq\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") " pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.739110 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-utilities\") pod \"community-operators-p2fhq\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") " pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.739218 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-catalog-content\") pod \"community-operators-p2fhq\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") " pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.841149 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2q55\" (UniqueName: \"kubernetes.io/projected/f7696a3f-6f14-4701-8e1e-ab42cd24e472-kube-api-access-r2q55\") pod \"community-operators-p2fhq\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") " pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.841296 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-utilities\") pod \"community-operators-p2fhq\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") " pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.841370 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-catalog-content\") pod \"community-operators-p2fhq\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") " pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.841908 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-utilities\") pod \"community-operators-p2fhq\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") " pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.841906 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-catalog-content\") pod \"community-operators-p2fhq\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") " pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:07 crc kubenswrapper[4947]: I1203 09:34:07.858910 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2q55\" (UniqueName: \"kubernetes.io/projected/f7696a3f-6f14-4701-8e1e-ab42cd24e472-kube-api-access-r2q55\") pod \"community-operators-p2fhq\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") " pod="openshift-marketplace/community-operators-p2fhq" Dec 03 09:34:08 crc kubenswrapper[4947]: I1203 09:34:08.014261 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p2fhq"
Dec 03 09:34:08 crc kubenswrapper[4947]: I1203 09:34:08.661171 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p2fhq"]
Dec 03 09:34:09 crc kubenswrapper[4947]: I1203 09:34:09.171340 4947 generic.go:334] "Generic (PLEG): container finished" podID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerID="15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af" exitCode=0
Dec 03 09:34:09 crc kubenswrapper[4947]: I1203 09:34:09.172382 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2fhq" event={"ID":"f7696a3f-6f14-4701-8e1e-ab42cd24e472","Type":"ContainerDied","Data":"15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af"}
Dec 03 09:34:09 crc kubenswrapper[4947]: I1203 09:34:09.174798 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2fhq" event={"ID":"f7696a3f-6f14-4701-8e1e-ab42cd24e472","Type":"ContainerStarted","Data":"ad4efe1d7543a4804dfdf3327674725dcf7259cc81ee3b48a796ba8f9d4703e9"}
Dec 03 09:34:10 crc kubenswrapper[4947]: I1203 09:34:10.195748 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2fhq" event={"ID":"f7696a3f-6f14-4701-8e1e-ab42cd24e472","Type":"ContainerStarted","Data":"0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d"}
Dec 03 09:34:11 crc kubenswrapper[4947]: I1203 09:34:11.209101 4947 generic.go:334] "Generic (PLEG): container finished" podID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerID="0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d" exitCode=0
Dec 03 09:34:11 crc kubenswrapper[4947]: I1203 09:34:11.209221 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2fhq" event={"ID":"f7696a3f-6f14-4701-8e1e-ab42cd24e472","Type":"ContainerDied","Data":"0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d"}
Dec 03 09:34:12 crc kubenswrapper[4947]: I1203 09:34:12.221102 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2fhq" event={"ID":"f7696a3f-6f14-4701-8e1e-ab42cd24e472","Type":"ContainerStarted","Data":"0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a"}
Dec 03 09:34:18 crc kubenswrapper[4947]: I1203 09:34:18.015725 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p2fhq"
Dec 03 09:34:18 crc kubenswrapper[4947]: I1203 09:34:18.016322 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p2fhq"
Dec 03 09:34:18 crc kubenswrapper[4947]: I1203 09:34:18.061362 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p2fhq"
Dec 03 09:34:18 crc kubenswrapper[4947]: I1203 09:34:18.095455 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p2fhq" podStartSLOduration=8.54910571 podStartE2EDuration="11.095431929s" podCreationTimestamp="2025-12-03 09:34:07 +0000 UTC" firstStartedPulling="2025-12-03 09:34:09.173851695 +0000 UTC m=+9910.434806131" lastFinishedPulling="2025-12-03 09:34:11.720177924 +0000 UTC m=+9912.981132350" observedRunningTime="2025-12-03 09:34:12.237134704 +0000 UTC m=+9913.498089140" watchObservedRunningTime="2025-12-03 09:34:18.095431929 +0000 UTC m=+9919.356386365"
Dec 03 09:34:18 crc kubenswrapper[4947]: I1203 09:34:18.360102 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p2fhq"
Dec 03 09:34:18 crc kubenswrapper[4947]: I1203 09:34:18.406342 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p2fhq"]
Dec 03 09:34:19 crc kubenswrapper[4947]: I1203 09:34:19.093946 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:34:19 crc kubenswrapper[4947]: E1203 09:34:19.094543 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:34:20 crc kubenswrapper[4947]: I1203 09:34:20.311339 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p2fhq" podUID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerName="registry-server" containerID="cri-o://0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a" gracePeriod=2
Dec 03 09:34:20 crc kubenswrapper[4947]: I1203 09:34:20.843540 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p2fhq"
Dec 03 09:34:20 crc kubenswrapper[4947]: I1203 09:34:20.906046 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-catalog-content\") pod \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") "
Dec 03 09:34:20 crc kubenswrapper[4947]: I1203 09:34:20.906265 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-utilities\") pod \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") "
Dec 03 09:34:20 crc kubenswrapper[4947]: I1203 09:34:20.906331 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2q55\" (UniqueName: \"kubernetes.io/projected/f7696a3f-6f14-4701-8e1e-ab42cd24e472-kube-api-access-r2q55\") pod \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\" (UID: \"f7696a3f-6f14-4701-8e1e-ab42cd24e472\") "
Dec 03 09:34:20 crc kubenswrapper[4947]: I1203 09:34:20.907239 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-utilities" (OuterVolumeSpecName: "utilities") pod "f7696a3f-6f14-4701-8e1e-ab42cd24e472" (UID: "f7696a3f-6f14-4701-8e1e-ab42cd24e472"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:34:20 crc kubenswrapper[4947]: I1203 09:34:20.916239 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7696a3f-6f14-4701-8e1e-ab42cd24e472-kube-api-access-r2q55" (OuterVolumeSpecName: "kube-api-access-r2q55") pod "f7696a3f-6f14-4701-8e1e-ab42cd24e472" (UID: "f7696a3f-6f14-4701-8e1e-ab42cd24e472"). InnerVolumeSpecName "kube-api-access-r2q55". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 09:34:20 crc kubenswrapper[4947]: I1203 09:34:20.979726 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7696a3f-6f14-4701-8e1e-ab42cd24e472" (UID: "f7696a3f-6f14-4701-8e1e-ab42cd24e472"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.009255 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2q55\" (UniqueName: \"kubernetes.io/projected/f7696a3f-6f14-4701-8e1e-ab42cd24e472-kube-api-access-r2q55\") on node \"crc\" DevicePath \"\""
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.009298 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.009308 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7696a3f-6f14-4701-8e1e-ab42cd24e472-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.325242 4947 generic.go:334] "Generic (PLEG): container finished" podID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerID="0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a" exitCode=0
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.325369 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p2fhq"
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.325483 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2fhq" event={"ID":"f7696a3f-6f14-4701-8e1e-ab42cd24e472","Type":"ContainerDied","Data":"0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a"}
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.326466 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p2fhq" event={"ID":"f7696a3f-6f14-4701-8e1e-ab42cd24e472","Type":"ContainerDied","Data":"ad4efe1d7543a4804dfdf3327674725dcf7259cc81ee3b48a796ba8f9d4703e9"}
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.326491 4947 scope.go:117] "RemoveContainer" containerID="0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a"
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.349602 4947 scope.go:117] "RemoveContainer" containerID="0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d"
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.353721 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p2fhq"]
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.363085 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p2fhq"]
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.375304 4947 scope.go:117] "RemoveContainer" containerID="15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af"
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.433978 4947 scope.go:117] "RemoveContainer" containerID="0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a"
Dec 03 09:34:21 crc kubenswrapper[4947]: E1203 09:34:21.434462 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a\": container with ID starting with 0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a not found: ID does not exist" containerID="0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a"
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.434536 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a"} err="failed to get container status \"0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a\": rpc error: code = NotFound desc = could not find container \"0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a\": container with ID starting with 0db8e20069feb7087b318b671ac1a46a623df307f2744a673fe16c335858dc5a not found: ID does not exist"
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.434584 4947 scope.go:117] "RemoveContainer" containerID="0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d"
Dec 03 09:34:21 crc kubenswrapper[4947]: E1203 09:34:21.435004 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d\": container with ID starting with 0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d not found: ID does not exist" containerID="0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d"
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.435098 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d"} err="failed to get container status \"0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d\": rpc error: code = NotFound desc = could not find container \"0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d\": container with ID starting with 0bcb7b3f84a169e015ebefaf1fb27b0df013dc582b83ad7b910a107a14cfda4d not found: ID does not exist"
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.435187 4947 scope.go:117] "RemoveContainer" containerID="15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af"
Dec 03 09:34:21 crc kubenswrapper[4947]: E1203 09:34:21.435645 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af\": container with ID starting with 15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af not found: ID does not exist" containerID="15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af"
Dec 03 09:34:21 crc kubenswrapper[4947]: I1203 09:34:21.435676 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af"} err="failed to get container status \"15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af\": rpc error: code = NotFound desc = could not find container \"15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af\": container with ID starting with 15679b3525ee4cdf3cd219917313d0ea485e3cfa8e1e75a44796b36f76b091af not found: ID does not exist"
Dec 03 09:34:23 crc kubenswrapper[4947]: I1203 09:34:23.101615 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" path="/var/lib/kubelet/pods/f7696a3f-6f14-4701-8e1e-ab42cd24e472/volumes"
Dec 03 09:34:34 crc kubenswrapper[4947]: I1203 09:34:34.083964 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:34:34 crc kubenswrapper[4947]: E1203 09:34:34.084914 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:34:48 crc kubenswrapper[4947]: I1203 09:34:48.084068 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:34:48 crc kubenswrapper[4947]: E1203 09:34:48.087190 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:35:02 crc kubenswrapper[4947]: I1203 09:35:02.083788 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:35:02 crc kubenswrapper[4947]: E1203 09:35:02.084643 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:35:15 crc kubenswrapper[4947]: I1203 09:35:15.083704 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:35:15 crc kubenswrapper[4947]: E1203 09:35:15.084827 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:35:26 crc kubenswrapper[4947]: I1203 09:35:26.085709 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:35:26 crc kubenswrapper[4947]: E1203 09:35:26.086540 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:35:40 crc kubenswrapper[4947]: I1203 09:35:40.084288 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:35:40 crc kubenswrapper[4947]: E1203 09:35:40.085576 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:35:51 crc kubenswrapper[4947]: I1203 09:35:51.084067 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:35:51 crc kubenswrapper[4947]: E1203 09:35:51.084830 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:36:02 crc kubenswrapper[4947]: I1203 09:36:02.083645 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:36:02 crc kubenswrapper[4947]: E1203 09:36:02.084920 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:36:13 crc kubenswrapper[4947]: I1203 09:36:13.086419 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:36:13 crc kubenswrapper[4947]: E1203 09:36:13.087425 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:36:27 crc kubenswrapper[4947]: I1203 09:36:27.083645 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:36:27 crc kubenswrapper[4947]: E1203 09:36:27.084470 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:36:38 crc kubenswrapper[4947]: I1203 09:36:38.083240 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:36:38 crc kubenswrapper[4947]: E1203 09:36:38.084042 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:36:53 crc kubenswrapper[4947]: I1203 09:36:53.083637 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:36:53 crc kubenswrapper[4947]: E1203 09:36:53.085263 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:37:06 crc kubenswrapper[4947]: I1203 09:37:06.083036 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:37:06 crc kubenswrapper[4947]: E1203 09:37:06.085390 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:37:21 crc kubenswrapper[4947]: I1203 09:37:21.083586 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:37:21 crc kubenswrapper[4947]: E1203 09:37:21.084617 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:37:32 crc kubenswrapper[4947]: I1203 09:37:32.083396 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:37:32 crc kubenswrapper[4947]: E1203 09:37:32.084088 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.463950 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l8q7n"]
Dec 03 09:37:40 crc kubenswrapper[4947]: E1203 09:37:40.464933 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerName="extract-content"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.465324 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerName="extract-content"
Dec 03 09:37:40 crc kubenswrapper[4947]: E1203 09:37:40.465338 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerName="registry-server"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.465346 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerName="registry-server"
Dec 03 09:37:40 crc kubenswrapper[4947]: E1203 09:37:40.465399 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerName="extract-utilities"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.465409 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerName="extract-utilities"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.465691 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7696a3f-6f14-4701-8e1e-ab42cd24e472" containerName="registry-server"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.467846 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.487437 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8q7n"]
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.531738 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-utilities\") pod \"redhat-marketplace-l8q7n\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") " pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.532425 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-catalog-content\") pod \"redhat-marketplace-l8q7n\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") " pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.533065 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5cz9\" (UniqueName: \"kubernetes.io/projected/0f6ae272-f07d-4462-aca5-0a880853cc49-kube-api-access-b5cz9\") pod \"redhat-marketplace-l8q7n\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") " pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.635411 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5cz9\" (UniqueName: \"kubernetes.io/projected/0f6ae272-f07d-4462-aca5-0a880853cc49-kube-api-access-b5cz9\") pod \"redhat-marketplace-l8q7n\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") " pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.635636 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-utilities\") pod \"redhat-marketplace-l8q7n\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") " pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.635661 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-catalog-content\") pod \"redhat-marketplace-l8q7n\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") " pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.636125 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-catalog-content\") pod \"redhat-marketplace-l8q7n\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") " pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.636533 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-utilities\") pod \"redhat-marketplace-l8q7n\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") " pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.660977 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5cz9\" (UniqueName: \"kubernetes.io/projected/0f6ae272-f07d-4462-aca5-0a880853cc49-kube-api-access-b5cz9\") pod \"redhat-marketplace-l8q7n\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") " pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:40 crc kubenswrapper[4947]: I1203 09:37:40.802148 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:41 crc kubenswrapper[4947]: I1203 09:37:41.334578 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8q7n"]
Dec 03 09:37:42 crc kubenswrapper[4947]: I1203 09:37:42.039573 4947 generic.go:334] "Generic (PLEG): container finished" podID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerID="74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c" exitCode=0
Dec 03 09:37:42 crc kubenswrapper[4947]: I1203 09:37:42.039967 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8q7n" event={"ID":"0f6ae272-f07d-4462-aca5-0a880853cc49","Type":"ContainerDied","Data":"74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c"}
Dec 03 09:37:42 crc kubenswrapper[4947]: I1203 09:37:42.040008 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8q7n" event={"ID":"0f6ae272-f07d-4462-aca5-0a880853cc49","Type":"ContainerStarted","Data":"67848040e22bd5cbbc37e01b1202f95515ae02195eaece442bf0662f3ff9260c"}
Dec 03 09:37:44 crc kubenswrapper[4947]: I1203 09:37:44.064936 4947 generic.go:334] "Generic (PLEG): container finished" podID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerID="3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d" exitCode=0
Dec 03 09:37:44 crc kubenswrapper[4947]: I1203 09:37:44.065016 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8q7n" event={"ID":"0f6ae272-f07d-4462-aca5-0a880853cc49","Type":"ContainerDied","Data":"3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d"}
Dec 03 09:37:45 crc kubenswrapper[4947]: I1203 09:37:45.077751 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8q7n" event={"ID":"0f6ae272-f07d-4462-aca5-0a880853cc49","Type":"ContainerStarted","Data":"409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087"}
Dec 03 09:37:45 crc kubenswrapper[4947]: I1203 09:37:45.083520 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f"
Dec 03 09:37:45 crc kubenswrapper[4947]: E1203 09:37:45.083804 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e"
Dec 03 09:37:45 crc kubenswrapper[4947]: I1203 09:37:45.102185 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l8q7n" podStartSLOduration=2.494171407 podStartE2EDuration="5.102164133s" podCreationTimestamp="2025-12-03 09:37:40 +0000 UTC" firstStartedPulling="2025-12-03 09:37:42.047970929 +0000 UTC m=+10123.308925355" lastFinishedPulling="2025-12-03 09:37:44.655963655 +0000 UTC m=+10125.916918081" observedRunningTime="2025-12-03 09:37:45.100664553 +0000 UTC m=+10126.361618989" watchObservedRunningTime="2025-12-03 09:37:45.102164133 +0000 UTC m=+10126.363118569"
Dec 03 09:37:50 crc kubenswrapper[4947]: I1203 09:37:50.803284 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:50 crc kubenswrapper[4947]: I1203 09:37:50.803760 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:50 crc kubenswrapper[4947]: I1203 09:37:50.850245 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:51 crc kubenswrapper[4947]: I1203 09:37:51.193629 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:51 crc kubenswrapper[4947]: I1203 09:37:51.249107 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8q7n"]
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.162711 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l8q7n" podUID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerName="registry-server" containerID="cri-o://409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087" gracePeriod=2
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.761414 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8q7n"
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.814223 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-catalog-content\") pod \"0f6ae272-f07d-4462-aca5-0a880853cc49\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") "
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.814402 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5cz9\" (UniqueName: \"kubernetes.io/projected/0f6ae272-f07d-4462-aca5-0a880853cc49-kube-api-access-b5cz9\") pod \"0f6ae272-f07d-4462-aca5-0a880853cc49\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") "
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.814464 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-utilities\") pod \"0f6ae272-f07d-4462-aca5-0a880853cc49\" (UID: \"0f6ae272-f07d-4462-aca5-0a880853cc49\") "
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.815806 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-utilities" (OuterVolumeSpecName: "utilities") pod "0f6ae272-f07d-4462-aca5-0a880853cc49" (UID: "0f6ae272-f07d-4462-aca5-0a880853cc49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.821388 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6ae272-f07d-4462-aca5-0a880853cc49-kube-api-access-b5cz9" (OuterVolumeSpecName: "kube-api-access-b5cz9") pod "0f6ae272-f07d-4462-aca5-0a880853cc49" (UID: "0f6ae272-f07d-4462-aca5-0a880853cc49"). InnerVolumeSpecName "kube-api-access-b5cz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.833472 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f6ae272-f07d-4462-aca5-0a880853cc49" (UID: "0f6ae272-f07d-4462-aca5-0a880853cc49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.916523 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5cz9\" (UniqueName: \"kubernetes.io/projected/0f6ae272-f07d-4462-aca5-0a880853cc49-kube-api-access-b5cz9\") on node \"crc\" DevicePath \"\""
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.916569 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 09:37:53 crc kubenswrapper[4947]: I1203 09:37:53.916579 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f6ae272-f07d-4462-aca5-0a880853cc49-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.178182 4947 generic.go:334] "Generic (PLEG): container finished" podID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerID="409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087" exitCode=0
Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.178235 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8q7n" event={"ID":"0f6ae272-f07d-4462-aca5-0a880853cc49","Type":"ContainerDied","Data":"409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087"}
Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.178266 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8q7n" event={"ID":"0f6ae272-f07d-4462-aca5-0a880853cc49","Type":"ContainerDied","Data":"67848040e22bd5cbbc37e01b1202f95515ae02195eaece442bf0662f3ff9260c"}
Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.178288 4947 scope.go:117] "RemoveContainer" containerID="409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087"
Dec 03 09:37:54 crc kubenswrapper[4947]: I1203
09:37:54.178316 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8q7n" Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.212630 4947 scope.go:117] "RemoveContainer" containerID="3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d" Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.222560 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8q7n"] Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.236303 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8q7n"] Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.252768 4947 scope.go:117] "RemoveContainer" containerID="74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c" Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.311680 4947 scope.go:117] "RemoveContainer" containerID="409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087" Dec 03 09:37:54 crc kubenswrapper[4947]: E1203 09:37:54.312325 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087\": container with ID starting with 409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087 not found: ID does not exist" containerID="409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087" Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.312366 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087"} err="failed to get container status \"409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087\": rpc error: code = NotFound desc = could not find container \"409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087\": container with ID starting with 
409d7961cf95c8978d58ca7fe54e0f6513823f57c3e90bbc42e405c418fa7087 not found: ID does not exist" Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.312390 4947 scope.go:117] "RemoveContainer" containerID="3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d" Dec 03 09:37:54 crc kubenswrapper[4947]: E1203 09:37:54.313089 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d\": container with ID starting with 3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d not found: ID does not exist" containerID="3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d" Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.313131 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d"} err="failed to get container status \"3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d\": rpc error: code = NotFound desc = could not find container \"3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d\": container with ID starting with 3daca8c3c5a54f41bc209dc0f78a98361f49a33fff554e9b65f80a7a89b8011d not found: ID does not exist" Dec 03 09:37:54 crc kubenswrapper[4947]: I1203 09:37:54.313150 4947 scope.go:117] "RemoveContainer" containerID="74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c" Dec 03 09:37:54 crc kubenswrapper[4947]: E1203 09:37:54.313580 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c\": container with ID starting with 74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c not found: ID does not exist" containerID="74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c" Dec 03 09:37:54 crc 
kubenswrapper[4947]: I1203 09:37:54.313631 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c"} err="failed to get container status \"74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c\": rpc error: code = NotFound desc = could not find container \"74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c\": container with ID starting with 74cd955a660ed68a9f8fd614e3c7708e3e5fb0f5c47e0b1416a15d5b925c145c not found: ID does not exist" Dec 03 09:37:55 crc kubenswrapper[4947]: I1203 09:37:55.102871 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6ae272-f07d-4462-aca5-0a880853cc49" path="/var/lib/kubelet/pods/0f6ae272-f07d-4462-aca5-0a880853cc49/volumes" Dec 03 09:37:57 crc kubenswrapper[4947]: I1203 09:37:57.083090 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" Dec 03 09:37:57 crc kubenswrapper[4947]: E1203 09:37:57.083601 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:38:02 crc kubenswrapper[4947]: I1203 09:38:02.272434 4947 generic.go:334] "Generic (PLEG): container finished" podID="1207aa81-9a4e-4189-b19c-65b393c24b4c" containerID="3b4f891f6f665f461eb62eb356b391d3823dc0cf4aad0f84492f1e8d91a9f2fe" exitCode=0 Dec 03 09:38:02 crc kubenswrapper[4947]: I1203 09:38:02.272677 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" 
event={"ID":"1207aa81-9a4e-4189-b19c-65b393c24b4c","Type":"ContainerDied","Data":"3b4f891f6f665f461eb62eb356b391d3823dc0cf4aad0f84492f1e8d91a9f2fe"} Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.746334 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.831713 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-combined-ca-bundle\") pod \"1207aa81-9a4e-4189-b19c-65b393c24b4c\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.831955 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-secret-0\") pod \"1207aa81-9a4e-4189-b19c-65b393c24b4c\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.832089 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvr4z\" (UniqueName: \"kubernetes.io/projected/1207aa81-9a4e-4189-b19c-65b393c24b4c-kube-api-access-mvr4z\") pod \"1207aa81-9a4e-4189-b19c-65b393c24b4c\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.832111 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-ssh-key\") pod \"1207aa81-9a4e-4189-b19c-65b393c24b4c\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.832139 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-inventory\") pod \"1207aa81-9a4e-4189-b19c-65b393c24b4c\" (UID: \"1207aa81-9a4e-4189-b19c-65b393c24b4c\") " Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.837527 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1207aa81-9a4e-4189-b19c-65b393c24b4c" (UID: "1207aa81-9a4e-4189-b19c-65b393c24b4c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.837538 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1207aa81-9a4e-4189-b19c-65b393c24b4c-kube-api-access-mvr4z" (OuterVolumeSpecName: "kube-api-access-mvr4z") pod "1207aa81-9a4e-4189-b19c-65b393c24b4c" (UID: "1207aa81-9a4e-4189-b19c-65b393c24b4c"). InnerVolumeSpecName "kube-api-access-mvr4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.860749 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1207aa81-9a4e-4189-b19c-65b393c24b4c" (UID: "1207aa81-9a4e-4189-b19c-65b393c24b4c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.861120 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-inventory" (OuterVolumeSpecName: "inventory") pod "1207aa81-9a4e-4189-b19c-65b393c24b4c" (UID: "1207aa81-9a4e-4189-b19c-65b393c24b4c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.871215 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "1207aa81-9a4e-4189-b19c-65b393c24b4c" (UID: "1207aa81-9a4e-4189-b19c-65b393c24b4c"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.934842 4947 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.934882 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvr4z\" (UniqueName: \"kubernetes.io/projected/1207aa81-9a4e-4189-b19c-65b393c24b4c-kube-api-access-mvr4z\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.934892 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.934902 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:03 crc kubenswrapper[4947]: I1203 09:38:03.934910 4947 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1207aa81-9a4e-4189-b19c-65b393c24b4c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.298704 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" event={"ID":"1207aa81-9a4e-4189-b19c-65b393c24b4c","Type":"ContainerDied","Data":"5d03461612e71eec556eda29f681339bc87240f8c078785fe71e77779aab90c1"} Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.298872 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-6dmhf" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.299358 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d03461612e71eec556eda29f681339bc87240f8c078785fe71e77779aab90c1" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.383764 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-qz8md"] Dec 03 09:38:04 crc kubenswrapper[4947]: E1203 09:38:04.384685 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1207aa81-9a4e-4189-b19c-65b393c24b4c" containerName="libvirt-openstack-openstack-cell1" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.384703 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1207aa81-9a4e-4189-b19c-65b393c24b4c" containerName="libvirt-openstack-openstack-cell1" Dec 03 09:38:04 crc kubenswrapper[4947]: E1203 09:38:04.384734 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerName="extract-content" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.384741 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerName="extract-content" Dec 03 09:38:04 crc kubenswrapper[4947]: E1203 09:38:04.384793 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerName="extract-utilities" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.384800 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6ae272-f07d-4462-aca5-0a880853cc49" 
containerName="extract-utilities" Dec 03 09:38:04 crc kubenswrapper[4947]: E1203 09:38:04.384814 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerName="registry-server" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.384821 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerName="registry-server" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.385051 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6ae272-f07d-4462-aca5-0a880853cc49" containerName="registry-server" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.385073 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1207aa81-9a4e-4189-b19c-65b393c24b4c" containerName="libvirt-openstack-openstack-cell1" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.385806 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.388169 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.388254 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.388270 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.388171 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.392952 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.401174 4947 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-qz8md"] Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.447029 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.447309 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.447417 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-inventory\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.447537 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.447631 4947 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbzs\" (UniqueName: \"kubernetes.io/projected/fad89668-b52f-4a0c-a347-08242c3d566d-kube-api-access-jdbzs\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.447707 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.447822 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.448312 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.448470 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.550580 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.550672 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.550711 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.550786 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.550838 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.550867 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-inventory\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.550917 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.550947 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbzs\" (UniqueName: \"kubernetes.io/projected/fad89668-b52f-4a0c-a347-08242c3d566d-kube-api-access-jdbzs\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.550970 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.551978 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.554811 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.554841 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.554937 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.561842 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.563615 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.565234 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.570967 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-inventory\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.571149 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbzs\" (UniqueName: \"kubernetes.io/projected/fad89668-b52f-4a0c-a347-08242c3d566d-kube-api-access-jdbzs\") pod \"nova-cell1-openstack-openstack-cell1-qz8md\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:04 crc kubenswrapper[4947]: I1203 09:38:04.704617 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:38:05 crc kubenswrapper[4947]: I1203 09:38:05.332832 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-qz8md"] Dec 03 09:38:06 crc kubenswrapper[4947]: I1203 09:38:06.319468 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" event={"ID":"fad89668-b52f-4a0c-a347-08242c3d566d","Type":"ContainerStarted","Data":"1777e197926b29970de157072ea8fbadfda4e3f0b4ba5055c9bbd257dc3a5858"} Dec 03 09:38:06 crc kubenswrapper[4947]: I1203 09:38:06.319557 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" event={"ID":"fad89668-b52f-4a0c-a347-08242c3d566d","Type":"ContainerStarted","Data":"ebef9ad0a8a4e14836f67a8525b5d8bd2334d5980001f03fd6983824d8528f28"} Dec 03 09:38:06 crc kubenswrapper[4947]: I1203 09:38:06.343874 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" podStartSLOduration=1.8566661089999998 podStartE2EDuration="2.343858555s" podCreationTimestamp="2025-12-03 09:38:04 +0000 UTC" firstStartedPulling="2025-12-03 09:38:05.336747039 +0000 UTC m=+10146.597701465" lastFinishedPulling="2025-12-03 09:38:05.823939465 +0000 UTC m=+10147.084893911" observedRunningTime="2025-12-03 09:38:06.336525177 +0000 UTC m=+10147.597479603" watchObservedRunningTime="2025-12-03 09:38:06.343858555 +0000 UTC m=+10147.604812971" Dec 03 09:38:09 crc kubenswrapper[4947]: I1203 09:38:09.353407 4947 generic.go:334] "Generic (PLEG): container finished" podID="9ea8f5d8-aea6-44e9-995f-537fd9e2f655" containerID="a867e1c5ba54752a1d411bea8ca4cfa157a66ab87ebb7c8ff6a28de3fe751f5e" exitCode=0 
Dec 03 09:38:09 crc kubenswrapper[4947]: I1203 09:38:09.353531 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell2-knn2l" event={"ID":"9ea8f5d8-aea6-44e9-995f-537fd9e2f655","Type":"ContainerDied","Data":"a867e1c5ba54752a1d411bea8ca4cfa157a66ab87ebb7c8ff6a28de3fe751f5e"} Dec 03 09:38:10 crc kubenswrapper[4947]: I1203 09:38:10.894719 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.011988 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-combined-ca-bundle\") pod \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.012193 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sc8v\" (UniqueName: \"kubernetes.io/projected/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-kube-api-access-9sc8v\") pod \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.012232 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-secret-0\") pod \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.012289 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-ssh-key\") pod \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " Dec 03 09:38:11 crc kubenswrapper[4947]: 
I1203 09:38:11.012319 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-inventory\") pod \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\" (UID: \"9ea8f5d8-aea6-44e9-995f-537fd9e2f655\") " Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.018156 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "9ea8f5d8-aea6-44e9-995f-537fd9e2f655" (UID: "9ea8f5d8-aea6-44e9-995f-537fd9e2f655"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.018159 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-kube-api-access-9sc8v" (OuterVolumeSpecName: "kube-api-access-9sc8v") pod "9ea8f5d8-aea6-44e9-995f-537fd9e2f655" (UID: "9ea8f5d8-aea6-44e9-995f-537fd9e2f655"). InnerVolumeSpecName "kube-api-access-9sc8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.042822 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "9ea8f5d8-aea6-44e9-995f-537fd9e2f655" (UID: "9ea8f5d8-aea6-44e9-995f-537fd9e2f655"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.044015 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9ea8f5d8-aea6-44e9-995f-537fd9e2f655" (UID: "9ea8f5d8-aea6-44e9-995f-537fd9e2f655"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.047154 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-inventory" (OuterVolumeSpecName: "inventory") pod "9ea8f5d8-aea6-44e9-995f-537fd9e2f655" (UID: "9ea8f5d8-aea6-44e9-995f-537fd9e2f655"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.083358 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.115145 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sc8v\" (UniqueName: \"kubernetes.io/projected/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-kube-api-access-9sc8v\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.115307 4947 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.115773 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.115797 4947 reconciler_common.go:293] "Volume detached for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.115809 4947 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ea8f5d8-aea6-44e9-995f-537fd9e2f655-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.385101 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell2-knn2l" event={"ID":"9ea8f5d8-aea6-44e9-995f-537fd9e2f655","Type":"ContainerDied","Data":"c91bfdbd7caed7f79b19687d7b273117bc42c9afa04fa8476b9afbcef95654dc"} Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.385406 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c91bfdbd7caed7f79b19687d7b273117bc42c9afa04fa8476b9afbcef95654dc" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.385192 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell2-knn2l" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.394731 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"5ba261672a72367230c55655e4f606db9a224cb56edda83f0bceaf8270f61180"} Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.544186 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell2-openstack-openstack-cell2-nlwj4"] Dec 03 09:38:11 crc kubenswrapper[4947]: E1203 09:38:11.544654 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea8f5d8-aea6-44e9-995f-537fd9e2f655" containerName="libvirt-openstack-openstack-cell2" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.544671 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea8f5d8-aea6-44e9-995f-537fd9e2f655" containerName="libvirt-openstack-openstack-cell2" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.544887 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea8f5d8-aea6-44e9-995f-537fd9e2f655" containerName="libvirt-openstack-openstack-cell2" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.545653 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.551635 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.551834 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-compute-config" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.551956 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.573725 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-openstack-openstack-cell2-nlwj4"] Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.627234 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-1\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.627349 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-ssh-key\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.627373 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-inventory\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: 
\"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.627405 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-1\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.627431 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-0\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.627484 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-0\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.627518 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfhhd\" (UniqueName: \"kubernetes.io/projected/6366976b-5812-49e2-84dd-6ff069eecd14-kube-api-access-kfhhd\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.627547 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell2-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-combined-ca-bundle\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.627584 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cells-global-config-0\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.729755 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-0\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.729802 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfhhd\" (UniqueName: \"kubernetes.io/projected/6366976b-5812-49e2-84dd-6ff069eecd14-kube-api-access-kfhhd\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.729838 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell2-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-combined-ca-bundle\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.729861 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cells-global-config-0\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.730080 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-1\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.731230 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-ssh-key\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.731257 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-inventory\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.731295 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-1\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.731325 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-0\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.731127 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cells-global-config-0\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.735582 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell2-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-combined-ca-bundle\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.736181 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-ssh-key\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: 
\"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.736360 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-0\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.736602 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-1\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.739655 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-1\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.739720 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-0\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.745135 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-inventory\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.747966 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfhhd\" (UniqueName: \"kubernetes.io/projected/6366976b-5812-49e2-84dd-6ff069eecd14-kube-api-access-kfhhd\") pod \"nova-cell2-openstack-openstack-cell2-nlwj4\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:11 crc kubenswrapper[4947]: I1203 09:38:11.874821 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:38:12 crc kubenswrapper[4947]: I1203 09:38:12.509864 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-openstack-openstack-cell2-nlwj4"] Dec 03 09:38:13 crc kubenswrapper[4947]: I1203 09:38:13.422532 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" event={"ID":"6366976b-5812-49e2-84dd-6ff069eecd14","Type":"ContainerStarted","Data":"c72a9c523128aca343b2fefed11ea9eb9ae97f0179db2e0823110db18cf6026d"} Dec 03 09:38:14 crc kubenswrapper[4947]: I1203 09:38:14.433651 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" event={"ID":"6366976b-5812-49e2-84dd-6ff069eecd14","Type":"ContainerStarted","Data":"88a71ef8b546dc6f5b850349398a0c49accc4eead8b4dc1ad490b878d1cba181"} Dec 03 09:38:14 crc kubenswrapper[4947]: I1203 09:38:14.470572 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" podStartSLOduration=2.9007648169999998 podStartE2EDuration="3.470555214s" podCreationTimestamp="2025-12-03 
09:38:11 +0000 UTC" firstStartedPulling="2025-12-03 09:38:12.51550554 +0000 UTC m=+10153.776459966" lastFinishedPulling="2025-12-03 09:38:13.085295927 +0000 UTC m=+10154.346250363" observedRunningTime="2025-12-03 09:38:14.465031705 +0000 UTC m=+10155.725986131" watchObservedRunningTime="2025-12-03 09:38:14.470555214 +0000 UTC m=+10155.731509640" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.486545 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7lqcm"] Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.491412 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.502032 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lqcm"] Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.584285 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwv9t\" (UniqueName: \"kubernetes.io/projected/03f31e79-b832-4826-92ea-deaf7c98ff86-kube-api-access-dwv9t\") pod \"certified-operators-7lqcm\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.584676 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-utilities\") pod \"certified-operators-7lqcm\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.587128 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-catalog-content\") pod 
\"certified-operators-7lqcm\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.688291 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwv9t\" (UniqueName: \"kubernetes.io/projected/03f31e79-b832-4826-92ea-deaf7c98ff86-kube-api-access-dwv9t\") pod \"certified-operators-7lqcm\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.688348 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-utilities\") pod \"certified-operators-7lqcm\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.688465 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-catalog-content\") pod \"certified-operators-7lqcm\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.688990 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-catalog-content\") pod \"certified-operators-7lqcm\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.689217 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-utilities\") pod \"certified-operators-7lqcm\" (UID: 
\"03f31e79-b832-4826-92ea-deaf7c98ff86\") " pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.706825 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwv9t\" (UniqueName: \"kubernetes.io/projected/03f31e79-b832-4826-92ea-deaf7c98ff86-kube-api-access-dwv9t\") pod \"certified-operators-7lqcm\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:25 crc kubenswrapper[4947]: I1203 09:39:25.826361 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:26 crc kubenswrapper[4947]: I1203 09:39:26.435637 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7lqcm"] Dec 03 09:39:27 crc kubenswrapper[4947]: I1203 09:39:27.200950 4947 generic.go:334] "Generic (PLEG): container finished" podID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerID="30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd" exitCode=0 Dec 03 09:39:27 crc kubenswrapper[4947]: I1203 09:39:27.201068 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqcm" event={"ID":"03f31e79-b832-4826-92ea-deaf7c98ff86","Type":"ContainerDied","Data":"30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd"} Dec 03 09:39:27 crc kubenswrapper[4947]: I1203 09:39:27.203927 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqcm" event={"ID":"03f31e79-b832-4826-92ea-deaf7c98ff86","Type":"ContainerStarted","Data":"03b35dbe58c69c332371f8e27e9251ba10cffbddb543e90507d6b6bd228894cf"} Dec 03 09:39:27 crc kubenswrapper[4947]: I1203 09:39:27.204109 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:39:29 crc kubenswrapper[4947]: I1203 09:39:29.239838 4947 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqcm" event={"ID":"03f31e79-b832-4826-92ea-deaf7c98ff86","Type":"ContainerStarted","Data":"c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020"} Dec 03 09:39:30 crc kubenswrapper[4947]: I1203 09:39:30.252172 4947 generic.go:334] "Generic (PLEG): container finished" podID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerID="c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020" exitCode=0 Dec 03 09:39:30 crc kubenswrapper[4947]: I1203 09:39:30.252325 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqcm" event={"ID":"03f31e79-b832-4826-92ea-deaf7c98ff86","Type":"ContainerDied","Data":"c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020"} Dec 03 09:39:31 crc kubenswrapper[4947]: I1203 09:39:31.273238 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqcm" event={"ID":"03f31e79-b832-4826-92ea-deaf7c98ff86","Type":"ContainerStarted","Data":"7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181"} Dec 03 09:39:31 crc kubenswrapper[4947]: I1203 09:39:31.294391 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7lqcm" podStartSLOduration=2.78678786 podStartE2EDuration="6.294371847s" podCreationTimestamp="2025-12-03 09:39:25 +0000 UTC" firstStartedPulling="2025-12-03 09:39:27.203789858 +0000 UTC m=+10228.464744284" lastFinishedPulling="2025-12-03 09:39:30.711373825 +0000 UTC m=+10231.972328271" observedRunningTime="2025-12-03 09:39:31.290687997 +0000 UTC m=+10232.551642493" watchObservedRunningTime="2025-12-03 09:39:31.294371847 +0000 UTC m=+10232.555326273" Dec 03 09:39:35 crc kubenswrapper[4947]: I1203 09:39:35.827025 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 
09:39:35 crc kubenswrapper[4947]: I1203 09:39:35.827543 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:35 crc kubenswrapper[4947]: I1203 09:39:35.876054 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:36 crc kubenswrapper[4947]: I1203 09:39:36.394119 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:36 crc kubenswrapper[4947]: I1203 09:39:36.442327 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lqcm"] Dec 03 09:39:38 crc kubenswrapper[4947]: I1203 09:39:38.354412 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7lqcm" podUID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerName="registry-server" containerID="cri-o://7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181" gracePeriod=2 Dec 03 09:39:38 crc kubenswrapper[4947]: I1203 09:39:38.867458 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:38 crc kubenswrapper[4947]: I1203 09:39:38.959733 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-utilities\") pod \"03f31e79-b832-4826-92ea-deaf7c98ff86\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " Dec 03 09:39:38 crc kubenswrapper[4947]: I1203 09:39:38.959855 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwv9t\" (UniqueName: \"kubernetes.io/projected/03f31e79-b832-4826-92ea-deaf7c98ff86-kube-api-access-dwv9t\") pod \"03f31e79-b832-4826-92ea-deaf7c98ff86\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " Dec 03 09:39:38 crc kubenswrapper[4947]: I1203 09:39:38.959919 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-catalog-content\") pod \"03f31e79-b832-4826-92ea-deaf7c98ff86\" (UID: \"03f31e79-b832-4826-92ea-deaf7c98ff86\") " Dec 03 09:39:38 crc kubenswrapper[4947]: I1203 09:39:38.960559 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-utilities" (OuterVolumeSpecName: "utilities") pod "03f31e79-b832-4826-92ea-deaf7c98ff86" (UID: "03f31e79-b832-4826-92ea-deaf7c98ff86"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:39:38 crc kubenswrapper[4947]: I1203 09:39:38.968070 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f31e79-b832-4826-92ea-deaf7c98ff86-kube-api-access-dwv9t" (OuterVolumeSpecName: "kube-api-access-dwv9t") pod "03f31e79-b832-4826-92ea-deaf7c98ff86" (UID: "03f31e79-b832-4826-92ea-deaf7c98ff86"). InnerVolumeSpecName "kube-api-access-dwv9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.007315 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03f31e79-b832-4826-92ea-deaf7c98ff86" (UID: "03f31e79-b832-4826-92ea-deaf7c98ff86"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.061657 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.061697 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03f31e79-b832-4826-92ea-deaf7c98ff86-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.061710 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwv9t\" (UniqueName: \"kubernetes.io/projected/03f31e79-b832-4826-92ea-deaf7c98ff86-kube-api-access-dwv9t\") on node \"crc\" DevicePath \"\"" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.373985 4947 generic.go:334] "Generic (PLEG): container finished" podID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerID="7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181" exitCode=0 Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.374077 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7lqcm" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.374071 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqcm" event={"ID":"03f31e79-b832-4826-92ea-deaf7c98ff86","Type":"ContainerDied","Data":"7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181"} Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.374718 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7lqcm" event={"ID":"03f31e79-b832-4826-92ea-deaf7c98ff86","Type":"ContainerDied","Data":"03b35dbe58c69c332371f8e27e9251ba10cffbddb543e90507d6b6bd228894cf"} Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.374738 4947 scope.go:117] "RemoveContainer" containerID="7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.408157 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7lqcm"] Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.411738 4947 scope.go:117] "RemoveContainer" containerID="c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.426332 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7lqcm"] Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.443277 4947 scope.go:117] "RemoveContainer" containerID="30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.494574 4947 scope.go:117] "RemoveContainer" containerID="7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181" Dec 03 09:39:39 crc kubenswrapper[4947]: E1203 09:39:39.495617 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181\": container with ID starting with 7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181 not found: ID does not exist" containerID="7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.495676 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181"} err="failed to get container status \"7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181\": rpc error: code = NotFound desc = could not find container \"7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181\": container with ID starting with 7129245accdaeb1d4d5d65bfb523f85f4c108d6ebfe65c66eab468949da0b181 not found: ID does not exist" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.495710 4947 scope.go:117] "RemoveContainer" containerID="c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020" Dec 03 09:39:39 crc kubenswrapper[4947]: E1203 09:39:39.496238 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020\": container with ID starting with c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020 not found: ID does not exist" containerID="c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.496304 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020"} err="failed to get container status \"c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020\": rpc error: code = NotFound desc = could not find container \"c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020\": container with ID 
starting with c5efc5edff1ca1dd4934721d11331d08aa0dfb596d28a796cb6e45f1d6af1020 not found: ID does not exist" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.496343 4947 scope.go:117] "RemoveContainer" containerID="30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd" Dec 03 09:39:39 crc kubenswrapper[4947]: E1203 09:39:39.496749 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd\": container with ID starting with 30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd not found: ID does not exist" containerID="30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd" Dec 03 09:39:39 crc kubenswrapper[4947]: I1203 09:39:39.496773 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd"} err="failed to get container status \"30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd\": rpc error: code = NotFound desc = could not find container \"30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd\": container with ID starting with 30aca07e16640a47a624782e9608704c62a513aa3d02a9b121753f1857c022cd not found: ID does not exist" Dec 03 09:39:41 crc kubenswrapper[4947]: I1203 09:39:41.094595 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f31e79-b832-4826-92ea-deaf7c98ff86" path="/var/lib/kubelet/pods/03f31e79-b832-4826-92ea-deaf7c98ff86/volumes" Dec 03 09:40:30 crc kubenswrapper[4947]: I1203 09:40:30.086262 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:40:30 crc kubenswrapper[4947]: I1203 
09:40:30.087058 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:40:33 crc kubenswrapper[4947]: I1203 09:40:33.648601 4947 trace.go:236] Trace[104151853]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-2" (03-Dec-2025 09:40:32.213) (total time: 1435ms): Dec 03 09:40:33 crc kubenswrapper[4947]: Trace[104151853]: [1.435047712s] [1.435047712s] END Dec 03 09:41:00 crc kubenswrapper[4947]: I1203 09:41:00.086637 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:41:00 crc kubenswrapper[4947]: I1203 09:41:00.087252 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.150875 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rnjrr"] Dec 03 09:41:10 crc kubenswrapper[4947]: E1203 09:41:10.152232 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerName="extract-content" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.152249 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerName="extract-content" Dec 03 
09:41:10 crc kubenswrapper[4947]: E1203 09:41:10.152286 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerName="registry-server" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.152295 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerName="registry-server" Dec 03 09:41:10 crc kubenswrapper[4947]: E1203 09:41:10.152325 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerName="extract-utilities" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.152334 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerName="extract-utilities" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.152619 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f31e79-b832-4826-92ea-deaf7c98ff86" containerName="registry-server" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.154396 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.214039 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnjrr"] Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.269628 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjnmm\" (UniqueName: \"kubernetes.io/projected/47893688-11d8-425a-b7c0-2b8e0dcb3744-kube-api-access-mjnmm\") pod \"redhat-operators-rnjrr\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.269698 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-catalog-content\") pod \"redhat-operators-rnjrr\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.269890 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-utilities\") pod \"redhat-operators-rnjrr\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.372126 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjnmm\" (UniqueName: \"kubernetes.io/projected/47893688-11d8-425a-b7c0-2b8e0dcb3744-kube-api-access-mjnmm\") pod \"redhat-operators-rnjrr\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.372201 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-catalog-content\") pod \"redhat-operators-rnjrr\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.372258 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-utilities\") pod \"redhat-operators-rnjrr\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.372910 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-utilities\") pod \"redhat-operators-rnjrr\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.373393 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-catalog-content\") pod \"redhat-operators-rnjrr\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.392201 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjnmm\" (UniqueName: \"kubernetes.io/projected/47893688-11d8-425a-b7c0-2b8e0dcb3744-kube-api-access-mjnmm\") pod \"redhat-operators-rnjrr\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:10 crc kubenswrapper[4947]: I1203 09:41:10.510687 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:11 crc kubenswrapper[4947]: I1203 09:41:11.026066 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rnjrr"] Dec 03 09:41:11 crc kubenswrapper[4947]: I1203 09:41:11.345592 4947 generic.go:334] "Generic (PLEG): container finished" podID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerID="4ae636f57139a5b59a4fcab77f08280c2866c3f4d3be99459403cf30e5fbb943" exitCode=0 Dec 03 09:41:11 crc kubenswrapper[4947]: I1203 09:41:11.345640 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjrr" event={"ID":"47893688-11d8-425a-b7c0-2b8e0dcb3744","Type":"ContainerDied","Data":"4ae636f57139a5b59a4fcab77f08280c2866c3f4d3be99459403cf30e5fbb943"} Dec 03 09:41:11 crc kubenswrapper[4947]: I1203 09:41:11.345666 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjrr" event={"ID":"47893688-11d8-425a-b7c0-2b8e0dcb3744","Type":"ContainerStarted","Data":"7eb5a6b5ade27d868ee18230271c9e4e8fa7d0b686d217558e2197fdb6229722"} Dec 03 09:41:13 crc kubenswrapper[4947]: I1203 09:41:13.366356 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjrr" event={"ID":"47893688-11d8-425a-b7c0-2b8e0dcb3744","Type":"ContainerStarted","Data":"3032cbb0eebabd2407b8d51da91db15d5fe83f917843348049f63d5e69f6cdd1"} Dec 03 09:41:15 crc kubenswrapper[4947]: I1203 09:41:15.389412 4947 generic.go:334] "Generic (PLEG): container finished" podID="fad89668-b52f-4a0c-a347-08242c3d566d" containerID="1777e197926b29970de157072ea8fbadfda4e3f0b4ba5055c9bbd257dc3a5858" exitCode=0 Dec 03 09:41:15 crc kubenswrapper[4947]: I1203 09:41:15.389527 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" 
event={"ID":"fad89668-b52f-4a0c-a347-08242c3d566d","Type":"ContainerDied","Data":"1777e197926b29970de157072ea8fbadfda4e3f0b4ba5055c9bbd257dc3a5858"} Dec 03 09:41:16 crc kubenswrapper[4947]: I1203 09:41:16.404056 4947 generic.go:334] "Generic (PLEG): container finished" podID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerID="3032cbb0eebabd2407b8d51da91db15d5fe83f917843348049f63d5e69f6cdd1" exitCode=0 Dec 03 09:41:16 crc kubenswrapper[4947]: I1203 09:41:16.404143 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjrr" event={"ID":"47893688-11d8-425a-b7c0-2b8e0dcb3744","Type":"ContainerDied","Data":"3032cbb0eebabd2407b8d51da91db15d5fe83f917843348049f63d5e69f6cdd1"} Dec 03 09:41:16 crc kubenswrapper[4947]: I1203 09:41:16.946861 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.031362 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-inventory\") pod \"fad89668-b52f-4a0c-a347-08242c3d566d\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.031411 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdbzs\" (UniqueName: \"kubernetes.io/projected/fad89668-b52f-4a0c-a347-08242c3d566d-kube-api-access-jdbzs\") pod \"fad89668-b52f-4a0c-a347-08242c3d566d\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.031513 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-1\") pod \"fad89668-b52f-4a0c-a347-08242c3d566d\" (UID: 
\"fad89668-b52f-4a0c-a347-08242c3d566d\") " Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.031643 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-combined-ca-bundle\") pod \"fad89668-b52f-4a0c-a347-08242c3d566d\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.031683 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-0\") pod \"fad89668-b52f-4a0c-a347-08242c3d566d\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.031762 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-ssh-key\") pod \"fad89668-b52f-4a0c-a347-08242c3d566d\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.031780 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-0\") pod \"fad89668-b52f-4a0c-a347-08242c3d566d\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.031799 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cells-global-config-0\") pod \"fad89668-b52f-4a0c-a347-08242c3d566d\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.031827 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-1\") pod \"fad89668-b52f-4a0c-a347-08242c3d566d\" (UID: \"fad89668-b52f-4a0c-a347-08242c3d566d\") " Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.038034 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "fad89668-b52f-4a0c-a347-08242c3d566d" (UID: "fad89668-b52f-4a0c-a347-08242c3d566d"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.047836 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fad89668-b52f-4a0c-a347-08242c3d566d-kube-api-access-jdbzs" (OuterVolumeSpecName: "kube-api-access-jdbzs") pod "fad89668-b52f-4a0c-a347-08242c3d566d" (UID: "fad89668-b52f-4a0c-a347-08242c3d566d"). InnerVolumeSpecName "kube-api-access-jdbzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.061713 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-inventory" (OuterVolumeSpecName: "inventory") pod "fad89668-b52f-4a0c-a347-08242c3d566d" (UID: "fad89668-b52f-4a0c-a347-08242c3d566d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.062643 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "fad89668-b52f-4a0c-a347-08242c3d566d" (UID: "fad89668-b52f-4a0c-a347-08242c3d566d"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.067355 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "fad89668-b52f-4a0c-a347-08242c3d566d" (UID: "fad89668-b52f-4a0c-a347-08242c3d566d"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.067724 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fad89668-b52f-4a0c-a347-08242c3d566d" (UID: "fad89668-b52f-4a0c-a347-08242c3d566d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.068179 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "fad89668-b52f-4a0c-a347-08242c3d566d" (UID: "fad89668-b52f-4a0c-a347-08242c3d566d"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.073153 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "fad89668-b52f-4a0c-a347-08242c3d566d" (UID: "fad89668-b52f-4a0c-a347-08242c3d566d"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.074628 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "fad89668-b52f-4a0c-a347-08242c3d566d" (UID: "fad89668-b52f-4a0c-a347-08242c3d566d"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.133849 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.133896 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.133910 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.133923 4947 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.133935 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.133949 4947 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.133960 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.133973 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdbzs\" (UniqueName: \"kubernetes.io/projected/fad89668-b52f-4a0c-a347-08242c3d566d-kube-api-access-jdbzs\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.133984 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/fad89668-b52f-4a0c-a347-08242c3d566d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.418242 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" event={"ID":"fad89668-b52f-4a0c-a347-08242c3d566d","Type":"ContainerDied","Data":"ebef9ad0a8a4e14836f67a8525b5d8bd2334d5980001f03fd6983824d8528f28"} Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.418290 4947 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ebef9ad0a8a4e14836f67a8525b5d8bd2334d5980001f03fd6983824d8528f28" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.418339 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-qz8md" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.700445 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-49rkq"] Dec 03 09:41:17 crc kubenswrapper[4947]: E1203 09:41:17.701332 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fad89668-b52f-4a0c-a347-08242c3d566d" containerName="nova-cell1-openstack-openstack-cell1" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.701359 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fad89668-b52f-4a0c-a347-08242c3d566d" containerName="nova-cell1-openstack-openstack-cell1" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.701633 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fad89668-b52f-4a0c-a347-08242c3d566d" containerName="nova-cell1-openstack-openstack-cell1" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.702612 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.707622 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.708439 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.709652 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.719973 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-49rkq"] Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.850280 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-inventory\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.850818 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfqzd\" (UniqueName: \"kubernetes.io/projected/50db48fc-571d-4587-a07e-64ed238f82a9-kube-api-access-bfqzd\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.851079 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-2\") pod 
\"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.851296 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ssh-key\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.851556 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.851790 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.852114 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc 
kubenswrapper[4947]: I1203 09:41:17.954753 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-inventory\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.954933 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfqzd\" (UniqueName: \"kubernetes.io/projected/50db48fc-571d-4587-a07e-64ed238f82a9-kube-api-access-bfqzd\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.955680 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.956348 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ssh-key\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.956462 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: 
\"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.956650 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.956909 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.959679 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-inventory\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.960204 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.960603 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.961723 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ssh-key\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.962228 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.964731 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:17 crc kubenswrapper[4947]: I1203 09:41:17.978276 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfqzd\" (UniqueName: \"kubernetes.io/projected/50db48fc-571d-4587-a07e-64ed238f82a9-kube-api-access-bfqzd\") pod \"telemetry-openstack-openstack-cell1-49rkq\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 
09:41:18 crc kubenswrapper[4947]: I1203 09:41:18.029463 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:41:18 crc kubenswrapper[4947]: I1203 09:41:18.444725 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjrr" event={"ID":"47893688-11d8-425a-b7c0-2b8e0dcb3744","Type":"ContainerStarted","Data":"9a9e3a309284178712a6a69916abdebf8ae27e4a4190875879a37d7d47ee4278"} Dec 03 09:41:18 crc kubenswrapper[4947]: I1203 09:41:18.484237 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rnjrr" podStartSLOduration=1.814172358 podStartE2EDuration="8.484211162s" podCreationTimestamp="2025-12-03 09:41:10 +0000 UTC" firstStartedPulling="2025-12-03 09:41:11.347975459 +0000 UTC m=+10332.608929885" lastFinishedPulling="2025-12-03 09:41:18.018014253 +0000 UTC m=+10339.278968689" observedRunningTime="2025-12-03 09:41:18.470344467 +0000 UTC m=+10339.731298913" watchObservedRunningTime="2025-12-03 09:41:18.484211162 +0000 UTC m=+10339.745165588" Dec 03 09:41:18 crc kubenswrapper[4947]: W1203 09:41:18.653976 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50db48fc_571d_4587_a07e_64ed238f82a9.slice/crio-dcc433a2d3b734394bca37edf893739d0166c3eb3e1978961e471b22f77a7b8a WatchSource:0}: Error finding container dcc433a2d3b734394bca37edf893739d0166c3eb3e1978961e471b22f77a7b8a: Status 404 returned error can't find the container with id dcc433a2d3b734394bca37edf893739d0166c3eb3e1978961e471b22f77a7b8a Dec 03 09:41:18 crc kubenswrapper[4947]: I1203 09:41:18.657364 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-49rkq"] Dec 03 09:41:19 crc kubenswrapper[4947]: I1203 09:41:19.459210 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-openstack-openstack-cell1-49rkq" event={"ID":"50db48fc-571d-4587-a07e-64ed238f82a9","Type":"ContainerStarted","Data":"4e6a75d8e528772c0ed38e4a3446a99e51472b3dcb70f6131b217bf491da8169"} Dec 03 09:41:19 crc kubenswrapper[4947]: I1203 09:41:19.459612 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-49rkq" event={"ID":"50db48fc-571d-4587-a07e-64ed238f82a9","Type":"ContainerStarted","Data":"dcc433a2d3b734394bca37edf893739d0166c3eb3e1978961e471b22f77a7b8a"} Dec 03 09:41:19 crc kubenswrapper[4947]: I1203 09:41:19.482440 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-49rkq" podStartSLOduration=2.014352798 podStartE2EDuration="2.482423037s" podCreationTimestamp="2025-12-03 09:41:17 +0000 UTC" firstStartedPulling="2025-12-03 09:41:18.655814476 +0000 UTC m=+10339.916768902" lastFinishedPulling="2025-12-03 09:41:19.123884715 +0000 UTC m=+10340.384839141" observedRunningTime="2025-12-03 09:41:19.476907508 +0000 UTC m=+10340.737861934" watchObservedRunningTime="2025-12-03 09:41:19.482423037 +0000 UTC m=+10340.743377463" Dec 03 09:41:20 crc kubenswrapper[4947]: I1203 09:41:20.511211 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:20 crc kubenswrapper[4947]: I1203 09:41:20.511461 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:21 crc kubenswrapper[4947]: I1203 09:41:21.566111 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rnjrr" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerName="registry-server" probeResult="failure" output=< Dec 03 09:41:21 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 09:41:21 crc kubenswrapper[4947]: > Dec 03 09:41:22 crc 
kubenswrapper[4947]: I1203 09:41:22.490347 4947 generic.go:334] "Generic (PLEG): container finished" podID="6366976b-5812-49e2-84dd-6ff069eecd14" containerID="88a71ef8b546dc6f5b850349398a0c49accc4eead8b4dc1ad490b878d1cba181" exitCode=0 Dec 03 09:41:22 crc kubenswrapper[4947]: I1203 09:41:22.490397 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" event={"ID":"6366976b-5812-49e2-84dd-6ff069eecd14","Type":"ContainerDied","Data":"88a71ef8b546dc6f5b850349398a0c49accc4eead8b4dc1ad490b878d1cba181"} Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.035437 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.099190 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-1\") pod \"6366976b-5812-49e2-84dd-6ff069eecd14\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.099283 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-0\") pod \"6366976b-5812-49e2-84dd-6ff069eecd14\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.099368 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfhhd\" (UniqueName: \"kubernetes.io/projected/6366976b-5812-49e2-84dd-6ff069eecd14-kube-api-access-kfhhd\") pod \"6366976b-5812-49e2-84dd-6ff069eecd14\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.099413 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-0\") pod \"6366976b-5812-49e2-84dd-6ff069eecd14\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.099594 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell2-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-combined-ca-bundle\") pod \"6366976b-5812-49e2-84dd-6ff069eecd14\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.099635 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-ssh-key\") pod \"6366976b-5812-49e2-84dd-6ff069eecd14\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.099688 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cells-global-config-0\") pod \"6366976b-5812-49e2-84dd-6ff069eecd14\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.099728 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-1\") pod \"6366976b-5812-49e2-84dd-6ff069eecd14\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.099775 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-inventory\") pod \"6366976b-5812-49e2-84dd-6ff069eecd14\" (UID: \"6366976b-5812-49e2-84dd-6ff069eecd14\") " Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.106560 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6366976b-5812-49e2-84dd-6ff069eecd14-kube-api-access-kfhhd" (OuterVolumeSpecName: "kube-api-access-kfhhd") pod "6366976b-5812-49e2-84dd-6ff069eecd14" (UID: "6366976b-5812-49e2-84dd-6ff069eecd14"). InnerVolumeSpecName "kube-api-access-kfhhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.120784 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell2-combined-ca-bundle") pod "6366976b-5812-49e2-84dd-6ff069eecd14" (UID: "6366976b-5812-49e2-84dd-6ff069eecd14"). InnerVolumeSpecName "nova-cell2-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.130465 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "6366976b-5812-49e2-84dd-6ff069eecd14" (UID: "6366976b-5812-49e2-84dd-6ff069eecd14"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.139297 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6366976b-5812-49e2-84dd-6ff069eecd14" (UID: "6366976b-5812-49e2-84dd-6ff069eecd14"). 
InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.139868 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-inventory" (OuterVolumeSpecName: "inventory") pod "6366976b-5812-49e2-84dd-6ff069eecd14" (UID: "6366976b-5812-49e2-84dd-6ff069eecd14"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.147287 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-0" (OuterVolumeSpecName: "nova-cell2-compute-config-0") pod "6366976b-5812-49e2-84dd-6ff069eecd14" (UID: "6366976b-5812-49e2-84dd-6ff069eecd14"). InnerVolumeSpecName "nova-cell2-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.153056 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6366976b-5812-49e2-84dd-6ff069eecd14" (UID: "6366976b-5812-49e2-84dd-6ff069eecd14"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.170659 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-1" (OuterVolumeSpecName: "nova-cell2-compute-config-1") pod "6366976b-5812-49e2-84dd-6ff069eecd14" (UID: "6366976b-5812-49e2-84dd-6ff069eecd14"). InnerVolumeSpecName "nova-cell2-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.173560 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6366976b-5812-49e2-84dd-6ff069eecd14" (UID: "6366976b-5812-49e2-84dd-6ff069eecd14"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.201999 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.202039 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.202050 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.202062 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.202074 4947 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.202085 4947 reconciler_common.go:293] "Volume 
detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.202096 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfhhd\" (UniqueName: \"kubernetes.io/projected/6366976b-5812-49e2-84dd-6ff069eecd14-kube-api-access-kfhhd\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.202105 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.202116 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell2-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6366976b-5812-49e2-84dd-6ff069eecd14-nova-cell2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.524663 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" event={"ID":"6366976b-5812-49e2-84dd-6ff069eecd14","Type":"ContainerDied","Data":"c72a9c523128aca343b2fefed11ea9eb9ae97f0179db2e0823110db18cf6026d"} Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.525041 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c72a9c523128aca343b2fefed11ea9eb9ae97f0179db2e0823110db18cf6026d" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.524711 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-openstack-openstack-cell2-nlwj4" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.611820 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell2-sbspq"] Dec 03 09:41:24 crc kubenswrapper[4947]: E1203 09:41:24.612375 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6366976b-5812-49e2-84dd-6ff069eecd14" containerName="nova-cell2-openstack-openstack-cell2" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.612395 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6366976b-5812-49e2-84dd-6ff069eecd14" containerName="nova-cell2-openstack-openstack-cell2" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.613110 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6366976b-5812-49e2-84dd-6ff069eecd14" containerName="nova-cell2-openstack-openstack-cell2" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.614645 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.616809 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.617019 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.626483 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell2-sbspq"] Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.712466 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwcrd\" (UniqueName: \"kubernetes.io/projected/42d0e62c-8bf7-4830-a450-d8cef6764abe-kube-api-access-cwcrd\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.712574 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-inventory\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.712638 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 
09:41:24.712677 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ssh-key\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.712741 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.712777 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.712816 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.814800 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.814843 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ssh-key\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.814910 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.814958 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.814998 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 
09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.815069 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwcrd\" (UniqueName: \"kubernetes.io/projected/42d0e62c-8bf7-4830-a450-d8cef6764abe-kube-api-access-cwcrd\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.815112 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-inventory\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.838421 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.838451 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ssh-key\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.839176 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-inventory\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " 
pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.839192 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.844091 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.845134 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.854526 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwcrd\" (UniqueName: \"kubernetes.io/projected/42d0e62c-8bf7-4830-a450-d8cef6764abe-kube-api-access-cwcrd\") pod \"telemetry-openstack-openstack-cell2-sbspq\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:24 crc kubenswrapper[4947]: I1203 09:41:24.912597 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:41:25 crc kubenswrapper[4947]: I1203 09:41:25.586044 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell2-sbspq"] Dec 03 09:41:26 crc kubenswrapper[4947]: I1203 09:41:26.551434 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell2-sbspq" event={"ID":"42d0e62c-8bf7-4830-a450-d8cef6764abe","Type":"ContainerStarted","Data":"0c51c3df534216ccd7bdf5a0b938863b23df17263f7a2ee88bf16549e7444552"} Dec 03 09:41:26 crc kubenswrapper[4947]: I1203 09:41:26.552069 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell2-sbspq" event={"ID":"42d0e62c-8bf7-4830-a450-d8cef6764abe","Type":"ContainerStarted","Data":"d29b30049394774d1366af4c325210ab63437a21ab6f637594187951fd72c664"} Dec 03 09:41:26 crc kubenswrapper[4947]: I1203 09:41:26.592258 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell2-sbspq" podStartSLOduration=2.1825867150000002 podStartE2EDuration="2.592231897s" podCreationTimestamp="2025-12-03 09:41:24 +0000 UTC" firstStartedPulling="2025-12-03 09:41:25.581125104 +0000 UTC m=+10346.842079530" lastFinishedPulling="2025-12-03 09:41:25.990770286 +0000 UTC m=+10347.251724712" observedRunningTime="2025-12-03 09:41:26.574052246 +0000 UTC m=+10347.835006692" watchObservedRunningTime="2025-12-03 09:41:26.592231897 +0000 UTC m=+10347.853186333" Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.086429 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.086820 4947 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.086873 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.087832 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ba261672a72367230c55655e4f606db9a224cb56edda83f0bceaf8270f61180"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.087904 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://5ba261672a72367230c55655e4f606db9a224cb56edda83f0bceaf8270f61180" gracePeriod=600 Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.564408 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.603440 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="5ba261672a72367230c55655e4f606db9a224cb56edda83f0bceaf8270f61180" exitCode=0 Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.603507 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"5ba261672a72367230c55655e4f606db9a224cb56edda83f0bceaf8270f61180"} Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.603545 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7"} Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.603565 4947 scope.go:117] "RemoveContainer" containerID="791b74edb9e8a1846604a108fa12a9b0de8b0eea8f0cde37cb14d25da81b8f9f" Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.627382 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:30 crc kubenswrapper[4947]: I1203 09:41:30.808329 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnjrr"] Dec 03 09:41:31 crc kubenswrapper[4947]: I1203 09:41:31.619261 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rnjrr" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerName="registry-server" containerID="cri-o://9a9e3a309284178712a6a69916abdebf8ae27e4a4190875879a37d7d47ee4278" gracePeriod=2 Dec 03 09:41:32 crc kubenswrapper[4947]: I1203 09:41:32.640849 4947 generic.go:334] "Generic (PLEG): container finished" podID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerID="9a9e3a309284178712a6a69916abdebf8ae27e4a4190875879a37d7d47ee4278" exitCode=0 Dec 03 09:41:32 crc kubenswrapper[4947]: I1203 09:41:32.641200 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjrr" event={"ID":"47893688-11d8-425a-b7c0-2b8e0dcb3744","Type":"ContainerDied","Data":"9a9e3a309284178712a6a69916abdebf8ae27e4a4190875879a37d7d47ee4278"} Dec 03 09:41:32 crc 
kubenswrapper[4947]: I1203 09:41:32.912826 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:32 crc kubenswrapper[4947]: I1203 09:41:32.972682 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjnmm\" (UniqueName: \"kubernetes.io/projected/47893688-11d8-425a-b7c0-2b8e0dcb3744-kube-api-access-mjnmm\") pod \"47893688-11d8-425a-b7c0-2b8e0dcb3744\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " Dec 03 09:41:32 crc kubenswrapper[4947]: I1203 09:41:32.972814 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-catalog-content\") pod \"47893688-11d8-425a-b7c0-2b8e0dcb3744\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " Dec 03 09:41:32 crc kubenswrapper[4947]: I1203 09:41:32.972838 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-utilities\") pod \"47893688-11d8-425a-b7c0-2b8e0dcb3744\" (UID: \"47893688-11d8-425a-b7c0-2b8e0dcb3744\") " Dec 03 09:41:32 crc kubenswrapper[4947]: I1203 09:41:32.974017 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-utilities" (OuterVolumeSpecName: "utilities") pod "47893688-11d8-425a-b7c0-2b8e0dcb3744" (UID: "47893688-11d8-425a-b7c0-2b8e0dcb3744"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:41:32 crc kubenswrapper[4947]: I1203 09:41:32.983730 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47893688-11d8-425a-b7c0-2b8e0dcb3744-kube-api-access-mjnmm" (OuterVolumeSpecName: "kube-api-access-mjnmm") pod "47893688-11d8-425a-b7c0-2b8e0dcb3744" (UID: "47893688-11d8-425a-b7c0-2b8e0dcb3744"). InnerVolumeSpecName "kube-api-access-mjnmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.075967 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjnmm\" (UniqueName: \"kubernetes.io/projected/47893688-11d8-425a-b7c0-2b8e0dcb3744-kube-api-access-mjnmm\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.076033 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.088778 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47893688-11d8-425a-b7c0-2b8e0dcb3744" (UID: "47893688-11d8-425a-b7c0-2b8e0dcb3744"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.178569 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47893688-11d8-425a-b7c0-2b8e0dcb3744-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.654185 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rnjrr" event={"ID":"47893688-11d8-425a-b7c0-2b8e0dcb3744","Type":"ContainerDied","Data":"7eb5a6b5ade27d868ee18230271c9e4e8fa7d0b686d217558e2197fdb6229722"} Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.654250 4947 scope.go:117] "RemoveContainer" containerID="9a9e3a309284178712a6a69916abdebf8ae27e4a4190875879a37d7d47ee4278" Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.654273 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rnjrr" Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.678712 4947 scope.go:117] "RemoveContainer" containerID="3032cbb0eebabd2407b8d51da91db15d5fe83f917843348049f63d5e69f6cdd1" Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.687202 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rnjrr"] Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.704122 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rnjrr"] Dec 03 09:41:33 crc kubenswrapper[4947]: I1203 09:41:33.711162 4947 scope.go:117] "RemoveContainer" containerID="4ae636f57139a5b59a4fcab77f08280c2866c3f4d3be99459403cf30e5fbb943" Dec 03 09:41:36 crc kubenswrapper[4947]: I1203 09:41:36.012400 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" path="/var/lib/kubelet/pods/47893688-11d8-425a-b7c0-2b8e0dcb3744/volumes" Dec 03 09:43:30 crc 
kubenswrapper[4947]: I1203 09:43:30.086668 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:43:30 crc kubenswrapper[4947]: I1203 09:43:30.087589 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:44:00 crc kubenswrapper[4947]: I1203 09:44:00.087304 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:44:00 crc kubenswrapper[4947]: I1203 09:44:00.088001 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:44:30 crc kubenswrapper[4947]: I1203 09:44:30.087030 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:44:30 crc kubenswrapper[4947]: I1203 09:44:30.087745 4947 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:44:30 crc kubenswrapper[4947]: I1203 09:44:30.087808 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 09:44:30 crc kubenswrapper[4947]: I1203 09:44:30.088401 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:44:30 crc kubenswrapper[4947]: I1203 09:44:30.088459 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" gracePeriod=600 Dec 03 09:44:30 crc kubenswrapper[4947]: E1203 09:44:30.224800 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:44:31 crc kubenswrapper[4947]: I1203 09:44:31.003920 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" 
containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" exitCode=0 Dec 03 09:44:31 crc kubenswrapper[4947]: I1203 09:44:31.003960 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7"} Dec 03 09:44:31 crc kubenswrapper[4947]: I1203 09:44:31.003991 4947 scope.go:117] "RemoveContainer" containerID="5ba261672a72367230c55655e4f606db9a224cb56edda83f0bceaf8270f61180" Dec 03 09:44:31 crc kubenswrapper[4947]: I1203 09:44:31.004655 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:44:31 crc kubenswrapper[4947]: E1203 09:44:31.004927 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:44:31 crc kubenswrapper[4947]: E1203 09:44:31.005777 4947 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/machine-config-daemon-qv8tj_openshift-machine-config-operator_machine-config-daemon-5ba261672a72367230c55655e4f606db9a224cb56edda83f0bceaf8270f61180.log: no such file or directory" path="/var/log/containers/machine-config-daemon-qv8tj_openshift-machine-config-operator_machine-config-daemon-5ba261672a72367230c55655e4f606db9a224cb56edda83f0bceaf8270f61180.log" Dec 03 09:44:43 crc kubenswrapper[4947]: I1203 09:44:43.983701 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pktkf"] Dec 03 09:44:43 crc 
kubenswrapper[4947]: E1203 09:44:43.984992 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerName="extract-utilities" Dec 03 09:44:43 crc kubenswrapper[4947]: I1203 09:44:43.985011 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerName="extract-utilities" Dec 03 09:44:43 crc kubenswrapper[4947]: E1203 09:44:43.985055 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerName="extract-content" Dec 03 09:44:43 crc kubenswrapper[4947]: I1203 09:44:43.985064 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerName="extract-content" Dec 03 09:44:43 crc kubenswrapper[4947]: E1203 09:44:43.985093 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerName="registry-server" Dec 03 09:44:43 crc kubenswrapper[4947]: I1203 09:44:43.985103 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerName="registry-server" Dec 03 09:44:43 crc kubenswrapper[4947]: I1203 09:44:43.985383 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="47893688-11d8-425a-b7c0-2b8e0dcb3744" containerName="registry-server" Dec 03 09:44:43 crc kubenswrapper[4947]: I1203 09:44:43.989305 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:43 crc kubenswrapper[4947]: I1203 09:44:43.999920 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pktkf"] Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.093282 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-utilities\") pod \"community-operators-pktkf\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.093367 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f5hq\" (UniqueName: \"kubernetes.io/projected/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-kube-api-access-9f5hq\") pod \"community-operators-pktkf\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.093643 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-catalog-content\") pod \"community-operators-pktkf\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.194996 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-catalog-content\") pod \"community-operators-pktkf\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.195179 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-utilities\") pod \"community-operators-pktkf\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.195198 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f5hq\" (UniqueName: \"kubernetes.io/projected/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-kube-api-access-9f5hq\") pod \"community-operators-pktkf\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.195464 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-catalog-content\") pod \"community-operators-pktkf\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.195839 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-utilities\") pod \"community-operators-pktkf\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.232968 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f5hq\" (UniqueName: \"kubernetes.io/projected/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-kube-api-access-9f5hq\") pod \"community-operators-pktkf\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.338111 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:44 crc kubenswrapper[4947]: I1203 09:44:44.962398 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pktkf"] Dec 03 09:44:44 crc kubenswrapper[4947]: W1203 09:44:44.974209 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice/crio-178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4 WatchSource:0}: Error finding container 178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4: Status 404 returned error can't find the container with id 178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4 Dec 03 09:44:45 crc kubenswrapper[4947]: I1203 09:44:45.083424 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:44:45 crc kubenswrapper[4947]: E1203 09:44:45.083821 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:44:45 crc kubenswrapper[4947]: I1203 09:44:45.148447 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pktkf" event={"ID":"7043c809-0a6c-4f0a-801a-0b46b55cfb6e","Type":"ContainerStarted","Data":"178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4"} Dec 03 09:44:46 crc kubenswrapper[4947]: I1203 09:44:46.162442 4947 generic.go:334] "Generic (PLEG): container finished" podID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" 
containerID="6ec727023ab23f4b714b65304291565d8de6f14ee5edb9cb52e032b6da30fb9e" exitCode=0 Dec 03 09:44:46 crc kubenswrapper[4947]: I1203 09:44:46.162559 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pktkf" event={"ID":"7043c809-0a6c-4f0a-801a-0b46b55cfb6e","Type":"ContainerDied","Data":"6ec727023ab23f4b714b65304291565d8de6f14ee5edb9cb52e032b6da30fb9e"} Dec 03 09:44:46 crc kubenswrapper[4947]: I1203 09:44:46.167067 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:44:47 crc kubenswrapper[4947]: I1203 09:44:47.174537 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pktkf" event={"ID":"7043c809-0a6c-4f0a-801a-0b46b55cfb6e","Type":"ContainerStarted","Data":"fc3a8a68d3fec24f6b0b174ad0c20dd4fd9c8cfd26a2f497097e70fc331b408c"} Dec 03 09:44:48 crc kubenswrapper[4947]: I1203 09:44:48.202109 4947 generic.go:334] "Generic (PLEG): container finished" podID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" containerID="fc3a8a68d3fec24f6b0b174ad0c20dd4fd9c8cfd26a2f497097e70fc331b408c" exitCode=0 Dec 03 09:44:48 crc kubenswrapper[4947]: I1203 09:44:48.202169 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pktkf" event={"ID":"7043c809-0a6c-4f0a-801a-0b46b55cfb6e","Type":"ContainerDied","Data":"fc3a8a68d3fec24f6b0b174ad0c20dd4fd9c8cfd26a2f497097e70fc331b408c"} Dec 03 09:44:49 crc kubenswrapper[4947]: I1203 09:44:49.214905 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pktkf" event={"ID":"7043c809-0a6c-4f0a-801a-0b46b55cfb6e","Type":"ContainerStarted","Data":"a85d66a743f381095b0ae31240f85b518bbc246fe4d4d997728edd6a305a763a"} Dec 03 09:44:49 crc kubenswrapper[4947]: I1203 09:44:49.239550 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pktkf" 
podStartSLOduration=3.712732051 podStartE2EDuration="6.239528423s" podCreationTimestamp="2025-12-03 09:44:43 +0000 UTC" firstStartedPulling="2025-12-03 09:44:46.166666485 +0000 UTC m=+10547.427620911" lastFinishedPulling="2025-12-03 09:44:48.693462857 +0000 UTC m=+10549.954417283" observedRunningTime="2025-12-03 09:44:49.233764167 +0000 UTC m=+10550.494718583" watchObservedRunningTime="2025-12-03 09:44:49.239528423 +0000 UTC m=+10550.500482859" Dec 03 09:44:54 crc kubenswrapper[4947]: I1203 09:44:54.338516 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:54 crc kubenswrapper[4947]: I1203 09:44:54.339137 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:54 crc kubenswrapper[4947]: I1203 09:44:54.402218 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:55 crc kubenswrapper[4947]: I1203 09:44:55.388951 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:55 crc kubenswrapper[4947]: I1203 09:44:55.447779 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pktkf"] Dec 03 09:44:57 crc kubenswrapper[4947]: I1203 09:44:57.083271 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:44:57 crc kubenswrapper[4947]: E1203 09:44:57.083927 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:44:57 crc kubenswrapper[4947]: I1203 09:44:57.351463 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pktkf" podUID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" containerName="registry-server" containerID="cri-o://a85d66a743f381095b0ae31240f85b518bbc246fe4d4d997728edd6a305a763a" gracePeriod=2 Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.365853 4947 generic.go:334] "Generic (PLEG): container finished" podID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" containerID="a85d66a743f381095b0ae31240f85b518bbc246fe4d4d997728edd6a305a763a" exitCode=0 Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.366135 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pktkf" event={"ID":"7043c809-0a6c-4f0a-801a-0b46b55cfb6e","Type":"ContainerDied","Data":"a85d66a743f381095b0ae31240f85b518bbc246fe4d4d997728edd6a305a763a"} Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.366163 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pktkf" event={"ID":"7043c809-0a6c-4f0a-801a-0b46b55cfb6e","Type":"ContainerDied","Data":"178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4"} Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.366176 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4" Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.370430 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.524029 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-catalog-content\") pod \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.524094 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f5hq\" (UniqueName: \"kubernetes.io/projected/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-kube-api-access-9f5hq\") pod \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.524290 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-utilities\") pod \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\" (UID: \"7043c809-0a6c-4f0a-801a-0b46b55cfb6e\") " Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.525868 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-utilities" (OuterVolumeSpecName: "utilities") pod "7043c809-0a6c-4f0a-801a-0b46b55cfb6e" (UID: "7043c809-0a6c-4f0a-801a-0b46b55cfb6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.531846 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-kube-api-access-9f5hq" (OuterVolumeSpecName: "kube-api-access-9f5hq") pod "7043c809-0a6c-4f0a-801a-0b46b55cfb6e" (UID: "7043c809-0a6c-4f0a-801a-0b46b55cfb6e"). InnerVolumeSpecName "kube-api-access-9f5hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.583905 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7043c809-0a6c-4f0a-801a-0b46b55cfb6e" (UID: "7043c809-0a6c-4f0a-801a-0b46b55cfb6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.629569 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.629604 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f5hq\" (UniqueName: \"kubernetes.io/projected/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-kube-api-access-9f5hq\") on node \"crc\" DevicePath \"\"" Dec 03 09:44:58 crc kubenswrapper[4947]: I1203 09:44:58.629614 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7043c809-0a6c-4f0a-801a-0b46b55cfb6e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:44:59 crc kubenswrapper[4947]: I1203 09:44:59.386842 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pktkf" Dec 03 09:44:59 crc kubenswrapper[4947]: I1203 09:44:59.425637 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pktkf"] Dec 03 09:44:59 crc kubenswrapper[4947]: I1203 09:44:59.440540 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pktkf"] Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.152163 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr"] Dec 03 09:45:00 crc kubenswrapper[4947]: E1203 09:45:00.152920 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" containerName="extract-content" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.152932 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" containerName="extract-content" Dec 03 09:45:00 crc kubenswrapper[4947]: E1203 09:45:00.152953 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" containerName="extract-utilities" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.152959 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" containerName="extract-utilities" Dec 03 09:45:00 crc kubenswrapper[4947]: E1203 09:45:00.152978 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" containerName="registry-server" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.152984 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" containerName="registry-server" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.153254 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" 
containerName="registry-server" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.154134 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.156384 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.156823 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.226007 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr"] Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.266447 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-secret-volume\") pod \"collect-profiles-29412585-ccgtr\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.267066 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8nzn\" (UniqueName: \"kubernetes.io/projected/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-kube-api-access-c8nzn\") pod \"collect-profiles-29412585-ccgtr\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.267255 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-config-volume\") pod \"collect-profiles-29412585-ccgtr\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.369108 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-secret-volume\") pod \"collect-profiles-29412585-ccgtr\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.369178 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8nzn\" (UniqueName: \"kubernetes.io/projected/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-kube-api-access-c8nzn\") pod \"collect-profiles-29412585-ccgtr\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.369232 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-config-volume\") pod \"collect-profiles-29412585-ccgtr\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.370463 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-config-volume\") pod \"collect-profiles-29412585-ccgtr\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.540697 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8nzn\" (UniqueName: \"kubernetes.io/projected/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-kube-api-access-c8nzn\") pod \"collect-profiles-29412585-ccgtr\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.547741 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-secret-volume\") pod \"collect-profiles-29412585-ccgtr\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:00 crc kubenswrapper[4947]: I1203 09:45:00.828360 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:01 crc kubenswrapper[4947]: I1203 09:45:01.098094 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7043c809-0a6c-4f0a-801a-0b46b55cfb6e" path="/var/lib/kubelet/pods/7043c809-0a6c-4f0a-801a-0b46b55cfb6e/volumes" Dec 03 09:45:01 crc kubenswrapper[4947]: I1203 09:45:01.399391 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr"] Dec 03 09:45:02 crc kubenswrapper[4947]: I1203 09:45:02.426759 4947 generic.go:334] "Generic (PLEG): container finished" podID="b742eacd-8c04-4b3f-8ee1-01ad85fb0c75" containerID="6fcbcae36981842ac9fea67afda16402acd385703574b61bbd21b4a3e2c21f39" exitCode=0 Dec 03 09:45:02 crc kubenswrapper[4947]: I1203 09:45:02.426819 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" 
event={"ID":"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75","Type":"ContainerDied","Data":"6fcbcae36981842ac9fea67afda16402acd385703574b61bbd21b4a3e2c21f39"} Dec 03 09:45:02 crc kubenswrapper[4947]: I1203 09:45:02.427079 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" event={"ID":"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75","Type":"ContainerStarted","Data":"b52afb6e26d04381f0d47bcc997038a79fa9d4a1c1286b517cc7f3b8b245d919"} Dec 03 09:45:03 crc kubenswrapper[4947]: I1203 09:45:03.837062 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:03 crc kubenswrapper[4947]: I1203 09:45:03.948110 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8nzn\" (UniqueName: \"kubernetes.io/projected/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-kube-api-access-c8nzn\") pod \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " Dec 03 09:45:03 crc kubenswrapper[4947]: I1203 09:45:03.948304 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-config-volume\") pod \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " Dec 03 09:45:03 crc kubenswrapper[4947]: I1203 09:45:03.948390 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-secret-volume\") pod \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\" (UID: \"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75\") " Dec 03 09:45:03 crc kubenswrapper[4947]: I1203 09:45:03.953149 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "b742eacd-8c04-4b3f-8ee1-01ad85fb0c75" (UID: "b742eacd-8c04-4b3f-8ee1-01ad85fb0c75"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:45:03 crc kubenswrapper[4947]: I1203 09:45:03.955188 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b742eacd-8c04-4b3f-8ee1-01ad85fb0c75" (UID: "b742eacd-8c04-4b3f-8ee1-01ad85fb0c75"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:03 crc kubenswrapper[4947]: I1203 09:45:03.955623 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-kube-api-access-c8nzn" (OuterVolumeSpecName: "kube-api-access-c8nzn") pod "b742eacd-8c04-4b3f-8ee1-01ad85fb0c75" (UID: "b742eacd-8c04-4b3f-8ee1-01ad85fb0c75"). InnerVolumeSpecName "kube-api-access-c8nzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:45:04 crc kubenswrapper[4947]: I1203 09:45:04.051670 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8nzn\" (UniqueName: \"kubernetes.io/projected/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-kube-api-access-c8nzn\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:04 crc kubenswrapper[4947]: I1203 09:45:04.051714 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:04 crc kubenswrapper[4947]: I1203 09:45:04.051729 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b742eacd-8c04-4b3f-8ee1-01ad85fb0c75-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:04 crc kubenswrapper[4947]: I1203 09:45:04.451078 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" event={"ID":"b742eacd-8c04-4b3f-8ee1-01ad85fb0c75","Type":"ContainerDied","Data":"b52afb6e26d04381f0d47bcc997038a79fa9d4a1c1286b517cc7f3b8b245d919"} Dec 03 09:45:04 crc kubenswrapper[4947]: I1203 09:45:04.451127 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b52afb6e26d04381f0d47bcc997038a79fa9d4a1c1286b517cc7f3b8b245d919" Dec 03 09:45:04 crc kubenswrapper[4947]: I1203 09:45:04.451238 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412585-ccgtr" Dec 03 09:45:04 crc kubenswrapper[4947]: E1203 09:45:04.790621 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice/crio-178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4\": RecentStats: unable to find data in memory cache]" Dec 03 09:45:04 crc kubenswrapper[4947]: I1203 09:45:04.918009 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5"] Dec 03 09:45:04 crc kubenswrapper[4947]: I1203 09:45:04.928957 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-7czh5"] Dec 03 09:45:05 crc kubenswrapper[4947]: I1203 09:45:05.096693 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d95529d8-a859-4730-a786-39c231ba2be3" path="/var/lib/kubelet/pods/d95529d8-a859-4730-a786-39c231ba2be3/volumes" Dec 03 09:45:12 crc kubenswrapper[4947]: I1203 09:45:12.082993 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:45:12 crc kubenswrapper[4947]: E1203 09:45:12.085104 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:45:15 crc 
kubenswrapper[4947]: E1203 09:45:15.068914 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice/crio-178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4\": RecentStats: unable to find data in memory cache]" Dec 03 09:45:23 crc kubenswrapper[4947]: I1203 09:45:23.082891 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:45:23 crc kubenswrapper[4947]: E1203 09:45:23.083880 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:45:25 crc kubenswrapper[4947]: E1203 09:45:25.416856 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice/crio-178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice\": RecentStats: unable to find data in memory cache]" Dec 03 09:45:31 crc kubenswrapper[4947]: I1203 09:45:31.012422 4947 scope.go:117] "RemoveContainer" containerID="c12306b6242e792f15a718fed2336e9aafc9ee168add5af83e3c7e397b85b8d6" Dec 03 09:45:32 crc 
kubenswrapper[4947]: I1203 09:45:32.733763 4947 generic.go:334] "Generic (PLEG): container finished" podID="42d0e62c-8bf7-4830-a450-d8cef6764abe" containerID="0c51c3df534216ccd7bdf5a0b938863b23df17263f7a2ee88bf16549e7444552" exitCode=0 Dec 03 09:45:32 crc kubenswrapper[4947]: I1203 09:45:32.734364 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell2-sbspq" event={"ID":"42d0e62c-8bf7-4830-a450-d8cef6764abe","Type":"ContainerDied","Data":"0c51c3df534216ccd7bdf5a0b938863b23df17263f7a2ee88bf16549e7444552"} Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.579234 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.618144 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-0\") pod \"42d0e62c-8bf7-4830-a450-d8cef6764abe\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.618241 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-telemetry-combined-ca-bundle\") pod \"42d0e62c-8bf7-4830-a450-d8cef6764abe\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.618289 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-1\") pod \"42d0e62c-8bf7-4830-a450-d8cef6764abe\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.618328 4947 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwcrd\" (UniqueName: \"kubernetes.io/projected/42d0e62c-8bf7-4830-a450-d8cef6764abe-kube-api-access-cwcrd\") pod \"42d0e62c-8bf7-4830-a450-d8cef6764abe\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.618583 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-inventory\") pod \"42d0e62c-8bf7-4830-a450-d8cef6764abe\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.618622 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-2\") pod \"42d0e62c-8bf7-4830-a450-d8cef6764abe\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.618758 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ssh-key\") pod \"42d0e62c-8bf7-4830-a450-d8cef6764abe\" (UID: \"42d0e62c-8bf7-4830-a450-d8cef6764abe\") " Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.626551 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "42d0e62c-8bf7-4830-a450-d8cef6764abe" (UID: "42d0e62c-8bf7-4830-a450-d8cef6764abe"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.630811 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42d0e62c-8bf7-4830-a450-d8cef6764abe-kube-api-access-cwcrd" (OuterVolumeSpecName: "kube-api-access-cwcrd") pod "42d0e62c-8bf7-4830-a450-d8cef6764abe" (UID: "42d0e62c-8bf7-4830-a450-d8cef6764abe"). InnerVolumeSpecName "kube-api-access-cwcrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.663720 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "42d0e62c-8bf7-4830-a450-d8cef6764abe" (UID: "42d0e62c-8bf7-4830-a450-d8cef6764abe"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.678307 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "42d0e62c-8bf7-4830-a450-d8cef6764abe" (UID: "42d0e62c-8bf7-4830-a450-d8cef6764abe"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.683369 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "42d0e62c-8bf7-4830-a450-d8cef6764abe" (UID: "42d0e62c-8bf7-4830-a450-d8cef6764abe"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.689056 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-inventory" (OuterVolumeSpecName: "inventory") pod "42d0e62c-8bf7-4830-a450-d8cef6764abe" (UID: "42d0e62c-8bf7-4830-a450-d8cef6764abe"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.699212 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "42d0e62c-8bf7-4830-a450-d8cef6764abe" (UID: "42d0e62c-8bf7-4830-a450-d8cef6764abe"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.721677 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.721722 4947 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.721734 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.721749 4947 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.721760 4947 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.721772 4947 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/42d0e62c-8bf7-4830-a450-d8cef6764abe-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.721784 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwcrd\" (UniqueName: \"kubernetes.io/projected/42d0e62c-8bf7-4830-a450-d8cef6764abe-kube-api-access-cwcrd\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.759018 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell2-sbspq" event={"ID":"42d0e62c-8bf7-4830-a450-d8cef6764abe","Type":"ContainerDied","Data":"d29b30049394774d1366af4c325210ab63437a21ab6f637594187951fd72c664"} Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.759059 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d29b30049394774d1366af4c325210ab63437a21ab6f637594187951fd72c664" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.759092 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell2-sbspq" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.898249 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell2-2g4j6"] Dec 03 09:45:34 crc kubenswrapper[4947]: E1203 09:45:34.899074 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42d0e62c-8bf7-4830-a450-d8cef6764abe" containerName="telemetry-openstack-openstack-cell2" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.899095 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="42d0e62c-8bf7-4830-a450-d8cef6764abe" containerName="telemetry-openstack-openstack-cell2" Dec 03 09:45:34 crc kubenswrapper[4947]: E1203 09:45:34.899126 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b742eacd-8c04-4b3f-8ee1-01ad85fb0c75" containerName="collect-profiles" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.899135 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b742eacd-8c04-4b3f-8ee1-01ad85fb0c75" containerName="collect-profiles" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.899420 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="42d0e62c-8bf7-4830-a450-d8cef6764abe" containerName="telemetry-openstack-openstack-cell2" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.899447 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b742eacd-8c04-4b3f-8ee1-01ad85fb0c75" containerName="collect-profiles" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.900370 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.902373 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.909869 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.910415 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.918844 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell2-2g4j6"] Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.930201 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.930275 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkf5k\" (UniqueName: \"kubernetes.io/projected/35f362a4-fcac-4d8b-a76b-558353ead5e5-kube-api-access-dkf5k\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.930358 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-ssh-key\") pod 
\"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.930422 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:34 crc kubenswrapper[4947]: I1203 09:45:34.930452 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-inventory\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.032445 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.032613 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 
09:45:35.032649 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-inventory\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.032799 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.032830 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkf5k\" (UniqueName: \"kubernetes.io/projected/35f362a4-fcac-4d8b-a76b-558353ead5e5-kube-api-access-dkf5k\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.037862 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-inventory\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.040253 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: 
\"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.043835 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.046143 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.048770 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkf5k\" (UniqueName: \"kubernetes.io/projected/35f362a4-fcac-4d8b-a76b-558353ead5e5-kube-api-access-dkf5k\") pod \"neutron-sriov-openstack-openstack-cell2-2g4j6\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.266149 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:45:35 crc kubenswrapper[4947]: E1203 09:45:35.750974 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice/crio-178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice\": RecentStats: unable to find data in memory cache]" Dec 03 09:45:35 crc kubenswrapper[4947]: I1203 09:45:35.847960 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell2-2g4j6"] Dec 03 09:45:36 crc kubenswrapper[4947]: I1203 09:45:36.780591 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" event={"ID":"35f362a4-fcac-4d8b-a76b-558353ead5e5","Type":"ContainerStarted","Data":"4fcd17aa85dcf5667367a0cd879929d563622cf655e6ac2072d5bbf67d52fd92"} Dec 03 09:45:37 crc kubenswrapper[4947]: I1203 09:45:37.083964 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:45:37 crc kubenswrapper[4947]: E1203 09:45:37.084878 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:45:37 crc kubenswrapper[4947]: I1203 09:45:37.793118 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" event={"ID":"35f362a4-fcac-4d8b-a76b-558353ead5e5","Type":"ContainerStarted","Data":"796f16a9420578045d9036a4e8d3e0cbcbd0cc68f72031b5d403b0c2d9e84f6c"} Dec 03 09:45:37 crc kubenswrapper[4947]: I1203 09:45:37.814003 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" podStartSLOduration=3.289892747 podStartE2EDuration="3.813979489s" podCreationTimestamp="2025-12-03 09:45:34 +0000 UTC" firstStartedPulling="2025-12-03 09:45:36.548715142 +0000 UTC m=+10597.809669558" lastFinishedPulling="2025-12-03 09:45:37.072801864 +0000 UTC m=+10598.333756300" observedRunningTime="2025-12-03 09:45:37.812071087 +0000 UTC m=+10599.073025513" watchObservedRunningTime="2025-12-03 09:45:37.813979489 +0000 UTC m=+10599.074933935" Dec 03 09:45:43 crc kubenswrapper[4947]: I1203 09:45:43.868862 4947 generic.go:334] "Generic (PLEG): container finished" podID="50db48fc-571d-4587-a07e-64ed238f82a9" containerID="4e6a75d8e528772c0ed38e4a3446a99e51472b3dcb70f6131b217bf491da8169" exitCode=0 Dec 03 09:45:43 crc kubenswrapper[4947]: I1203 09:45:43.868952 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-49rkq" event={"ID":"50db48fc-571d-4587-a07e-64ed238f82a9","Type":"ContainerDied","Data":"4e6a75d8e528772c0ed38e4a3446a99e51472b3dcb70f6131b217bf491da8169"} Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.436141 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.562266 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-1\") pod \"50db48fc-571d-4587-a07e-64ed238f82a9\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.562338 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfqzd\" (UniqueName: \"kubernetes.io/projected/50db48fc-571d-4587-a07e-64ed238f82a9-kube-api-access-bfqzd\") pod \"50db48fc-571d-4587-a07e-64ed238f82a9\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.562358 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-0\") pod \"50db48fc-571d-4587-a07e-64ed238f82a9\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.562418 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-inventory\") pod \"50db48fc-571d-4587-a07e-64ed238f82a9\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.562469 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ssh-key\") pod \"50db48fc-571d-4587-a07e-64ed238f82a9\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.562521 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-telemetry-combined-ca-bundle\") pod \"50db48fc-571d-4587-a07e-64ed238f82a9\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.562585 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-2\") pod \"50db48fc-571d-4587-a07e-64ed238f82a9\" (UID: \"50db48fc-571d-4587-a07e-64ed238f82a9\") " Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.568614 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50db48fc-571d-4587-a07e-64ed238f82a9-kube-api-access-bfqzd" (OuterVolumeSpecName: "kube-api-access-bfqzd") pod "50db48fc-571d-4587-a07e-64ed238f82a9" (UID: "50db48fc-571d-4587-a07e-64ed238f82a9"). InnerVolumeSpecName "kube-api-access-bfqzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.573149 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "50db48fc-571d-4587-a07e-64ed238f82a9" (UID: "50db48fc-571d-4587-a07e-64ed238f82a9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.591965 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50db48fc-571d-4587-a07e-64ed238f82a9" (UID: "50db48fc-571d-4587-a07e-64ed238f82a9"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.591979 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-inventory" (OuterVolumeSpecName: "inventory") pod "50db48fc-571d-4587-a07e-64ed238f82a9" (UID: "50db48fc-571d-4587-a07e-64ed238f82a9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.595777 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "50db48fc-571d-4587-a07e-64ed238f82a9" (UID: "50db48fc-571d-4587-a07e-64ed238f82a9"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.596062 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "50db48fc-571d-4587-a07e-64ed238f82a9" (UID: "50db48fc-571d-4587-a07e-64ed238f82a9"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.612186 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "50db48fc-571d-4587-a07e-64ed238f82a9" (UID: "50db48fc-571d-4587-a07e-64ed238f82a9"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.665289 4947 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.665336 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfqzd\" (UniqueName: \"kubernetes.io/projected/50db48fc-571d-4587-a07e-64ed238f82a9-kube-api-access-bfqzd\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.665350 4947 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.665366 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.665379 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.665389 4947 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.665401 4947 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/50db48fc-571d-4587-a07e-64ed238f82a9-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.900676 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-49rkq" event={"ID":"50db48fc-571d-4587-a07e-64ed238f82a9","Type":"ContainerDied","Data":"dcc433a2d3b734394bca37edf893739d0166c3eb3e1978961e471b22f77a7b8a"} Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.901052 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-49rkq" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.901066 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcc433a2d3b734394bca37edf893739d0166c3eb3e1978961e471b22f77a7b8a" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.989638 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-dzdfr"] Dec 03 09:45:45 crc kubenswrapper[4947]: E1203 09:45:45.990332 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50db48fc-571d-4587-a07e-64ed238f82a9" containerName="telemetry-openstack-openstack-cell1" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.990351 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="50db48fc-571d-4587-a07e-64ed238f82a9" containerName="telemetry-openstack-openstack-cell1" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.990607 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="50db48fc-571d-4587-a07e-64ed238f82a9" containerName="telemetry-openstack-openstack-cell1" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.992076 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.995820 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:45:45 crc kubenswrapper[4947]: I1203 09:45:45.996139 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.017386 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-dzdfr"] Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.073130 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.073235 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.073294 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.073333 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5qb\" (UniqueName: \"kubernetes.io/projected/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-kube-api-access-fq5qb\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.073411 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: E1203 09:45:46.091254 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice/crio-178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50db48fc_571d_4587_a07e_64ed238f82a9.slice/crio-dcc433a2d3b734394bca37edf893739d0166c3eb3e1978961e471b22f77a7b8a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice\": RecentStats: unable to find data in memory cache]" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.175713 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.175893 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5qb\" (UniqueName: \"kubernetes.io/projected/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-kube-api-access-fq5qb\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.176066 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.176172 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.176231 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc 
kubenswrapper[4947]: I1203 09:45:46.181606 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.181663 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.182682 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.189577 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.195848 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5qb\" (UniqueName: \"kubernetes.io/projected/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-kube-api-access-fq5qb\") pod 
\"neutron-sriov-openstack-openstack-cell1-dzdfr\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.319735 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:45:46 crc kubenswrapper[4947]: I1203 09:45:46.919377 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-dzdfr"] Dec 03 09:45:47 crc kubenswrapper[4947]: I1203 09:45:47.931202 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" event={"ID":"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb","Type":"ContainerStarted","Data":"cf311a886c3dc8b747849be57e4dcba4bc9808677b2049c9ff4e2ae1e76f9520"} Dec 03 09:45:47 crc kubenswrapper[4947]: I1203 09:45:47.931813 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" event={"ID":"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb","Type":"ContainerStarted","Data":"0e290eadc8f3cb128ba758746567267ee059d99d8637167d3b6d38058f8eba77"} Dec 03 09:45:47 crc kubenswrapper[4947]: I1203 09:45:47.962703 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" podStartSLOduration=2.42164936 podStartE2EDuration="2.96267546s" podCreationTimestamp="2025-12-03 09:45:45 +0000 UTC" firstStartedPulling="2025-12-03 09:45:46.933652363 +0000 UTC m=+10608.194606789" lastFinishedPulling="2025-12-03 09:45:47.474678463 +0000 UTC m=+10608.735632889" observedRunningTime="2025-12-03 09:45:47.949564147 +0000 UTC m=+10609.210518593" watchObservedRunningTime="2025-12-03 09:45:47.96267546 +0000 UTC m=+10609.223629886" Dec 03 09:45:49 crc kubenswrapper[4947]: I1203 09:45:49.091972 4947 scope.go:117] "RemoveContainer" 
containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:45:49 crc kubenswrapper[4947]: E1203 09:45:49.092461 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:45:56 crc kubenswrapper[4947]: E1203 09:45:56.405347 4947 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7043c809_0a6c_4f0a_801a_0b46b55cfb6e.slice/crio-178d7fe07377e9db12028e1d5c78603520c67a736bf72ab100ead048dfee7ba4\": RecentStats: unable to find data in memory cache]" Dec 03 09:46:01 crc kubenswrapper[4947]: I1203 09:46:01.084169 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:46:01 crc kubenswrapper[4947]: E1203 09:46:01.085485 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:46:13 crc kubenswrapper[4947]: I1203 09:46:13.083395 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 
09:46:13 crc kubenswrapper[4947]: E1203 09:46:13.084349 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:46:25 crc kubenswrapper[4947]: I1203 09:46:25.083042 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:46:25 crc kubenswrapper[4947]: E1203 09:46:25.084057 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:46:37 crc kubenswrapper[4947]: I1203 09:46:37.082907 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:46:37 crc kubenswrapper[4947]: E1203 09:46:37.083805 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:46:49 crc kubenswrapper[4947]: I1203 09:46:49.090318 4947 scope.go:117] "RemoveContainer" 
containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:46:49 crc kubenswrapper[4947]: E1203 09:46:49.091105 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:47:02 crc kubenswrapper[4947]: I1203 09:47:02.082735 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:47:02 crc kubenswrapper[4947]: E1203 09:47:02.083597 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:47:13 crc kubenswrapper[4947]: I1203 09:47:13.086472 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:47:13 crc kubenswrapper[4947]: E1203 09:47:13.087494 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:47:15 crc kubenswrapper[4947]: I1203 09:47:15.941649 4947 generic.go:334] 
"Generic (PLEG): container finished" podID="35f362a4-fcac-4d8b-a76b-558353ead5e5" containerID="796f16a9420578045d9036a4e8d3e0cbcbd0cc68f72031b5d403b0c2d9e84f6c" exitCode=0 Dec 03 09:47:15 crc kubenswrapper[4947]: I1203 09:47:15.941750 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" event={"ID":"35f362a4-fcac-4d8b-a76b-558353ead5e5","Type":"ContainerDied","Data":"796f16a9420578045d9036a4e8d3e0cbcbd0cc68f72031b5d403b0c2d9e84f6c"} Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.435466 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.527902 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-combined-ca-bundle\") pod \"35f362a4-fcac-4d8b-a76b-558353ead5e5\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.528161 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkf5k\" (UniqueName: \"kubernetes.io/projected/35f362a4-fcac-4d8b-a76b-558353ead5e5-kube-api-access-dkf5k\") pod \"35f362a4-fcac-4d8b-a76b-558353ead5e5\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.528268 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-ssh-key\") pod \"35f362a4-fcac-4d8b-a76b-558353ead5e5\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.528295 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-inventory\") pod \"35f362a4-fcac-4d8b-a76b-558353ead5e5\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.528457 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-agent-neutron-config-0\") pod \"35f362a4-fcac-4d8b-a76b-558353ead5e5\" (UID: \"35f362a4-fcac-4d8b-a76b-558353ead5e5\") " Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.534427 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "35f362a4-fcac-4d8b-a76b-558353ead5e5" (UID: "35f362a4-fcac-4d8b-a76b-558353ead5e5"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.537373 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f362a4-fcac-4d8b-a76b-558353ead5e5-kube-api-access-dkf5k" (OuterVolumeSpecName: "kube-api-access-dkf5k") pod "35f362a4-fcac-4d8b-a76b-558353ead5e5" (UID: "35f362a4-fcac-4d8b-a76b-558353ead5e5"). InnerVolumeSpecName "kube-api-access-dkf5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.559782 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "35f362a4-fcac-4d8b-a76b-558353ead5e5" (UID: "35f362a4-fcac-4d8b-a76b-558353ead5e5"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.565436 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "35f362a4-fcac-4d8b-a76b-558353ead5e5" (UID: "35f362a4-fcac-4d8b-a76b-558353ead5e5"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.567456 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-inventory" (OuterVolumeSpecName: "inventory") pod "35f362a4-fcac-4d8b-a76b-558353ead5e5" (UID: "35f362a4-fcac-4d8b-a76b-558353ead5e5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.631216 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkf5k\" (UniqueName: \"kubernetes.io/projected/35f362a4-fcac-4d8b-a76b-558353ead5e5-kube-api-access-dkf5k\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.631260 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.631277 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.631298 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.631399 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35f362a4-fcac-4d8b-a76b-558353ead5e5-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.976803 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" event={"ID":"35f362a4-fcac-4d8b-a76b-558353ead5e5","Type":"ContainerDied","Data":"4fcd17aa85dcf5667367a0cd879929d563622cf655e6ac2072d5bbf67d52fd92"} Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.976846 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fcd17aa85dcf5667367a0cd879929d563622cf655e6ac2072d5bbf67d52fd92" Dec 03 09:47:17 crc kubenswrapper[4947]: I1203 09:47:17.976846 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell2-2g4j6" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.096129 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg"] Dec 03 09:47:18 crc kubenswrapper[4947]: E1203 09:47:18.096707 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f362a4-fcac-4d8b-a76b-558353ead5e5" containerName="neutron-sriov-openstack-openstack-cell2" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.096730 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f362a4-fcac-4d8b-a76b-558353ead5e5" containerName="neutron-sriov-openstack-openstack-cell2" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.097049 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f362a4-fcac-4d8b-a76b-558353ead5e5" containerName="neutron-sriov-openstack-openstack-cell2" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.098015 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.100102 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.101184 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.101470 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.110180 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg"] Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.243953 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.244021 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.244286 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-inventory\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.244415 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.244581 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kfkv\" (UniqueName: \"kubernetes.io/projected/93ccd72a-194e-4207-bdfc-ed594d06f81c-kube-api-access-7kfkv\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.346035 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kfkv\" (UniqueName: \"kubernetes.io/projected/93ccd72a-194e-4207-bdfc-ed594d06f81c-kube-api-access-7kfkv\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.346346 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.346494 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.346895 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-inventory\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.347029 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.350716 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.351810 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.353066 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-inventory\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.354566 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.375692 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kfkv\" (UniqueName: \"kubernetes.io/projected/93ccd72a-194e-4207-bdfc-ed594d06f81c-kube-api-access-7kfkv\") pod \"neutron-dhcp-openstack-openstack-cell2-bhnzg\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:18 crc kubenswrapper[4947]: I1203 09:47:18.420180 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:47:19 crc kubenswrapper[4947]: I1203 09:47:19.005990 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg"] Dec 03 09:47:20 crc kubenswrapper[4947]: I1203 09:47:20.000418 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" event={"ID":"93ccd72a-194e-4207-bdfc-ed594d06f81c","Type":"ContainerStarted","Data":"dee2b8e0049e39eacab16174d79077b63c3234496003d017eea4b1df59d94b7d"} Dec 03 09:47:20 crc kubenswrapper[4947]: I1203 09:47:20.000815 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" event={"ID":"93ccd72a-194e-4207-bdfc-ed594d06f81c","Type":"ContainerStarted","Data":"e616ee4e96cfbb9a61e117964bdb51be81378030eb09edad29c0c7c82e6bc666"} Dec 03 09:47:20 crc kubenswrapper[4947]: I1203 09:47:20.025460 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" podStartSLOduration=1.5863694069999998 podStartE2EDuration="2.025444533s" podCreationTimestamp="2025-12-03 09:47:18 +0000 UTC" firstStartedPulling="2025-12-03 09:47:19.009334935 +0000 UTC m=+10700.270289361" lastFinishedPulling="2025-12-03 09:47:19.448410021 +0000 UTC m=+10700.709364487" observedRunningTime="2025-12-03 09:47:20.018658021 +0000 UTC m=+10701.279612457" watchObservedRunningTime="2025-12-03 09:47:20.025444533 +0000 UTC m=+10701.286398959" Dec 03 09:47:27 crc kubenswrapper[4947]: I1203 09:47:27.083233 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:47:27 crc kubenswrapper[4947]: E1203 09:47:27.084014 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:47:42 crc kubenswrapper[4947]: I1203 09:47:42.084320 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:47:42 crc kubenswrapper[4947]: E1203 09:47:42.085151 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:47:53 crc kubenswrapper[4947]: I1203 09:47:53.371742 4947 generic.go:334] "Generic (PLEG): container finished" podID="1b9f4a70-dc36-4a8f-a879-3d74365bd0fb" containerID="cf311a886c3dc8b747849be57e4dcba4bc9808677b2049c9ff4e2ae1e76f9520" exitCode=0 Dec 03 09:47:53 crc kubenswrapper[4947]: I1203 09:47:53.371815 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" event={"ID":"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb","Type":"ContainerDied","Data":"cf311a886c3dc8b747849be57e4dcba4bc9808677b2049c9ff4e2ae1e76f9520"} Dec 03 09:47:54 crc kubenswrapper[4947]: I1203 09:47:54.854532 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.056581 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq5qb\" (UniqueName: \"kubernetes.io/projected/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-kube-api-access-fq5qb\") pod \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.058187 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-inventory\") pod \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.058992 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-combined-ca-bundle\") pod \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.059103 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-ssh-key\") pod \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.059205 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-agent-neutron-config-0\") pod \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\" (UID: \"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb\") " Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.063104 4947 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb" (UID: "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.064163 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-kube-api-access-fq5qb" (OuterVolumeSpecName: "kube-api-access-fq5qb") pod "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb" (UID: "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb"). InnerVolumeSpecName "kube-api-access-fq5qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.087322 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-inventory" (OuterVolumeSpecName: "inventory") pod "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb" (UID: "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.090575 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb" (UID: "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.095162 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb" (UID: "1b9f4a70-dc36-4a8f-a879-3d74365bd0fb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.189872 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.189925 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.189942 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.189967 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq5qb\" (UniqueName: \"kubernetes.io/projected/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-kube-api-access-fq5qb\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.189983 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b9f4a70-dc36-4a8f-a879-3d74365bd0fb-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.395096 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" event={"ID":"1b9f4a70-dc36-4a8f-a879-3d74365bd0fb","Type":"ContainerDied","Data":"0e290eadc8f3cb128ba758746567267ee059d99d8637167d3b6d38058f8eba77"} Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.395146 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e290eadc8f3cb128ba758746567267ee059d99d8637167d3b6d38058f8eba77" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.395168 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-dzdfr" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.585511 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9"] Dec 03 09:47:55 crc kubenswrapper[4947]: E1203 09:47:55.586003 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b9f4a70-dc36-4a8f-a879-3d74365bd0fb" containerName="neutron-sriov-openstack-openstack-cell1" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.586019 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9f4a70-dc36-4a8f-a879-3d74365bd0fb" containerName="neutron-sriov-openstack-openstack-cell1" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.586218 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b9f4a70-dc36-4a8f-a879-3d74365bd0fb" containerName="neutron-sriov-openstack-openstack-cell1" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.586994 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.589878 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.590749 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.598884 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9"] Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.699985 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w99p2\" (UniqueName: \"kubernetes.io/projected/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-kube-api-access-w99p2\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.700168 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.700220 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc 
kubenswrapper[4947]: I1203 09:47:55.700374 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.700441 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.802632 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.802951 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w99p2\" (UniqueName: \"kubernetes.io/projected/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-kube-api-access-w99p2\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.803056 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.803089 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.803179 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.809059 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.818208 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc 
kubenswrapper[4947]: I1203 09:47:55.818367 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.819423 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.822410 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w99p2\" (UniqueName: \"kubernetes.io/projected/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-kube-api-access-w99p2\") pod \"neutron-dhcp-openstack-openstack-cell1-r5zr9\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:55 crc kubenswrapper[4947]: I1203 09:47:55.906004 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:47:56 crc kubenswrapper[4947]: I1203 09:47:56.085613 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:47:56 crc kubenswrapper[4947]: E1203 09:47:56.086021 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:47:56 crc kubenswrapper[4947]: I1203 09:47:56.481272 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9"] Dec 03 09:47:57 crc kubenswrapper[4947]: I1203 09:47:57.418720 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" event={"ID":"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f","Type":"ContainerStarted","Data":"12dd51f117d3983ddff2083dd4f4b96060c3f37c0f37c66a4def6a5b946ac237"} Dec 03 09:47:57 crc kubenswrapper[4947]: I1203 09:47:57.419041 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" event={"ID":"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f","Type":"ContainerStarted","Data":"8ed40441a392df7059ad9750f5533f62644ddc38360a614a214d50770b92f27c"} Dec 03 09:47:57 crc kubenswrapper[4947]: I1203 09:47:57.444161 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" podStartSLOduration=1.9917627759999998 podStartE2EDuration="2.444141712s" podCreationTimestamp="2025-12-03 09:47:55 +0000 UTC" firstStartedPulling="2025-12-03 09:47:56.48120727 +0000 UTC 
m=+10737.742161696" lastFinishedPulling="2025-12-03 09:47:56.933586196 +0000 UTC m=+10738.194540632" observedRunningTime="2025-12-03 09:47:57.434747139 +0000 UTC m=+10738.695701565" watchObservedRunningTime="2025-12-03 09:47:57.444141712 +0000 UTC m=+10738.705096138" Dec 03 09:48:08 crc kubenswrapper[4947]: I1203 09:48:08.083078 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:48:08 crc kubenswrapper[4947]: E1203 09:48:08.083908 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:48:20 crc kubenswrapper[4947]: I1203 09:48:20.083664 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:48:20 crc kubenswrapper[4947]: E1203 09:48:20.084561 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.106017 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-55sjl"] Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.112447 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.119888 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-55sjl"] Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.200455 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghktq\" (UniqueName: \"kubernetes.io/projected/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-kube-api-access-ghktq\") pod \"redhat-marketplace-55sjl\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.200655 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-utilities\") pod \"redhat-marketplace-55sjl\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.200737 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-catalog-content\") pod \"redhat-marketplace-55sjl\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.302384 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-catalog-content\") pod \"redhat-marketplace-55sjl\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.302611 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ghktq\" (UniqueName: \"kubernetes.io/projected/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-kube-api-access-ghktq\") pod \"redhat-marketplace-55sjl\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.302705 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-utilities\") pod \"redhat-marketplace-55sjl\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.302928 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-catalog-content\") pod \"redhat-marketplace-55sjl\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.304362 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-utilities\") pod \"redhat-marketplace-55sjl\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.321147 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghktq\" (UniqueName: \"kubernetes.io/projected/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-kube-api-access-ghktq\") pod \"redhat-marketplace-55sjl\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.431372 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:27 crc kubenswrapper[4947]: I1203 09:48:27.917410 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-55sjl"] Dec 03 09:48:28 crc kubenswrapper[4947]: I1203 09:48:28.816803 4947 generic.go:334] "Generic (PLEG): container finished" podID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerID="1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9" exitCode=0 Dec 03 09:48:28 crc kubenswrapper[4947]: I1203 09:48:28.816915 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55sjl" event={"ID":"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5","Type":"ContainerDied","Data":"1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9"} Dec 03 09:48:28 crc kubenswrapper[4947]: I1203 09:48:28.817140 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55sjl" event={"ID":"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5","Type":"ContainerStarted","Data":"ff53cec1d45a4cad04776bb4e7c6efc0ae0ad363ecc1b00836a4841151b6d587"} Dec 03 09:48:29 crc kubenswrapper[4947]: I1203 09:48:29.835854 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55sjl" event={"ID":"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5","Type":"ContainerStarted","Data":"ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c"} Dec 03 09:48:30 crc kubenswrapper[4947]: I1203 09:48:30.848067 4947 generic.go:334] "Generic (PLEG): container finished" podID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerID="ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c" exitCode=0 Dec 03 09:48:30 crc kubenswrapper[4947]: I1203 09:48:30.848218 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55sjl" 
event={"ID":"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5","Type":"ContainerDied","Data":"ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c"} Dec 03 09:48:31 crc kubenswrapper[4947]: I1203 09:48:31.867760 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55sjl" event={"ID":"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5","Type":"ContainerStarted","Data":"e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24"} Dec 03 09:48:31 crc kubenswrapper[4947]: I1203 09:48:31.893832 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-55sjl" podStartSLOduration=2.292719017 podStartE2EDuration="4.893811046s" podCreationTimestamp="2025-12-03 09:48:27 +0000 UTC" firstStartedPulling="2025-12-03 09:48:28.821644856 +0000 UTC m=+10770.082599292" lastFinishedPulling="2025-12-03 09:48:31.422736895 +0000 UTC m=+10772.683691321" observedRunningTime="2025-12-03 09:48:31.885779829 +0000 UTC m=+10773.146734265" watchObservedRunningTime="2025-12-03 09:48:31.893811046 +0000 UTC m=+10773.154765472" Dec 03 09:48:32 crc kubenswrapper[4947]: I1203 09:48:32.083421 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:48:32 crc kubenswrapper[4947]: E1203 09:48:32.083738 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:48:37 crc kubenswrapper[4947]: I1203 09:48:37.431579 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:37 crc 
kubenswrapper[4947]: I1203 09:48:37.432205 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:37 crc kubenswrapper[4947]: I1203 09:48:37.493587 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:37 crc kubenswrapper[4947]: I1203 09:48:37.973745 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:38 crc kubenswrapper[4947]: I1203 09:48:38.019536 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-55sjl"] Dec 03 09:48:39 crc kubenswrapper[4947]: I1203 09:48:39.945919 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-55sjl" podUID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerName="registry-server" containerID="cri-o://e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24" gracePeriod=2 Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.454756 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.592856 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-utilities\") pod \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.593381 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghktq\" (UniqueName: \"kubernetes.io/projected/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-kube-api-access-ghktq\") pod \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.593465 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-catalog-content\") pod \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\" (UID: \"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5\") " Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.593988 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-utilities" (OuterVolumeSpecName: "utilities") pod "209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" (UID: "209d42bc-c6ea-43d6-8b96-ed6fd1681cb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.601176 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-kube-api-access-ghktq" (OuterVolumeSpecName: "kube-api-access-ghktq") pod "209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" (UID: "209d42bc-c6ea-43d6-8b96-ed6fd1681cb5"). InnerVolumeSpecName "kube-api-access-ghktq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.613276 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" (UID: "209d42bc-c6ea-43d6-8b96-ed6fd1681cb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.695724 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghktq\" (UniqueName: \"kubernetes.io/projected/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-kube-api-access-ghktq\") on node \"crc\" DevicePath \"\"" Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.695764 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.695781 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.957906 4947 generic.go:334] "Generic (PLEG): container finished" podID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerID="e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24" exitCode=0 Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.957954 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55sjl" event={"ID":"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5","Type":"ContainerDied","Data":"e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24"} Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.957983 4947 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-55sjl" event={"ID":"209d42bc-c6ea-43d6-8b96-ed6fd1681cb5","Type":"ContainerDied","Data":"ff53cec1d45a4cad04776bb4e7c6efc0ae0ad363ecc1b00836a4841151b6d587"} Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.958004 4947 scope.go:117] "RemoveContainer" containerID="e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24" Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.958031 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-55sjl" Dec 03 09:48:40 crc kubenswrapper[4947]: I1203 09:48:40.979900 4947 scope.go:117] "RemoveContainer" containerID="ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c" Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.002274 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-55sjl"] Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.016110 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-55sjl"] Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.022753 4947 scope.go:117] "RemoveContainer" containerID="1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9" Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.086449 4947 scope.go:117] "RemoveContainer" containerID="e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24" Dec 03 09:48:41 crc kubenswrapper[4947]: E1203 09:48:41.087332 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24\": container with ID starting with e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24 not found: ID does not exist" containerID="e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24" Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.087365 4947 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24"} err="failed to get container status \"e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24\": rpc error: code = NotFound desc = could not find container \"e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24\": container with ID starting with e487d6c08440a94bf661df6139676b547082ef06b359a144f28b71b7b656fa24 not found: ID does not exist" Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.087386 4947 scope.go:117] "RemoveContainer" containerID="ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c" Dec 03 09:48:41 crc kubenswrapper[4947]: E1203 09:48:41.087728 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c\": container with ID starting with ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c not found: ID does not exist" containerID="ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c" Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.087750 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c"} err="failed to get container status \"ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c\": rpc error: code = NotFound desc = could not find container \"ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c\": container with ID starting with ee467285fe4c5ef72966836a7aae37127a63b9453fd1d05fb4986dbccde3742c not found: ID does not exist" Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.087762 4947 scope.go:117] "RemoveContainer" containerID="1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9" Dec 03 09:48:41 crc kubenswrapper[4947]: E1203 
09:48:41.088016 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9\": container with ID starting with 1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9 not found: ID does not exist" containerID="1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9" Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.088047 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9"} err="failed to get container status \"1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9\": rpc error: code = NotFound desc = could not find container \"1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9\": container with ID starting with 1d52971162dbfaf7c9fe450d7db8bd4b6d716a0b638a45329274ac01355b1dd9 not found: ID does not exist" Dec 03 09:48:41 crc kubenswrapper[4947]: I1203 09:48:41.097683 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" path="/var/lib/kubelet/pods/209d42bc-c6ea-43d6-8b96-ed6fd1681cb5/volumes" Dec 03 09:48:44 crc kubenswrapper[4947]: I1203 09:48:44.083864 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:48:44 crc kubenswrapper[4947]: E1203 09:48:44.084667 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:48:49 crc kubenswrapper[4947]: I1203 09:48:49.052814 
4947 generic.go:334] "Generic (PLEG): container finished" podID="aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f" containerID="12dd51f117d3983ddff2083dd4f4b96060c3f37c0f37c66a4def6a5b946ac237" exitCode=0 Dec 03 09:48:49 crc kubenswrapper[4947]: I1203 09:48:49.052857 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" event={"ID":"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f","Type":"ContainerDied","Data":"12dd51f117d3983ddff2083dd4f4b96060c3f37c0f37c66a4def6a5b946ac237"} Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.590922 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.711230 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-ssh-key\") pod \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.711376 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-agent-neutron-config-0\") pod \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.711484 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-combined-ca-bundle\") pod \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.711559 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-inventory\") pod \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.711595 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w99p2\" (UniqueName: \"kubernetes.io/projected/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-kube-api-access-w99p2\") pod \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\" (UID: \"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f\") " Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.736655 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f" (UID: "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.750709 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-kube-api-access-w99p2" (OuterVolumeSpecName: "kube-api-access-w99p2") pod "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f" (UID: "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f"). InnerVolumeSpecName "kube-api-access-w99p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.771213 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f" (UID: "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.793916 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f" (UID: "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.799635 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-inventory" (OuterVolumeSpecName: "inventory") pod "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f" (UID: "aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.818863 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.818904 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.818915 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w99p2\" (UniqueName: \"kubernetes.io/projected/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-kube-api-access-w99p2\") on node \"crc\" DevicePath \"\"" Dec 03 09:48:50 crc kubenswrapper[4947]: I1203 09:48:50.818929 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:48:50 crc 
kubenswrapper[4947]: I1203 09:48:50.818937 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:48:51 crc kubenswrapper[4947]: I1203 09:48:51.082192 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" Dec 03 09:48:51 crc kubenswrapper[4947]: I1203 09:48:51.094033 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-r5zr9" event={"ID":"aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f","Type":"ContainerDied","Data":"8ed40441a392df7059ad9750f5533f62644ddc38360a614a214d50770b92f27c"} Dec 03 09:48:51 crc kubenswrapper[4947]: I1203 09:48:51.094077 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ed40441a392df7059ad9750f5533f62644ddc38360a614a214d50770b92f27c" Dec 03 09:48:56 crc kubenswrapper[4947]: I1203 09:48:56.083313 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:48:56 crc kubenswrapper[4947]: E1203 09:48:56.083996 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:49:09 crc kubenswrapper[4947]: I1203 09:49:09.099162 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:49:09 crc kubenswrapper[4947]: E1203 09:49:09.100384 4947 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:49:23 crc kubenswrapper[4947]: I1203 09:49:23.083921 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:49:23 crc kubenswrapper[4947]: E1203 09:49:23.084707 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.517712 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mg856"] Dec 03 09:49:28 crc kubenswrapper[4947]: E1203 09:49:28.519408 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.519443 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 03 09:49:28 crc kubenswrapper[4947]: E1203 09:49:28.519486 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerName="extract-utilities" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.519538 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerName="extract-utilities" Dec 03 09:49:28 crc kubenswrapper[4947]: E1203 09:49:28.519581 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerName="extract-content" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.519602 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerName="extract-content" Dec 03 09:49:28 crc kubenswrapper[4947]: E1203 09:49:28.519668 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerName="registry-server" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.519684 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerName="registry-server" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.520215 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.520257 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="209d42bc-c6ea-43d6-8b96-ed6fd1681cb5" containerName="registry-server" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.524201 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.531953 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mg856"] Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.596555 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpbz5\" (UniqueName: \"kubernetes.io/projected/54ba1b59-3e43-4354-bd44-006cfb0b69a2-kube-api-access-dpbz5\") pod \"certified-operators-mg856\" (UID: \"54ba1b59-3e43-4354-bd44-006cfb0b69a2\") " pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.596624 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ba1b59-3e43-4354-bd44-006cfb0b69a2-catalog-content\") pod \"certified-operators-mg856\" (UID: \"54ba1b59-3e43-4354-bd44-006cfb0b69a2\") " pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.596652 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ba1b59-3e43-4354-bd44-006cfb0b69a2-utilities\") pod \"certified-operators-mg856\" (UID: \"54ba1b59-3e43-4354-bd44-006cfb0b69a2\") " pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.699121 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpbz5\" (UniqueName: \"kubernetes.io/projected/54ba1b59-3e43-4354-bd44-006cfb0b69a2-kube-api-access-dpbz5\") pod \"certified-operators-mg856\" (UID: \"54ba1b59-3e43-4354-bd44-006cfb0b69a2\") " pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.699538 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ba1b59-3e43-4354-bd44-006cfb0b69a2-catalog-content\") pod \"certified-operators-mg856\" (UID: \"54ba1b59-3e43-4354-bd44-006cfb0b69a2\") " pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.699556 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ba1b59-3e43-4354-bd44-006cfb0b69a2-utilities\") pod \"certified-operators-mg856\" (UID: \"54ba1b59-3e43-4354-bd44-006cfb0b69a2\") " pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.699961 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54ba1b59-3e43-4354-bd44-006cfb0b69a2-utilities\") pod \"certified-operators-mg856\" (UID: \"54ba1b59-3e43-4354-bd44-006cfb0b69a2\") " pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.700468 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54ba1b59-3e43-4354-bd44-006cfb0b69a2-catalog-content\") pod \"certified-operators-mg856\" (UID: \"54ba1b59-3e43-4354-bd44-006cfb0b69a2\") " pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.720927 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpbz5\" (UniqueName: \"kubernetes.io/projected/54ba1b59-3e43-4354-bd44-006cfb0b69a2-kube-api-access-dpbz5\") pod \"certified-operators-mg856\" (UID: \"54ba1b59-3e43-4354-bd44-006cfb0b69a2\") " pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:28 crc kubenswrapper[4947]: I1203 09:49:28.852791 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:29 crc kubenswrapper[4947]: I1203 09:49:29.426586 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mg856"] Dec 03 09:49:29 crc kubenswrapper[4947]: I1203 09:49:29.493013 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg856" event={"ID":"54ba1b59-3e43-4354-bd44-006cfb0b69a2","Type":"ContainerStarted","Data":"0eb6c258eb0ef84cc47c98f5cbbb1c8b0ba45b406791d6a2c3a7516950d88bc7"} Dec 03 09:49:30 crc kubenswrapper[4947]: I1203 09:49:30.512928 4947 generic.go:334] "Generic (PLEG): container finished" podID="54ba1b59-3e43-4354-bd44-006cfb0b69a2" containerID="df624a883b9d9b3a5972c9d36755c4a8275d738aee230f4c7cc008157a83aa99" exitCode=0 Dec 03 09:49:30 crc kubenswrapper[4947]: I1203 09:49:30.513027 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg856" event={"ID":"54ba1b59-3e43-4354-bd44-006cfb0b69a2","Type":"ContainerDied","Data":"df624a883b9d9b3a5972c9d36755c4a8275d738aee230f4c7cc008157a83aa99"} Dec 03 09:49:34 crc kubenswrapper[4947]: I1203 09:49:34.558551 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg856" event={"ID":"54ba1b59-3e43-4354-bd44-006cfb0b69a2","Type":"ContainerStarted","Data":"c56d733147f51cea3dc0d72197b97f78a9d690c0d0bcd737aae947dd3169d9f8"} Dec 03 09:49:35 crc kubenswrapper[4947]: I1203 09:49:35.574235 4947 generic.go:334] "Generic (PLEG): container finished" podID="54ba1b59-3e43-4354-bd44-006cfb0b69a2" containerID="c56d733147f51cea3dc0d72197b97f78a9d690c0d0bcd737aae947dd3169d9f8" exitCode=0 Dec 03 09:49:35 crc kubenswrapper[4947]: I1203 09:49:35.574306 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg856" 
event={"ID":"54ba1b59-3e43-4354-bd44-006cfb0b69a2","Type":"ContainerDied","Data":"c56d733147f51cea3dc0d72197b97f78a9d690c0d0bcd737aae947dd3169d9f8"} Dec 03 09:49:35 crc kubenswrapper[4947]: I1203 09:49:35.578436 4947 generic.go:334] "Generic (PLEG): container finished" podID="93ccd72a-194e-4207-bdfc-ed594d06f81c" containerID="dee2b8e0049e39eacab16174d79077b63c3234496003d017eea4b1df59d94b7d" exitCode=0 Dec 03 09:49:35 crc kubenswrapper[4947]: I1203 09:49:35.578475 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" event={"ID":"93ccd72a-194e-4207-bdfc-ed594d06f81c","Type":"ContainerDied","Data":"dee2b8e0049e39eacab16174d79077b63c3234496003d017eea4b1df59d94b7d"} Dec 03 09:49:36 crc kubenswrapper[4947]: I1203 09:49:36.083862 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:49:36 crc kubenswrapper[4947]: I1203 09:49:36.596284 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mg856" event={"ID":"54ba1b59-3e43-4354-bd44-006cfb0b69a2","Type":"ContainerStarted","Data":"b72bb4b85a5c2430f9cf905ed4e0738309bd54d06939aaacfae6212c537bca0a"} Dec 03 09:49:36 crc kubenswrapper[4947]: I1203 09:49:36.599058 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"eab5bbfe62ec2d95d68078d7965923cc0022d2b127d4947809beac88e6b7ad80"} Dec 03 09:49:36 crc kubenswrapper[4947]: I1203 09:49:36.620516 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mg856" podStartSLOduration=3.171143872 podStartE2EDuration="8.620479145s" podCreationTimestamp="2025-12-03 09:49:28 +0000 UTC" firstStartedPulling="2025-12-03 09:49:30.51511859 +0000 UTC m=+10831.776073026" lastFinishedPulling="2025-12-03 
09:49:35.964453823 +0000 UTC m=+10837.225408299" observedRunningTime="2025-12-03 09:49:36.611905374 +0000 UTC m=+10837.872859820" watchObservedRunningTime="2025-12-03 09:49:36.620479145 +0000 UTC m=+10837.881433571" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.609270 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" event={"ID":"93ccd72a-194e-4207-bdfc-ed594d06f81c","Type":"ContainerDied","Data":"e616ee4e96cfbb9a61e117964bdb51be81378030eb09edad29c0c7c82e6bc666"} Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.610660 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e616ee4e96cfbb9a61e117964bdb51be81378030eb09edad29c0c7c82e6bc666" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.658758 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.837159 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-ssh-key\") pod \"93ccd72a-194e-4207-bdfc-ed594d06f81c\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.837291 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-combined-ca-bundle\") pod \"93ccd72a-194e-4207-bdfc-ed594d06f81c\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.837368 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-agent-neutron-config-0\") pod 
\"93ccd72a-194e-4207-bdfc-ed594d06f81c\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.837569 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kfkv\" (UniqueName: \"kubernetes.io/projected/93ccd72a-194e-4207-bdfc-ed594d06f81c-kube-api-access-7kfkv\") pod \"93ccd72a-194e-4207-bdfc-ed594d06f81c\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.837621 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-inventory\") pod \"93ccd72a-194e-4207-bdfc-ed594d06f81c\" (UID: \"93ccd72a-194e-4207-bdfc-ed594d06f81c\") " Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.842525 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "93ccd72a-194e-4207-bdfc-ed594d06f81c" (UID: "93ccd72a-194e-4207-bdfc-ed594d06f81c"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.843923 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ccd72a-194e-4207-bdfc-ed594d06f81c-kube-api-access-7kfkv" (OuterVolumeSpecName: "kube-api-access-7kfkv") pod "93ccd72a-194e-4207-bdfc-ed594d06f81c" (UID: "93ccd72a-194e-4207-bdfc-ed594d06f81c"). InnerVolumeSpecName "kube-api-access-7kfkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.865714 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "93ccd72a-194e-4207-bdfc-ed594d06f81c" (UID: "93ccd72a-194e-4207-bdfc-ed594d06f81c"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.874304 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-inventory" (OuterVolumeSpecName: "inventory") pod "93ccd72a-194e-4207-bdfc-ed594d06f81c" (UID: "93ccd72a-194e-4207-bdfc-ed594d06f81c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.874731 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "93ccd72a-194e-4207-bdfc-ed594d06f81c" (UID: "93ccd72a-194e-4207-bdfc-ed594d06f81c"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.940453 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kfkv\" (UniqueName: \"kubernetes.io/projected/93ccd72a-194e-4207-bdfc-ed594d06f81c-kube-api-access-7kfkv\") on node \"crc\" DevicePath \"\"" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.940509 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.940522 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.940534 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:49:37 crc kubenswrapper[4947]: I1203 09:49:37.940546 4947 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/93ccd72a-194e-4207-bdfc-ed594d06f81c-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:49:38 crc kubenswrapper[4947]: I1203 09:49:38.617828 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell2-bhnzg" Dec 03 09:49:38 crc kubenswrapper[4947]: I1203 09:49:38.853636 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:38 crc kubenswrapper[4947]: I1203 09:49:38.853687 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:38 crc kubenswrapper[4947]: I1203 09:49:38.925669 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:48 crc kubenswrapper[4947]: I1203 09:49:48.909721 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mg856" Dec 03 09:49:48 crc kubenswrapper[4947]: I1203 09:49:48.990993 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mg856"] Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.049807 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-njbpj"] Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.050354 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-njbpj" podUID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerName="registry-server" containerID="cri-o://76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c" gracePeriod=2 Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.603358 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-njbpj" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.691351 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-catalog-content\") pod \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.691469 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-utilities\") pod \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.691693 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h895h\" (UniqueName: \"kubernetes.io/projected/44957a4e-7c93-40c4-b32d-bd0439ba60ca-kube-api-access-h895h\") pod \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\" (UID: \"44957a4e-7c93-40c4-b32d-bd0439ba60ca\") " Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.693009 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-utilities" (OuterVolumeSpecName: "utilities") pod "44957a4e-7c93-40c4-b32d-bd0439ba60ca" (UID: "44957a4e-7c93-40c4-b32d-bd0439ba60ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.698165 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44957a4e-7c93-40c4-b32d-bd0439ba60ca-kube-api-access-h895h" (OuterVolumeSpecName: "kube-api-access-h895h") pod "44957a4e-7c93-40c4-b32d-bd0439ba60ca" (UID: "44957a4e-7c93-40c4-b32d-bd0439ba60ca"). InnerVolumeSpecName "kube-api-access-h895h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.755535 4947 generic.go:334] "Generic (PLEG): container finished" podID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerID="76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c" exitCode=0 Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.757240 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-njbpj" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.757631 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njbpj" event={"ID":"44957a4e-7c93-40c4-b32d-bd0439ba60ca","Type":"ContainerDied","Data":"76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c"} Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.757723 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-njbpj" event={"ID":"44957a4e-7c93-40c4-b32d-bd0439ba60ca","Type":"ContainerDied","Data":"9c2ee8212e4183fba40611289a97bab3252daa84ec8f8200c8f5f11878341f00"} Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.757749 4947 scope.go:117] "RemoveContainer" containerID="76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.765859 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44957a4e-7c93-40c4-b32d-bd0439ba60ca" (UID: "44957a4e-7c93-40c4-b32d-bd0439ba60ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.793916 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.793952 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44957a4e-7c93-40c4-b32d-bd0439ba60ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.793966 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h895h\" (UniqueName: \"kubernetes.io/projected/44957a4e-7c93-40c4-b32d-bd0439ba60ca-kube-api-access-h895h\") on node \"crc\" DevicePath \"\"" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.799415 4947 scope.go:117] "RemoveContainer" containerID="e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.826474 4947 scope.go:117] "RemoveContainer" containerID="255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.930487 4947 scope.go:117] "RemoveContainer" containerID="76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c" Dec 03 09:49:49 crc kubenswrapper[4947]: E1203 09:49:49.930988 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c\": container with ID starting with 76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c not found: ID does not exist" containerID="76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.931026 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c"} err="failed to get container status \"76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c\": rpc error: code = NotFound desc = could not find container \"76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c\": container with ID starting with 76fd801c795820ec066b4e1afafb5e5889de627b679860ca80d188e3f7c8253c not found: ID does not exist" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.931051 4947 scope.go:117] "RemoveContainer" containerID="e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e" Dec 03 09:49:49 crc kubenswrapper[4947]: E1203 09:49:49.931558 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e\": container with ID starting with e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e not found: ID does not exist" containerID="e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.931603 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e"} err="failed to get container status \"e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e\": rpc error: code = NotFound desc = could not find container \"e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e\": container with ID starting with e3dbb0e8c27af4e8f883956e645d276f4e32b10e6badd7ff1dabe57e4ac3d41e not found: ID does not exist" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.931634 4947 scope.go:117] "RemoveContainer" containerID="255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da" Dec 03 09:49:49 crc kubenswrapper[4947]: E1203 09:49:49.932190 4947 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da\": container with ID starting with 255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da not found: ID does not exist" containerID="255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da" Dec 03 09:49:49 crc kubenswrapper[4947]: I1203 09:49:49.932214 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da"} err="failed to get container status \"255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da\": rpc error: code = NotFound desc = could not find container \"255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da\": container with ID starting with 255f5c6510ee512041a618de670f5d5b0ed92032dfac351296125202676cf8da not found: ID does not exist" Dec 03 09:49:50 crc kubenswrapper[4947]: I1203 09:49:50.094441 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-njbpj"] Dec 03 09:49:50 crc kubenswrapper[4947]: I1203 09:49:50.113882 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-njbpj"] Dec 03 09:49:51 crc kubenswrapper[4947]: I1203 09:49:51.097578 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" path="/var/lib/kubelet/pods/44957a4e-7c93-40c4-b32d-bd0439ba60ca/volumes" Dec 03 09:50:07 crc kubenswrapper[4947]: I1203 09:50:07.572314 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 09:50:07 crc kubenswrapper[4947]: I1203 09:50:07.573079 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="10daa16c-8d05-423d-8e47-760b529a5925" containerName="nova-cell0-conductor-conductor" 
containerID="cri-o://8aaf46bab0cf2bb2435d235f20745f19140a0b89bcedd1807ca49fe7294a0b7d" gracePeriod=30 Dec 03 09:50:07 crc kubenswrapper[4947]: I1203 09:50:07.602543 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 09:50:07 crc kubenswrapper[4947]: I1203 09:50:07.602817 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e57f73cf-4f9a-4c56-b146-af55d546e1b0" containerName="nova-cell1-conductor-conductor" containerID="cri-o://2a9eb8d59aac80d11abeeb6f04e66b2aefa7cae7495c7ba476fbe71c58f7878c" gracePeriod=30 Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.701606 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv"] Dec 03 09:50:08 crc kubenswrapper[4947]: E1203 09:50:08.702395 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerName="extract-utilities" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.702408 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerName="extract-utilities" Dec 03 09:50:08 crc kubenswrapper[4947]: E1203 09:50:08.702431 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerName="registry-server" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.702437 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerName="registry-server" Dec 03 09:50:08 crc kubenswrapper[4947]: E1203 09:50:08.702461 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerName="extract-content" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.702467 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" 
containerName="extract-content" Dec 03 09:50:08 crc kubenswrapper[4947]: E1203 09:50:08.702477 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ccd72a-194e-4207-bdfc-ed594d06f81c" containerName="neutron-dhcp-openstack-openstack-cell2" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.702483 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ccd72a-194e-4207-bdfc-ed594d06f81c" containerName="neutron-dhcp-openstack-openstack-cell2" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.702721 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ccd72a-194e-4207-bdfc-ed594d06f81c" containerName="neutron-dhcp-openstack-openstack-cell2" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.702743 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="44957a4e-7c93-40c4-b32d-bd0439ba60ca" containerName="registry-server" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.703574 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.709063 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.709307 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.709463 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.709615 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.709852 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.710016 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.710135 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rfmtm" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.729083 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv"] Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.802778 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5ltm\" (UniqueName: \"kubernetes.io/projected/9b2e7f66-e905-447d-b029-e27add784fcc-kube-api-access-p5ltm\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.802902 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.802937 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.803167 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.803189 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.803286 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.803317 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.803387 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.803437 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.816766 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell2-conductor-0"] Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.816978 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell2-conductor-0" podUID="d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" containerName="nova-cell2-conductor-conductor" containerID="cri-o://001dfcd357ce07fb6bceb9ec56d451cdfd1de5ea4845b9c29d545b7d0afe7bfb" gracePeriod=30 Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.906242 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.906310 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.906376 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.906404 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.906531 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.906561 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.906636 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc 
kubenswrapper[4947]: I1203 09:50:08.906690 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.906742 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5ltm\" (UniqueName: \"kubernetes.io/projected/9b2e7f66-e905-447d-b029-e27add784fcc-kube-api-access-p5ltm\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.908664 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.923195 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.924010 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.924845 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.925376 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.926112 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.931403 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-0\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.932083 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.935246 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5ltm\" (UniqueName: \"kubernetes.io/projected/9b2e7f66-e905-447d-b029-e27add784fcc-kube-api-access-p5ltm\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.981736 4947 generic.go:334] "Generic (PLEG): container finished" podID="e57f73cf-4f9a-4c56-b146-af55d546e1b0" containerID="2a9eb8d59aac80d11abeeb6f04e66b2aefa7cae7495c7ba476fbe71c58f7878c" exitCode=0 Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.981832 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e57f73cf-4f9a-4c56-b146-af55d546e1b0","Type":"ContainerDied","Data":"2a9eb8d59aac80d11abeeb6f04e66b2aefa7cae7495c7ba476fbe71c58f7878c"} Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.981861 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"e57f73cf-4f9a-4c56-b146-af55d546e1b0","Type":"ContainerDied","Data":"3a402e77faaed21b896b61384dd12c5d4b20e714bd4e8be603286de7ce5e1212"} Dec 03 09:50:08 crc kubenswrapper[4947]: I1203 09:50:08.981872 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a402e77faaed21b896b61384dd12c5d4b20e714bd4e8be603286de7ce5e1212" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.006009 4947 generic.go:334] "Generic (PLEG): container finished" podID="10daa16c-8d05-423d-8e47-760b529a5925" containerID="8aaf46bab0cf2bb2435d235f20745f19140a0b89bcedd1807ca49fe7294a0b7d" exitCode=0 Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.006081 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"10daa16c-8d05-423d-8e47-760b529a5925","Type":"ContainerDied","Data":"8aaf46bab0cf2bb2435d235f20745f19140a0b89bcedd1807ca49fe7294a0b7d"} Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.039712 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.040035 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-log" containerID="cri-o://55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573" gracePeriod=30 Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.047110 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-api" containerID="cri-o://f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73" gracePeriod=30 Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.064415 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.065445 4947 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-log" containerID="cri-o://a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d" gracePeriod=30 Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.065926 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-metadata" containerID="cri-o://a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c" gracePeriod=30 Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.069714 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.097449 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.120299 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-combined-ca-bundle\") pod \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.120360 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-config-data\") pod \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.120448 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vthv\" (UniqueName: 
\"kubernetes.io/projected/e57f73cf-4f9a-4c56-b146-af55d546e1b0-kube-api-access-9vthv\") pod \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\" (UID: \"e57f73cf-4f9a-4c56-b146-af55d546e1b0\") " Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.128077 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57f73cf-4f9a-4c56-b146-af55d546e1b0-kube-api-access-9vthv" (OuterVolumeSpecName: "kube-api-access-9vthv") pod "e57f73cf-4f9a-4c56-b146-af55d546e1b0" (UID: "e57f73cf-4f9a-4c56-b146-af55d546e1b0"). InnerVolumeSpecName "kube-api-access-9vthv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.143538 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2"] Dec 03 09:50:09 crc kubenswrapper[4947]: E1203 09:50:09.144023 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57f73cf-4f9a-4c56-b146-af55d546e1b0" containerName="nova-cell1-conductor-conductor" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.144045 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57f73cf-4f9a-4c56-b146-af55d546e1b0" containerName="nova-cell1-conductor-conductor" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.144477 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e57f73cf-4f9a-4c56-b146-af55d546e1b0" containerName="nova-cell1-conductor-conductor" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.145211 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.145380 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f74cba1c-b50e-4d6c-9d24-6ff44ce942e6" containerName="nova-scheduler-scheduler" containerID="cri-o://1aced893e95341bb3205979e0183c7c041790dd5f2a3ef9804bb24f8aef24fcf" gracePeriod=30 
Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.145573 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.152074 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.152264 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell2-dockercfg-cl4m2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.152548 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-compute-config" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.194604 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-config-data" (OuterVolumeSpecName: "config-data") pod "e57f73cf-4f9a-4c56-b146-af55d546e1b0" (UID: "e57f73cf-4f9a-4c56-b146-af55d546e1b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.197279 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2"] Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.229742 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-1\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.229823 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell2-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-combined-ca-bundle\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.229920 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-0\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.230835 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6wjv\" (UniqueName: 
\"kubernetes.io/projected/2c41415b-4450-464c-bdcf-16da30cc83bb-kube-api-access-s6wjv\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.231272 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-1\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.231600 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-0\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.231764 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-ssh-key\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.231808 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cells-global-config-0\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.232048 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-inventory\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.232409 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.232426 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vthv\" (UniqueName: \"kubernetes.io/projected/e57f73cf-4f9a-4c56-b146-af55d546e1b0-kube-api-access-9vthv\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.233883 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e57f73cf-4f9a-4c56-b146-af55d546e1b0" (UID: "e57f73cf-4f9a-4c56-b146-af55d546e1b0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:09 crc kubenswrapper[4947]: E1203 09:50:09.277897 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="001dfcd357ce07fb6bceb9ec56d451cdfd1de5ea4845b9c29d545b7d0afe7bfb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 09:50:09 crc kubenswrapper[4947]: E1203 09:50:09.281259 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="001dfcd357ce07fb6bceb9ec56d451cdfd1de5ea4845b9c29d545b7d0afe7bfb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 09:50:09 crc kubenswrapper[4947]: E1203 09:50:09.301474 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="001dfcd357ce07fb6bceb9ec56d451cdfd1de5ea4845b9c29d545b7d0afe7bfb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 09:50:09 crc kubenswrapper[4947]: E1203 09:50:09.301626 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell2-conductor-0" podUID="d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" containerName="nova-cell2-conductor-conductor" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.325901 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.337834 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-0\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.338324 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6wjv\" (UniqueName: \"kubernetes.io/projected/2c41415b-4450-464c-bdcf-16da30cc83bb-kube-api-access-s6wjv\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.338475 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-1\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.338620 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-0\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc 
kubenswrapper[4947]: I1203 09:50:09.341440 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-ssh-key\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.341529 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cells-global-config-0\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.341610 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-inventory\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.341702 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-1\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.341746 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell2-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-combined-ca-bundle\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.341919 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e57f73cf-4f9a-4c56-b146-af55d546e1b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.343766 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cells-global-config-0\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.347842 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell2-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-combined-ca-bundle\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.349141 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-ssh-key\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " 
pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.350675 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-1\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.351826 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-inventory\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.359238 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-0\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.361110 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-1\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.365135 4947 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-0\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.365627 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6wjv\" (UniqueName: \"kubernetes.io/projected/2c41415b-4450-464c-bdcf-16da30cc83bb-kube-api-access-s6wjv\") pod \"nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.443638 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-combined-ca-bundle\") pod \"10daa16c-8d05-423d-8e47-760b529a5925\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.443694 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-config-data\") pod \"10daa16c-8d05-423d-8e47-760b529a5925\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.443751 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm8vw\" (UniqueName: \"kubernetes.io/projected/10daa16c-8d05-423d-8e47-760b529a5925-kube-api-access-nm8vw\") pod \"10daa16c-8d05-423d-8e47-760b529a5925\" (UID: \"10daa16c-8d05-423d-8e47-760b529a5925\") " Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 
09:50:09.447730 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10daa16c-8d05-423d-8e47-760b529a5925-kube-api-access-nm8vw" (OuterVolumeSpecName: "kube-api-access-nm8vw") pod "10daa16c-8d05-423d-8e47-760b529a5925" (UID: "10daa16c-8d05-423d-8e47-760b529a5925"). InnerVolumeSpecName "kube-api-access-nm8vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.470329 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10daa16c-8d05-423d-8e47-760b529a5925" (UID: "10daa16c-8d05-423d-8e47-760b529a5925"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.487719 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-config-data" (OuterVolumeSpecName: "config-data") pod "10daa16c-8d05-423d-8e47-760b529a5925" (UID: "10daa16c-8d05-423d-8e47-760b529a5925"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.497827 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.550875 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.550905 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10daa16c-8d05-423d-8e47-760b529a5925-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.550917 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nm8vw\" (UniqueName: \"kubernetes.io/projected/10daa16c-8d05-423d-8e47-760b529a5925-kube-api-access-nm8vw\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.787189 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:50:09 crc kubenswrapper[4947]: I1203 09:50:09.796831 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv"] Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.045182 4947 generic.go:334] "Generic (PLEG): container finished" podID="212bbca4-059a-419a-acf3-984c087a0f5d" containerID="55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573" exitCode=143 Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.045296 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"212bbca4-059a-419a-acf3-984c087a0f5d","Type":"ContainerDied","Data":"55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573"} Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.049409 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" event={"ID":"9b2e7f66-e905-447d-b029-e27add784fcc","Type":"ContainerStarted","Data":"de2aeebcb7931d45e4c2ffe421d31a2ca0e6898456d4d183ca80aca54a1ab506"} Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.051786 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"10daa16c-8d05-423d-8e47-760b529a5925","Type":"ContainerDied","Data":"6502592d951f50066be86639aea2c344bf75ff34bc87f4c9c33c6db3b5ad5704"} Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.051823 4947 scope.go:117] "RemoveContainer" containerID="8aaf46bab0cf2bb2435d235f20745f19140a0b89bcedd1807ca49fe7294a0b7d" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.051907 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.074557 4947 generic.go:334] "Generic (PLEG): container finished" podID="f74cba1c-b50e-4d6c-9d24-6ff44ce942e6" containerID="1aced893e95341bb3205979e0183c7c041790dd5f2a3ef9804bb24f8aef24fcf" exitCode=0 Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.074603 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6","Type":"ContainerDied","Data":"1aced893e95341bb3205979e0183c7c041790dd5f2a3ef9804bb24f8aef24fcf"} Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.114063 4947 generic.go:334] "Generic (PLEG): container finished" podID="d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" containerID="001dfcd357ce07fb6bceb9ec56d451cdfd1de5ea4845b9c29d545b7d0afe7bfb" exitCode=0 Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.114391 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-0" 
event={"ID":"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed","Type":"ContainerDied","Data":"001dfcd357ce07fb6bceb9ec56d451cdfd1de5ea4845b9c29d545b7d0afe7bfb"} Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.127756 4947 generic.go:334] "Generic (PLEG): container finished" podID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerID="a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d" exitCode=143 Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.127872 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.128586 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ec54e67-38f1-44ed-a365-932f4103b99a","Type":"ContainerDied","Data":"a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d"} Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.145540 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.185545 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.201277 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.264626 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 09:50:10 crc kubenswrapper[4947]: E1203 09:50:10.265110 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" containerName="nova-cell2-conductor-conductor" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.265124 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" containerName="nova-cell2-conductor-conductor" Dec 03 09:50:10 crc kubenswrapper[4947]: E1203 09:50:10.265147 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10daa16c-8d05-423d-8e47-760b529a5925" containerName="nova-cell0-conductor-conductor" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.265153 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="10daa16c-8d05-423d-8e47-760b529a5925" containerName="nova-cell0-conductor-conductor" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.265372 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="10daa16c-8d05-423d-8e47-760b529a5925" containerName="nova-cell0-conductor-conductor" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.265391 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" containerName="nova-cell2-conductor-conductor" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.266180 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.274541 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llgrr\" (UniqueName: \"kubernetes.io/projected/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-kube-api-access-llgrr\") pod \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.274732 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-config-data\") pod \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.274779 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-combined-ca-bundle\") pod \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\" (UID: \"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed\") " Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.277130 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.295949 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-kube-api-access-llgrr" (OuterVolumeSpecName: "kube-api-access-llgrr") pod "d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" (UID: "d13b2965-2e64-4e8d-a42a-48bdcb8c5bed"). InnerVolumeSpecName "kube-api-access-llgrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.325605 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-config-data" (OuterVolumeSpecName: "config-data") pod "d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" (UID: "d13b2965-2e64-4e8d-a42a-48bdcb8c5bed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.367659 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" (UID: "d13b2965-2e64-4e8d-a42a-48bdcb8c5bed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.378764 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04ce99a-4178-4396-b6d3-c6485432d85f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d04ce99a-4178-4396-b6d3-c6485432d85f\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.378822 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8cj5\" (UniqueName: \"kubernetes.io/projected/d04ce99a-4178-4396-b6d3-c6485432d85f-kube-api-access-f8cj5\") pod \"nova-cell0-conductor-0\" (UID: \"d04ce99a-4178-4396-b6d3-c6485432d85f\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.378942 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d04ce99a-4178-4396-b6d3-c6485432d85f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d04ce99a-4178-4396-b6d3-c6485432d85f\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.379032 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.379043 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.379054 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llgrr\" (UniqueName: \"kubernetes.io/projected/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed-kube-api-access-llgrr\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.388562 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2"] Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.399436 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.462088 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.474842 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.481342 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8cj5\" (UniqueName: \"kubernetes.io/projected/d04ce99a-4178-4396-b6d3-c6485432d85f-kube-api-access-f8cj5\") pod \"nova-cell0-conductor-0\" (UID: 
\"d04ce99a-4178-4396-b6d3-c6485432d85f\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.481637 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04ce99a-4178-4396-b6d3-c6485432d85f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d04ce99a-4178-4396-b6d3-c6485432d85f\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.481792 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04ce99a-4178-4396-b6d3-c6485432d85f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d04ce99a-4178-4396-b6d3-c6485432d85f\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.487778 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d04ce99a-4178-4396-b6d3-c6485432d85f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"d04ce99a-4178-4396-b6d3-c6485432d85f\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.489465 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d04ce99a-4178-4396-b6d3-c6485432d85f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"d04ce99a-4178-4396-b6d3-c6485432d85f\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.501788 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.504205 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.508945 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8cj5\" (UniqueName: \"kubernetes.io/projected/d04ce99a-4178-4396-b6d3-c6485432d85f-kube-api-access-f8cj5\") pod \"nova-cell0-conductor-0\" (UID: \"d04ce99a-4178-4396-b6d3-c6485432d85f\") " pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.510301 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.524969 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.593564 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ktpq\" (UniqueName: \"kubernetes.io/projected/abc69d54-9ae1-4797-b8c8-9beb82eacff3-kube-api-access-2ktpq\") pod \"nova-cell1-conductor-0\" (UID: \"abc69d54-9ae1-4797-b8c8-9beb82eacff3\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.593905 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc69d54-9ae1-4797-b8c8-9beb82eacff3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"abc69d54-9ae1-4797-b8c8-9beb82eacff3\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.593948 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc69d54-9ae1-4797-b8c8-9beb82eacff3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"abc69d54-9ae1-4797-b8c8-9beb82eacff3\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc 
kubenswrapper[4947]: I1203 09:50:10.622914 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.695745 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc69d54-9ae1-4797-b8c8-9beb82eacff3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"abc69d54-9ae1-4797-b8c8-9beb82eacff3\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.696380 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc69d54-9ae1-4797-b8c8-9beb82eacff3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"abc69d54-9ae1-4797-b8c8-9beb82eacff3\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.696572 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ktpq\" (UniqueName: \"kubernetes.io/projected/abc69d54-9ae1-4797-b8c8-9beb82eacff3-kube-api-access-2ktpq\") pod \"nova-cell1-conductor-0\" (UID: \"abc69d54-9ae1-4797-b8c8-9beb82eacff3\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.699461 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc69d54-9ae1-4797-b8c8-9beb82eacff3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"abc69d54-9ae1-4797-b8c8-9beb82eacff3\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.699913 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc69d54-9ae1-4797-b8c8-9beb82eacff3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"abc69d54-9ae1-4797-b8c8-9beb82eacff3\") " 
pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.711554 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ktpq\" (UniqueName: \"kubernetes.io/projected/abc69d54-9ae1-4797-b8c8-9beb82eacff3-kube-api-access-2ktpq\") pod \"nova-cell1-conductor-0\" (UID: \"abc69d54-9ae1-4797-b8c8-9beb82eacff3\") " pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.783691 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.818405 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.906111 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-combined-ca-bundle\") pod \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.906313 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpcr6\" (UniqueName: \"kubernetes.io/projected/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-kube-api-access-fpcr6\") pod \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.906518 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-config-data\") pod \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\" (UID: \"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6\") " Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.919807 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-kube-api-access-fpcr6" (OuterVolumeSpecName: "kube-api-access-fpcr6") pod "f74cba1c-b50e-4d6c-9d24-6ff44ce942e6" (UID: "f74cba1c-b50e-4d6c-9d24-6ff44ce942e6"). InnerVolumeSpecName "kube-api-access-fpcr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.954356 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f74cba1c-b50e-4d6c-9d24-6ff44ce942e6" (UID: "f74cba1c-b50e-4d6c-9d24-6ff44ce942e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:10 crc kubenswrapper[4947]: I1203 09:50:10.961946 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-config-data" (OuterVolumeSpecName: "config-data") pod "f74cba1c-b50e-4d6c-9d24-6ff44ce942e6" (UID: "f74cba1c-b50e-4d6c-9d24-6ff44ce942e6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.010421 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpcr6\" (UniqueName: \"kubernetes.io/projected/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-kube-api-access-fpcr6\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.010462 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.010527 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.102986 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10daa16c-8d05-423d-8e47-760b529a5925" path="/var/lib/kubelet/pods/10daa16c-8d05-423d-8e47-760b529a5925/volumes" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.105946 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57f73cf-4f9a-4c56-b146-af55d546e1b0" path="/var/lib/kubelet/pods/e57f73cf-4f9a-4c56-b146-af55d546e1b0/volumes" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.106998 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.145311 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-0" event={"ID":"d13b2965-2e64-4e8d-a42a-48bdcb8c5bed","Type":"ContainerDied","Data":"1a1fa8c9acb43c48a801341b48b69c013c6d97545e6e207771a93d4dbfc335ee"} Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.145382 4947 scope.go:117] "RemoveContainer" 
containerID="001dfcd357ce07fb6bceb9ec56d451cdfd1de5ea4845b9c29d545b7d0afe7bfb" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.145959 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.147734 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" event={"ID":"2c41415b-4450-464c-bdcf-16da30cc83bb","Type":"ContainerStarted","Data":"a7362928763435d747ec7855af8b8e364d7b0633ac3a0fe3e5a7bd77eda7f394"} Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.147823 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" event={"ID":"2c41415b-4450-464c-bdcf-16da30cc83bb","Type":"ContainerStarted","Data":"b111c69a6fcbe1cd2073f5c5484f7fc89da2b576a625bd49c8c4acaa309ada6f"} Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.150795 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" event={"ID":"9b2e7f66-e905-447d-b029-e27add784fcc","Type":"ContainerStarted","Data":"c326f48e28a4a1121653d9e837a5fb700b663f5ad2ca915be5b04cea613c290d"} Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.156629 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f74cba1c-b50e-4d6c-9d24-6ff44ce942e6","Type":"ContainerDied","Data":"6fa3d3cbb76901dfc81b3c9ebc3ff5705ceb56fd3dab23b06eb163d9fd58cfc7"} Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.156640 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.160472 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d04ce99a-4178-4396-b6d3-c6485432d85f","Type":"ContainerStarted","Data":"2e4c7e2faf6a17f42f58e8b3dbc07c05531b39c8582739bdf3a99e72a89f4620"} Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.187642 4947 scope.go:117] "RemoveContainer" containerID="1aced893e95341bb3205979e0183c7c041790dd5f2a3ef9804bb24f8aef24fcf" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.197937 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell2-conductor-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.217121 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell2-conductor-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.243086 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell2-conductor-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: E1203 09:50:11.243695 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74cba1c-b50e-4d6c-9d24-6ff44ce942e6" containerName="nova-scheduler-scheduler" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.243717 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74cba1c-b50e-4d6c-9d24-6ff44ce942e6" containerName="nova-scheduler-scheduler" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.243975 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74cba1c-b50e-4d6c-9d24-6ff44ce942e6" containerName="nova-scheduler-scheduler" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.244961 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.247524 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell2-conductor-config-data" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.258302 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" podStartSLOduration=2.786742204 podStartE2EDuration="3.258283813s" podCreationTimestamp="2025-12-03 09:50:08 +0000 UTC" firstStartedPulling="2025-12-03 09:50:10.204312381 +0000 UTC m=+10871.465266807" lastFinishedPulling="2025-12-03 09:50:10.67585399 +0000 UTC m=+10871.936808416" observedRunningTime="2025-12-03 09:50:11.181636473 +0000 UTC m=+10872.442590959" watchObservedRunningTime="2025-12-03 09:50:11.258283813 +0000 UTC m=+10872.519238239" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.272171 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-conductor-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.284003 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.313739 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.324579 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc4c853-250e-435c-9c44-3d026fd5e8ca-combined-ca-bundle\") pod \"nova-cell2-conductor-0\" (UID: \"acc4c853-250e-435c-9c44-3d026fd5e8ca\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.324704 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6978f\" (UniqueName: 
\"kubernetes.io/projected/acc4c853-250e-435c-9c44-3d026fd5e8ca-kube-api-access-6978f\") pod \"nova-cell2-conductor-0\" (UID: \"acc4c853-250e-435c-9c44-3d026fd5e8ca\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.324820 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc4c853-250e-435c-9c44-3d026fd5e8ca-config-data\") pod \"nova-cell2-conductor-0\" (UID: \"acc4c853-250e-435c-9c44-3d026fd5e8ca\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.336217 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.337970 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.340959 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.364906 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.381297 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" podStartSLOduration=2.793464266 podStartE2EDuration="3.381273256s" podCreationTimestamp="2025-12-03 09:50:08 +0000 UTC" firstStartedPulling="2025-12-03 09:50:09.786987897 +0000 UTC m=+10871.047942323" lastFinishedPulling="2025-12-03 09:50:10.374796887 +0000 UTC m=+10871.635751313" observedRunningTime="2025-12-03 09:50:11.230137013 +0000 UTC m=+10872.491091439" watchObservedRunningTime="2025-12-03 09:50:11.381273256 +0000 UTC m=+10872.642227682" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.426745 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc4c853-250e-435c-9c44-3d026fd5e8ca-config-data\") pod \"nova-cell2-conductor-0\" (UID: \"acc4c853-250e-435c-9c44-3d026fd5e8ca\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.426809 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59dc211-4f20-4e7b-8e31-a387f4e4467d-config-data\") pod \"nova-scheduler-0\" (UID: \"d59dc211-4f20-4e7b-8e31-a387f4e4467d\") " pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.426878 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xj7g\" (UniqueName: \"kubernetes.io/projected/d59dc211-4f20-4e7b-8e31-a387f4e4467d-kube-api-access-8xj7g\") pod \"nova-scheduler-0\" (UID: \"d59dc211-4f20-4e7b-8e31-a387f4e4467d\") " pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.426930 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59dc211-4f20-4e7b-8e31-a387f4e4467d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d59dc211-4f20-4e7b-8e31-a387f4e4467d\") " pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.427336 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc4c853-250e-435c-9c44-3d026fd5e8ca-combined-ca-bundle\") pod \"nova-cell2-conductor-0\" (UID: \"acc4c853-250e-435c-9c44-3d026fd5e8ca\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.427457 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6978f\" 
(UniqueName: \"kubernetes.io/projected/acc4c853-250e-435c-9c44-3d026fd5e8ca-kube-api-access-6978f\") pod \"nova-cell2-conductor-0\" (UID: \"acc4c853-250e-435c-9c44-3d026fd5e8ca\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.431544 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acc4c853-250e-435c-9c44-3d026fd5e8ca-combined-ca-bundle\") pod \"nova-cell2-conductor-0\" (UID: \"acc4c853-250e-435c-9c44-3d026fd5e8ca\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.432103 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acc4c853-250e-435c-9c44-3d026fd5e8ca-config-data\") pod \"nova-cell2-conductor-0\" (UID: \"acc4c853-250e-435c-9c44-3d026fd5e8ca\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.436108 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.447203 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6978f\" (UniqueName: \"kubernetes.io/projected/acc4c853-250e-435c-9c44-3d026fd5e8ca-kube-api-access-6978f\") pod \"nova-cell2-conductor-0\" (UID: \"acc4c853-250e-435c-9c44-3d026fd5e8ca\") " pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.533522 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59dc211-4f20-4e7b-8e31-a387f4e4467d-config-data\") pod \"nova-scheduler-0\" (UID: \"d59dc211-4f20-4e7b-8e31-a387f4e4467d\") " pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.533682 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8xj7g\" (UniqueName: \"kubernetes.io/projected/d59dc211-4f20-4e7b-8e31-a387f4e4467d-kube-api-access-8xj7g\") pod \"nova-scheduler-0\" (UID: \"d59dc211-4f20-4e7b-8e31-a387f4e4467d\") " pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.533772 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59dc211-4f20-4e7b-8e31-a387f4e4467d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d59dc211-4f20-4e7b-8e31-a387f4e4467d\") " pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.538130 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59dc211-4f20-4e7b-8e31-a387f4e4467d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d59dc211-4f20-4e7b-8e31-a387f4e4467d\") " pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.541009 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59dc211-4f20-4e7b-8e31-a387f4e4467d-config-data\") pod \"nova-scheduler-0\" (UID: \"d59dc211-4f20-4e7b-8e31-a387f4e4467d\") " pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.553519 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xj7g\" (UniqueName: \"kubernetes.io/projected/d59dc211-4f20-4e7b-8e31-a387f4e4467d-kube-api-access-8xj7g\") pod \"nova-scheduler-0\" (UID: \"d59dc211-4f20-4e7b-8e31-a387f4e4467d\") " pod="openstack/nova-scheduler-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.579071 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:11 crc kubenswrapper[4947]: I1203 09:50:11.662553 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 09:50:12 crc kubenswrapper[4947]: W1203 09:50:12.170794 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacc4c853_250e_435c_9c44_3d026fd5e8ca.slice/crio-c29822ce143886a49af8ad6a15fb50f14ee6b0c6d2cb8b028fdac44d5aaa58d9 WatchSource:0}: Error finding container c29822ce143886a49af8ad6a15fb50f14ee6b0c6d2cb8b028fdac44d5aaa58d9: Status 404 returned error can't find the container with id c29822ce143886a49af8ad6a15fb50f14ee6b0c6d2cb8b028fdac44d5aaa58d9 Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.175169 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell2-conductor-0"] Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.198560 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"abc69d54-9ae1-4797-b8c8-9beb82eacff3","Type":"ContainerStarted","Data":"9a22e89866974ea511cd0b6132a84b11de8076b3c9b19382cb1b07295ab46e6d"} Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.198990 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"abc69d54-9ae1-4797-b8c8-9beb82eacff3","Type":"ContainerStarted","Data":"8fc734d64492c036b95a1da6cee4f291912923237ffb14cf3fa4ea4dbe4e9e79"} Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.199118 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.202620 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"d04ce99a-4178-4396-b6d3-c6485432d85f","Type":"ContainerStarted","Data":"b8bad6f7a4657d650ec3241fc09c7f59d139c196865b9a7c2b7ae9c62103fcc4"} Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.203586 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.221317 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.221294279 podStartE2EDuration="2.221294279s" podCreationTimestamp="2025-12-03 09:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:50:12.21469289 +0000 UTC m=+10873.475647326" watchObservedRunningTime="2025-12-03 09:50:12.221294279 +0000 UTC m=+10873.482248715" Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.261337 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.271455 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.2713138 podStartE2EDuration="2.2713138s" podCreationTimestamp="2025-12-03 09:50:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:50:12.259172622 +0000 UTC m=+10873.520127058" watchObservedRunningTime="2025-12-03 09:50:12.2713138 +0000 UTC m=+10873.532268226" Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.350223 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell3-conductor-0"] Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.350632 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell3-conductor-0" podUID="4f3cdf1c-630c-4944-ae58-c426e0d2161a" containerName="nova-cell3-conductor-conductor" containerID="cri-o://021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b" gracePeriod=30 Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.601000 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" 
podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.136:8775/\": read tcp 10.217.0.2:60308->10.217.1.136:8775: read: connection reset by peer" Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.601122 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.136:8775/\": read tcp 10.217.0.2:60294->10.217.1.136:8775: read: connection reset by peer" Dec 03 09:50:12 crc kubenswrapper[4947]: I1203 09:50:12.878572 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:12.974865 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212bbca4-059a-419a-acf3-984c087a0f5d-logs\") pod \"212bbca4-059a-419a-acf3-984c087a0f5d\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:12.975048 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9sr8\" (UniqueName: \"kubernetes.io/projected/212bbca4-059a-419a-acf3-984c087a0f5d-kube-api-access-v9sr8\") pod \"212bbca4-059a-419a-acf3-984c087a0f5d\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:12.975130 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-combined-ca-bundle\") pod \"212bbca4-059a-419a-acf3-984c087a0f5d\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:12.975169 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-config-data\") pod \"212bbca4-059a-419a-acf3-984c087a0f5d\" (UID: \"212bbca4-059a-419a-acf3-984c087a0f5d\") " Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:12.976125 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/212bbca4-059a-419a-acf3-984c087a0f5d-logs" (OuterVolumeSpecName: "logs") pod "212bbca4-059a-419a-acf3-984c087a0f5d" (UID: "212bbca4-059a-419a-acf3-984c087a0f5d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.038842 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/212bbca4-059a-419a-acf3-984c087a0f5d-kube-api-access-v9sr8" (OuterVolumeSpecName: "kube-api-access-v9sr8") pod "212bbca4-059a-419a-acf3-984c087a0f5d" (UID: "212bbca4-059a-419a-acf3-984c087a0f5d"). InnerVolumeSpecName "kube-api-access-v9sr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.065125 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-config-data" (OuterVolumeSpecName: "config-data") pod "212bbca4-059a-419a-acf3-984c087a0f5d" (UID: "212bbca4-059a-419a-acf3-984c087a0f5d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.077934 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "212bbca4-059a-419a-acf3-984c087a0f5d" (UID: "212bbca4-059a-419a-acf3-984c087a0f5d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.078017 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9sr8\" (UniqueName: \"kubernetes.io/projected/212bbca4-059a-419a-acf3-984c087a0f5d-kube-api-access-v9sr8\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.078057 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.078066 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/212bbca4-059a-419a-acf3-984c087a0f5d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.118728 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13b2965-2e64-4e8d-a42a-48bdcb8c5bed" path="/var/lib/kubelet/pods/d13b2965-2e64-4e8d-a42a-48bdcb8c5bed/volumes" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.119520 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74cba1c-b50e-4d6c-9d24-6ff44ce942e6" path="/var/lib/kubelet/pods/f74cba1c-b50e-4d6c-9d24-6ff44ce942e6/volumes" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.182185 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/212bbca4-059a-419a-acf3-984c087a0f5d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.230911 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.233943 4947 generic.go:334] "Generic (PLEG): container finished" podID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerID="a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c" exitCode=0 Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.234004 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ec54e67-38f1-44ed-a365-932f4103b99a","Type":"ContainerDied","Data":"a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c"} Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.234033 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5ec54e67-38f1-44ed-a365-932f4103b99a","Type":"ContainerDied","Data":"5d2b02edd8e66701da350262b8f2091071676a08962c089087c1dbc0787cbdc4"} Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.234051 4947 scope.go:117] "RemoveContainer" containerID="a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.245728 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-0" event={"ID":"acc4c853-250e-435c-9c44-3d026fd5e8ca","Type":"ContainerStarted","Data":"1bbe6fc90c6979790c340d13fa03edb3d0d398f5b0b6a5e0057088bd7ff8f322"} Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.245777 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-conductor-0" event={"ID":"acc4c853-250e-435c-9c44-3d026fd5e8ca","Type":"ContainerStarted","Data":"c29822ce143886a49af8ad6a15fb50f14ee6b0c6d2cb8b028fdac44d5aaa58d9"} Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.246722 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.264572 4947 generic.go:334] "Generic (PLEG): container finished" 
podID="212bbca4-059a-419a-acf3-984c087a0f5d" containerID="f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73" exitCode=0 Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.264668 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"212bbca4-059a-419a-acf3-984c087a0f5d","Type":"ContainerDied","Data":"f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73"} Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.264695 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"212bbca4-059a-419a-acf3-984c087a0f5d","Type":"ContainerDied","Data":"d2ba4f8abd0004131b977317ba7ba0c141d38ba80b456bf010f2b78d5e34568b"} Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.264773 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.285679 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-config-data\") pod \"5ec54e67-38f1-44ed-a365-932f4103b99a\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.285756 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-combined-ca-bundle\") pod \"5ec54e67-38f1-44ed-a365-932f4103b99a\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.285780 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec54e67-38f1-44ed-a365-932f4103b99a-logs\") pod \"5ec54e67-38f1-44ed-a365-932f4103b99a\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.285811 
4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x495x\" (UniqueName: \"kubernetes.io/projected/5ec54e67-38f1-44ed-a365-932f4103b99a-kube-api-access-x495x\") pod \"5ec54e67-38f1-44ed-a365-932f4103b99a\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.288125 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ec54e67-38f1-44ed-a365-932f4103b99a-logs" (OuterVolumeSpecName: "logs") pod "5ec54e67-38f1-44ed-a365-932f4103b99a" (UID: "5ec54e67-38f1-44ed-a365-932f4103b99a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.305130 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec54e67-38f1-44ed-a365-932f4103b99a-kube-api-access-x495x" (OuterVolumeSpecName: "kube-api-access-x495x") pod "5ec54e67-38f1-44ed-a365-932f4103b99a" (UID: "5ec54e67-38f1-44ed-a365-932f4103b99a"). InnerVolumeSpecName "kube-api-access-x495x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.318278 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d59dc211-4f20-4e7b-8e31-a387f4e4467d","Type":"ContainerStarted","Data":"0846d5fb659baafe96a64396596ece1b30859c5b4b63238d80068393148bab5e"} Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.318308 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d59dc211-4f20-4e7b-8e31-a387f4e4467d","Type":"ContainerStarted","Data":"c9fbfc6e818ab14fcf97810b8121ad8b7f051b5a4846f61fc189ee2b477ae314"} Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.319632 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell2-conductor-0" podStartSLOduration=2.319483216 podStartE2EDuration="2.319483216s" podCreationTimestamp="2025-12-03 09:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:50:13.272062084 +0000 UTC m=+10874.533016530" watchObservedRunningTime="2025-12-03 09:50:13.319483216 +0000 UTC m=+10874.580437642" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.343002 4947 scope.go:117] "RemoveContainer" containerID="a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d" Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:13.354905 4947 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-combined-ca-bundle podName:5ec54e67-38f1-44ed-a365-932f4103b99a nodeName:}" failed. No retries permitted until 2025-12-03 09:50:13.854873451 +0000 UTC m=+10875.115827887 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-combined-ca-bundle") pod "5ec54e67-38f1-44ed-a365-932f4103b99a" (UID: "5ec54e67-38f1-44ed-a365-932f4103b99a") : error deleting /var/lib/kubelet/pods/5ec54e67-38f1-44ed-a365-932f4103b99a/volume-subpaths: remove /var/lib/kubelet/pods/5ec54e67-38f1-44ed-a365-932f4103b99a/volume-subpaths: no such file or directory Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.363061 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.364265 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-config-data" (OuterVolumeSpecName: "config-data") pod "5ec54e67-38f1-44ed-a365-932f4103b99a" (UID: "5ec54e67-38f1-44ed-a365-932f4103b99a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.382638 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.392503 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.392534 4947 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5ec54e67-38f1-44ed-a365-932f4103b99a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.392546 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x495x\" (UniqueName: \"kubernetes.io/projected/5ec54e67-38f1-44ed-a365-932f4103b99a-kube-api-access-x495x\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:14 crc 
kubenswrapper[4947]: I1203 09:50:13.401601 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:13.402165 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-api" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.402182 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-api" Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:13.402202 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-log" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.402230 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-log" Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:13.402247 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-log" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.402257 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-log" Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:13.402272 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-metadata" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.402281 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-metadata" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.402576 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-metadata" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.402611 4947 
memory_manager.go:354] "RemoveStaleState removing state" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-api" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.402631 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" containerName="nova-api-log" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.402640 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" containerName="nova-metadata-log" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.404120 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.407788 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.434377 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.453981 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.453957338 podStartE2EDuration="2.453957338s" podCreationTimestamp="2025-12-03 09:50:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:50:13.358420528 +0000 UTC m=+10874.619374964" watchObservedRunningTime="2025-12-03 09:50:13.453957338 +0000 UTC m=+10874.714911764" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.494084 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b901c287-ea30-475d-8f6b-96ccd6463604-logs\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.494224 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b901c287-ea30-475d-8f6b-96ccd6463604-config-data\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.494294 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b901c287-ea30-475d-8f6b-96ccd6463604-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.494323 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74h7f\" (UniqueName: \"kubernetes.io/projected/b901c287-ea30-475d-8f6b-96ccd6463604-kube-api-access-74h7f\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.596829 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b901c287-ea30-475d-8f6b-96ccd6463604-config-data\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.596934 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b901c287-ea30-475d-8f6b-96ccd6463604-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.596973 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74h7f\" (UniqueName: 
\"kubernetes.io/projected/b901c287-ea30-475d-8f6b-96ccd6463604-kube-api-access-74h7f\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.597034 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b901c287-ea30-475d-8f6b-96ccd6463604-logs\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.597570 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b901c287-ea30-475d-8f6b-96ccd6463604-logs\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.602240 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b901c287-ea30-475d-8f6b-96ccd6463604-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.612213 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b901c287-ea30-475d-8f6b-96ccd6463604-config-data\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.613086 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74h7f\" (UniqueName: \"kubernetes.io/projected/b901c287-ea30-475d-8f6b-96ccd6463604-kube-api-access-74h7f\") pod \"nova-api-0\" (UID: \"b901c287-ea30-475d-8f6b-96ccd6463604\") " pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.642798 4947 scope.go:117] "RemoveContainer" 
containerID="a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c" Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:13.643431 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c\": container with ID starting with a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c not found: ID does not exist" containerID="a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.643479 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c"} err="failed to get container status \"a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c\": rpc error: code = NotFound desc = could not find container \"a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c\": container with ID starting with a6a14598a9a430b059b6bb767b2c556bf9dd1ecac8031967c832ce57e755a01c not found: ID does not exist" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.643565 4947 scope.go:117] "RemoveContainer" containerID="a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d" Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:13.643911 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d\": container with ID starting with a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d not found: ID does not exist" containerID="a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.643939 4947 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d"} err="failed to get container status \"a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d\": rpc error: code = NotFound desc = could not find container \"a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d\": container with ID starting with a5be49daac165ff5b3ff58f7b5517975355ca5ef2665d85ed8ba0f6b4120f17d not found: ID does not exist" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.643956 4947 scope.go:117] "RemoveContainer" containerID="f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.715956 4947 scope.go:117] "RemoveContainer" containerID="55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.729353 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.780612 4947 scope.go:117] "RemoveContainer" containerID="f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73" Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:13.783690 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73\": container with ID starting with f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73 not found: ID does not exist" containerID="f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.783726 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73"} err="failed to get container status \"f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73\": rpc error: code = NotFound desc 
= could not find container \"f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73\": container with ID starting with f62a978e407c8be568bb8ba0315a933f9db91e10ae67b77b343c26e3078bdc73 not found: ID does not exist" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.783755 4947 scope.go:117] "RemoveContainer" containerID="55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573" Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:13.787189 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573\": container with ID starting with 55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573 not found: ID does not exist" containerID="55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.787209 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573"} err="failed to get container status \"55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573\": rpc error: code = NotFound desc = could not find container \"55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573\": container with ID starting with 55f873951fc20225fdbca2541d02dc940f188c8a029f55995d2552a261b16573 not found: ID does not exist" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.904613 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-combined-ca-bundle\") pod \"5ec54e67-38f1-44ed-a365-932f4103b99a\" (UID: \"5ec54e67-38f1-44ed-a365-932f4103b99a\") " Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:13.914827 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ec54e67-38f1-44ed-a365-932f4103b99a" (UID: "5ec54e67-38f1-44ed-a365-932f4103b99a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.016012 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ec54e67-38f1-44ed-a365-932f4103b99a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:14.160503 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:14.161927 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:14.163291 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 09:50:14 crc kubenswrapper[4947]: E1203 09:50:14.163337 4947 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-cell3-conductor-0" podUID="4f3cdf1c-630c-4944-ae58-c426e0d2161a" containerName="nova-cell3-conductor-conductor" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.352785 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.419535 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.435620 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.482157 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.491225 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.493845 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.521082 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.530110 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f498eea-ba21-4c7e-8fc0-761ebb08a859-logs\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.530581 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f498eea-ba21-4c7e-8fc0-761ebb08a859-config-data\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " 
pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.530608 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqr5s\" (UniqueName: \"kubernetes.io/projected/6f498eea-ba21-4c7e-8fc0-761ebb08a859-kube-api-access-jqr5s\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.530727 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f498eea-ba21-4c7e-8fc0-761ebb08a859-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.632218 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f498eea-ba21-4c7e-8fc0-761ebb08a859-config-data\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.632256 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqr5s\" (UniqueName: \"kubernetes.io/projected/6f498eea-ba21-4c7e-8fc0-761ebb08a859-kube-api-access-jqr5s\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.632325 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f498eea-ba21-4c7e-8fc0-761ebb08a859-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.632420 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f498eea-ba21-4c7e-8fc0-761ebb08a859-logs\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.634009 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f498eea-ba21-4c7e-8fc0-761ebb08a859-logs\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.645119 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f498eea-ba21-4c7e-8fc0-761ebb08a859-config-data\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.646144 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f498eea-ba21-4c7e-8fc0-761ebb08a859-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.658006 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqr5s\" (UniqueName: \"kubernetes.io/projected/6f498eea-ba21-4c7e-8fc0-761ebb08a859-kube-api-access-jqr5s\") pod \"nova-metadata-0\" (UID: \"6f498eea-ba21-4c7e-8fc0-761ebb08a859\") " pod="openstack/nova-metadata-0" Dec 03 09:50:14 crc kubenswrapper[4947]: I1203 09:50:14.859301 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 09:50:15 crc kubenswrapper[4947]: I1203 09:50:15.063188 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 09:50:15 crc kubenswrapper[4947]: W1203 09:50:15.067054 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb901c287_ea30_475d_8f6b_96ccd6463604.slice/crio-bb099a18f591c8cc36d0f3f38b741f28e0e064c21dc1aaedbfbf06698f05655b WatchSource:0}: Error finding container bb099a18f591c8cc36d0f3f38b741f28e0e064c21dc1aaedbfbf06698f05655b: Status 404 returned error can't find the container with id bb099a18f591c8cc36d0f3f38b741f28e0e064c21dc1aaedbfbf06698f05655b Dec 03 09:50:15 crc kubenswrapper[4947]: I1203 09:50:15.102359 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="212bbca4-059a-419a-acf3-984c087a0f5d" path="/var/lib/kubelet/pods/212bbca4-059a-419a-acf3-984c087a0f5d/volumes" Dec 03 09:50:15 crc kubenswrapper[4947]: I1203 09:50:15.103303 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec54e67-38f1-44ed-a365-932f4103b99a" path="/var/lib/kubelet/pods/5ec54e67-38f1-44ed-a365-932f4103b99a/volumes" Dec 03 09:50:15 crc kubenswrapper[4947]: I1203 09:50:15.385849 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 09:50:15 crc kubenswrapper[4947]: I1203 09:50:15.395935 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b901c287-ea30-475d-8f6b-96ccd6463604","Type":"ContainerStarted","Data":"33cafa8bc30c7c5ea65294b0c3222b7f47197ec95cf4f8cb3d8505161bed71c5"} Dec 03 09:50:15 crc kubenswrapper[4947]: I1203 09:50:15.395976 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b901c287-ea30-475d-8f6b-96ccd6463604","Type":"ContainerStarted","Data":"bb099a18f591c8cc36d0f3f38b741f28e0e064c21dc1aaedbfbf06698f05655b"} Dec 03 
09:50:16 crc kubenswrapper[4947]: I1203 09:50:16.409765 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f498eea-ba21-4c7e-8fc0-761ebb08a859","Type":"ContainerStarted","Data":"99019411efb557702c07d5d70945028c2dee8bd36691ad53306aae0731af8040"} Dec 03 09:50:16 crc kubenswrapper[4947]: I1203 09:50:16.410100 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f498eea-ba21-4c7e-8fc0-761ebb08a859","Type":"ContainerStarted","Data":"e1e937ffd3c1ecbd872db2ce979ea6494b01e9bdf264478f18392f88409af7c7"} Dec 03 09:50:16 crc kubenswrapper[4947]: I1203 09:50:16.410115 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6f498eea-ba21-4c7e-8fc0-761ebb08a859","Type":"ContainerStarted","Data":"3c3c224a26ab7008991e1c872dc48c272e08733b5476ed293d582199f603e92d"} Dec 03 09:50:16 crc kubenswrapper[4947]: I1203 09:50:16.418469 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b901c287-ea30-475d-8f6b-96ccd6463604","Type":"ContainerStarted","Data":"cd6d3e91e804eacc29bdd7faa517e80c6ce65200621b6a0de0a536077114a7c9"} Dec 03 09:50:16 crc kubenswrapper[4947]: I1203 09:50:16.440313 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.440282604 podStartE2EDuration="2.440282604s" podCreationTimestamp="2025-12-03 09:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:50:16.433625704 +0000 UTC m=+10877.694580120" watchObservedRunningTime="2025-12-03 09:50:16.440282604 +0000 UTC m=+10877.701237030" Dec 03 09:50:16 crc kubenswrapper[4947]: I1203 09:50:16.469220 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.469200475 podStartE2EDuration="3.469200475s" 
podCreationTimestamp="2025-12-03 09:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:50:16.458024663 +0000 UTC m=+10877.718979089" watchObservedRunningTime="2025-12-03 09:50:16.469200475 +0000 UTC m=+10877.730154901" Dec 03 09:50:16 crc kubenswrapper[4947]: I1203 09:50:16.662825 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 09:50:19 crc kubenswrapper[4947]: E1203 09:50:19.158585 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b is running failed: container process not found" containerID="021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 09:50:19 crc kubenswrapper[4947]: E1203 09:50:19.159530 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b is running failed: container process not found" containerID="021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 09:50:19 crc kubenswrapper[4947]: E1203 09:50:19.160390 4947 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b is running failed: container process not found" containerID="021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 09:50:19 crc kubenswrapper[4947]: E1203 09:50:19.160439 4947 prober.go:104] "Probe errored" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of 021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell3-conductor-0" podUID="4f3cdf1c-630c-4944-ae58-c426e0d2161a" containerName="nova-cell3-conductor-conductor" Dec 03 09:50:19 crc kubenswrapper[4947]: I1203 09:50:19.480548 4947 generic.go:334] "Generic (PLEG): container finished" podID="4f3cdf1c-630c-4944-ae58-c426e0d2161a" containerID="021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b" exitCode=0 Dec 03 09:50:19 crc kubenswrapper[4947]: I1203 09:50:19.480600 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-0" event={"ID":"4f3cdf1c-630c-4944-ae58-c426e0d2161a","Type":"ContainerDied","Data":"021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b"} Dec 03 09:50:19 crc kubenswrapper[4947]: I1203 09:50:19.860351 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 09:50:19 crc kubenswrapper[4947]: I1203 09:50:19.860695 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 09:50:19 crc kubenswrapper[4947]: I1203 09:50:19.876983 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:19 crc kubenswrapper[4947]: I1203 09:50:19.966918 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-combined-ca-bundle\") pod \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " Dec 03 09:50:19 crc kubenswrapper[4947]: I1203 09:50:19.967052 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-config-data\") pod \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " Dec 03 09:50:19 crc kubenswrapper[4947]: I1203 09:50:19.967210 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9gxh\" (UniqueName: \"kubernetes.io/projected/4f3cdf1c-630c-4944-ae58-c426e0d2161a-kube-api-access-w9gxh\") pod \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\" (UID: \"4f3cdf1c-630c-4944-ae58-c426e0d2161a\") " Dec 03 09:50:19 crc kubenswrapper[4947]: I1203 09:50:19.972293 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3cdf1c-630c-4944-ae58-c426e0d2161a-kube-api-access-w9gxh" (OuterVolumeSpecName: "kube-api-access-w9gxh") pod "4f3cdf1c-630c-4944-ae58-c426e0d2161a" (UID: "4f3cdf1c-630c-4944-ae58-c426e0d2161a"). InnerVolumeSpecName "kube-api-access-w9gxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.004423 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f3cdf1c-630c-4944-ae58-c426e0d2161a" (UID: "4f3cdf1c-630c-4944-ae58-c426e0d2161a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.005308 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-config-data" (OuterVolumeSpecName: "config-data") pod "4f3cdf1c-630c-4944-ae58-c426e0d2161a" (UID: "4f3cdf1c-630c-4944-ae58-c426e0d2161a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.070055 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.070088 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f3cdf1c-630c-4944-ae58-c426e0d2161a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.070101 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9gxh\" (UniqueName: \"kubernetes.io/projected/4f3cdf1c-630c-4944-ae58-c426e0d2161a-kube-api-access-w9gxh\") on node \"crc\" DevicePath \"\"" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.492983 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-0" event={"ID":"4f3cdf1c-630c-4944-ae58-c426e0d2161a","Type":"ContainerDied","Data":"ef57f8bd5b76e4daaca6be28ec90ba64ff112e583418f7fa187d77f61de3b0ed"} Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.493041 4947 scope.go:117] "RemoveContainer" containerID="021742ed4b31b0cdb1f94b3211a4216b98b34188eb48343aaf531734f0ecc45b" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.493116 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.532636 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell3-conductor-0"] Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.552550 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell3-conductor-0"] Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.565677 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell3-conductor-0"] Dec 03 09:50:20 crc kubenswrapper[4947]: E1203 09:50:20.566314 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3cdf1c-630c-4944-ae58-c426e0d2161a" containerName="nova-cell3-conductor-conductor" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.566339 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3cdf1c-630c-4944-ae58-c426e0d2161a" containerName="nova-cell3-conductor-conductor" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.566647 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3cdf1c-630c-4944-ae58-c426e0d2161a" containerName="nova-cell3-conductor-conductor" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.567707 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.570044 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell3-conductor-config-data" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.575465 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-conductor-0"] Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.652210 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.683973 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765d2e96-579a-4b09-8df0-eee649a5620b-combined-ca-bundle\") pod \"nova-cell3-conductor-0\" (UID: \"765d2e96-579a-4b09-8df0-eee649a5620b\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.684028 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt77f\" (UniqueName: \"kubernetes.io/projected/765d2e96-579a-4b09-8df0-eee649a5620b-kube-api-access-kt77f\") pod \"nova-cell3-conductor-0\" (UID: \"765d2e96-579a-4b09-8df0-eee649a5620b\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.684245 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765d2e96-579a-4b09-8df0-eee649a5620b-config-data\") pod \"nova-cell3-conductor-0\" (UID: \"765d2e96-579a-4b09-8df0-eee649a5620b\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.786233 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/765d2e96-579a-4b09-8df0-eee649a5620b-config-data\") pod \"nova-cell3-conductor-0\" (UID: \"765d2e96-579a-4b09-8df0-eee649a5620b\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.786362 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765d2e96-579a-4b09-8df0-eee649a5620b-combined-ca-bundle\") pod \"nova-cell3-conductor-0\" (UID: \"765d2e96-579a-4b09-8df0-eee649a5620b\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.787184 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt77f\" (UniqueName: \"kubernetes.io/projected/765d2e96-579a-4b09-8df0-eee649a5620b-kube-api-access-kt77f\") pod \"nova-cell3-conductor-0\" (UID: \"765d2e96-579a-4b09-8df0-eee649a5620b\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.792707 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/765d2e96-579a-4b09-8df0-eee649a5620b-combined-ca-bundle\") pod \"nova-cell3-conductor-0\" (UID: \"765d2e96-579a-4b09-8df0-eee649a5620b\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.793904 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/765d2e96-579a-4b09-8df0-eee649a5620b-config-data\") pod \"nova-cell3-conductor-0\" (UID: \"765d2e96-579a-4b09-8df0-eee649a5620b\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.809256 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt77f\" (UniqueName: \"kubernetes.io/projected/765d2e96-579a-4b09-8df0-eee649a5620b-kube-api-access-kt77f\") pod \"nova-cell3-conductor-0\" 
(UID: \"765d2e96-579a-4b09-8df0-eee649a5620b\") " pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.862295 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 09:50:20 crc kubenswrapper[4947]: I1203 09:50:20.898871 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:21 crc kubenswrapper[4947]: I1203 09:50:21.105440 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3cdf1c-630c-4944-ae58-c426e0d2161a" path="/var/lib/kubelet/pods/4f3cdf1c-630c-4944-ae58-c426e0d2161a/volumes" Dec 03 09:50:21 crc kubenswrapper[4947]: W1203 09:50:21.393698 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod765d2e96_579a_4b09_8df0_eee649a5620b.slice/crio-da85efb6fe8fb4833a8c8ae37861c1be16ce5492f72799484f2515174bce64b0 WatchSource:0}: Error finding container da85efb6fe8fb4833a8c8ae37861c1be16ce5492f72799484f2515174bce64b0: Status 404 returned error can't find the container with id da85efb6fe8fb4833a8c8ae37861c1be16ce5492f72799484f2515174bce64b0 Dec 03 09:50:21 crc kubenswrapper[4947]: I1203 09:50:21.397440 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell3-conductor-0"] Dec 03 09:50:21 crc kubenswrapper[4947]: I1203 09:50:21.516647 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-0" event={"ID":"765d2e96-579a-4b09-8df0-eee649a5620b","Type":"ContainerStarted","Data":"da85efb6fe8fb4833a8c8ae37861c1be16ce5492f72799484f2515174bce64b0"} Dec 03 09:50:21 crc kubenswrapper[4947]: I1203 09:50:21.611164 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell2-conductor-0" Dec 03 09:50:21 crc kubenswrapper[4947]: I1203 09:50:21.663789 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 09:50:21 crc kubenswrapper[4947]: I1203 09:50:21.702237 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 09:50:22 crc kubenswrapper[4947]: I1203 09:50:22.530650 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell3-conductor-0" event={"ID":"765d2e96-579a-4b09-8df0-eee649a5620b","Type":"ContainerStarted","Data":"3099b80996b207291871849e712f679b69d3fd0229423500649229eca29f2795"} Dec 03 09:50:22 crc kubenswrapper[4947]: I1203 09:50:22.566081 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 09:50:22 crc kubenswrapper[4947]: I1203 09:50:22.572780 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell3-conductor-0" podStartSLOduration=2.572759032 podStartE2EDuration="2.572759032s" podCreationTimestamp="2025-12-03 09:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:50:22.546757129 +0000 UTC m=+10883.807711565" watchObservedRunningTime="2025-12-03 09:50:22.572759032 +0000 UTC m=+10883.833713458" Dec 03 09:50:23 crc kubenswrapper[4947]: I1203 09:50:23.540437 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:23 crc kubenswrapper[4947]: I1203 09:50:23.731948 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 09:50:23 crc kubenswrapper[4947]: I1203 09:50:23.731989 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 09:50:24 crc kubenswrapper[4947]: I1203 09:50:24.814935 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b901c287-ea30-475d-8f6b-96ccd6463604" containerName="nova-api-api" 
probeResult="failure" output="Get \"http://10.217.1.234:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:50:24 crc kubenswrapper[4947]: I1203 09:50:24.815130 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b901c287-ea30-475d-8f6b-96ccd6463604" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.234:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:50:24 crc kubenswrapper[4947]: I1203 09:50:24.859891 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 09:50:24 crc kubenswrapper[4947]: I1203 09:50:24.859971 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 09:50:25 crc kubenswrapper[4947]: I1203 09:50:25.942717 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6f498eea-ba21-4c7e-8fc0-761ebb08a859" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.235:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:50:25 crc kubenswrapper[4947]: I1203 09:50:25.942889 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6f498eea-ba21-4c7e-8fc0-761ebb08a859" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.235:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 09:50:30 crc kubenswrapper[4947]: I1203 09:50:30.942661 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell3-conductor-0" Dec 03 09:50:31 crc kubenswrapper[4947]: I1203 09:50:31.224367 4947 scope.go:117] "RemoveContainer" containerID="2a9eb8d59aac80d11abeeb6f04e66b2aefa7cae7495c7ba476fbe71c58f7878c" Dec 03 09:50:33 crc kubenswrapper[4947]: I1203 
09:50:33.734556 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 09:50:33 crc kubenswrapper[4947]: I1203 09:50:33.735326 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 09:50:33 crc kubenswrapper[4947]: I1203 09:50:33.738386 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 09:50:33 crc kubenswrapper[4947]: I1203 09:50:33.739944 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 09:50:34 crc kubenswrapper[4947]: I1203 09:50:34.653050 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 09:50:34 crc kubenswrapper[4947]: I1203 09:50:34.661066 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 09:50:34 crc kubenswrapper[4947]: I1203 09:50:34.875363 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 09:50:34 crc kubenswrapper[4947]: I1203 09:50:34.878565 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 09:50:34 crc kubenswrapper[4947]: I1203 09:50:34.878942 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 09:50:35 crc kubenswrapper[4947]: I1203 09:50:35.665010 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.407096 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g5vl5"] Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.410725 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.424815 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5vl5"] Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.512330 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-utilities\") pod \"redhat-operators-g5vl5\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.512392 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-catalog-content\") pod \"redhat-operators-g5vl5\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.512434 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vcgh\" (UniqueName: \"kubernetes.io/projected/bbf40dba-94f8-4c00-a5d0-afbca197b583-kube-api-access-2vcgh\") pod \"redhat-operators-g5vl5\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.615228 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-catalog-content\") pod \"redhat-operators-g5vl5\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.615429 4947 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2vcgh\" (UniqueName: \"kubernetes.io/projected/bbf40dba-94f8-4c00-a5d0-afbca197b583-kube-api-access-2vcgh\") pod \"redhat-operators-g5vl5\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.615712 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-catalog-content\") pod \"redhat-operators-g5vl5\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.616022 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-utilities\") pod \"redhat-operators-g5vl5\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.616461 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-utilities\") pod \"redhat-operators-g5vl5\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.635213 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vcgh\" (UniqueName: \"kubernetes.io/projected/bbf40dba-94f8-4c00-a5d0-afbca197b583-kube-api-access-2vcgh\") pod \"redhat-operators-g5vl5\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:16 crc kubenswrapper[4947]: I1203 09:51:16.730753 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:17 crc kubenswrapper[4947]: I1203 09:51:17.296766 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g5vl5"] Dec 03 09:51:18 crc kubenswrapper[4947]: I1203 09:51:18.136253 4947 generic.go:334] "Generic (PLEG): container finished" podID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerID="1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4" exitCode=0 Dec 03 09:51:18 crc kubenswrapper[4947]: I1203 09:51:18.136611 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vl5" event={"ID":"bbf40dba-94f8-4c00-a5d0-afbca197b583","Type":"ContainerDied","Data":"1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4"} Dec 03 09:51:18 crc kubenswrapper[4947]: I1203 09:51:18.136641 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vl5" event={"ID":"bbf40dba-94f8-4c00-a5d0-afbca197b583","Type":"ContainerStarted","Data":"7acaf9b774ce60bcb793f937c37d9a4024f92a0d359ee81e75875358bd403646"} Dec 03 09:51:19 crc kubenswrapper[4947]: I1203 09:51:19.147386 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vl5" event={"ID":"bbf40dba-94f8-4c00-a5d0-afbca197b583","Type":"ContainerStarted","Data":"7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5"} Dec 03 09:51:23 crc kubenswrapper[4947]: I1203 09:51:23.198388 4947 generic.go:334] "Generic (PLEG): container finished" podID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerID="7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5" exitCode=0 Dec 03 09:51:23 crc kubenswrapper[4947]: I1203 09:51:23.198436 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vl5" 
event={"ID":"bbf40dba-94f8-4c00-a5d0-afbca197b583","Type":"ContainerDied","Data":"7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5"} Dec 03 09:51:24 crc kubenswrapper[4947]: I1203 09:51:24.213824 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vl5" event={"ID":"bbf40dba-94f8-4c00-a5d0-afbca197b583","Type":"ContainerStarted","Data":"24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c"} Dec 03 09:51:26 crc kubenswrapper[4947]: I1203 09:51:26.730881 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:26 crc kubenswrapper[4947]: I1203 09:51:26.731194 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:27 crc kubenswrapper[4947]: I1203 09:51:27.801668 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g5vl5" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerName="registry-server" probeResult="failure" output=< Dec 03 09:51:27 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 09:51:27 crc kubenswrapper[4947]: > Dec 03 09:51:31 crc kubenswrapper[4947]: I1203 09:51:31.493547 4947 scope.go:117] "RemoveContainer" containerID="6ec727023ab23f4b714b65304291565d8de6f14ee5edb9cb52e032b6da30fb9e" Dec 03 09:51:31 crc kubenswrapper[4947]: I1203 09:51:31.538061 4947 scope.go:117] "RemoveContainer" containerID="a85d66a743f381095b0ae31240f85b518bbc246fe4d4d997728edd6a305a763a" Dec 03 09:51:31 crc kubenswrapper[4947]: I1203 09:51:31.593095 4947 scope.go:117] "RemoveContainer" containerID="fc3a8a68d3fec24f6b0b174ad0c20dd4fd9c8cfd26a2f497097e70fc331b408c" Dec 03 09:51:36 crc kubenswrapper[4947]: I1203 09:51:36.814688 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:36 
crc kubenswrapper[4947]: I1203 09:51:36.842751 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g5vl5" podStartSLOduration=15.310580138 podStartE2EDuration="20.842733919s" podCreationTimestamp="2025-12-03 09:51:16 +0000 UTC" firstStartedPulling="2025-12-03 09:51:18.13899208 +0000 UTC m=+10939.399946516" lastFinishedPulling="2025-12-03 09:51:23.671145871 +0000 UTC m=+10944.932100297" observedRunningTime="2025-12-03 09:51:24.26296989 +0000 UTC m=+10945.523924316" watchObservedRunningTime="2025-12-03 09:51:36.842733919 +0000 UTC m=+10958.103688345" Dec 03 09:51:36 crc kubenswrapper[4947]: I1203 09:51:36.889101 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:37 crc kubenswrapper[4947]: I1203 09:51:37.061551 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g5vl5"] Dec 03 09:51:38 crc kubenswrapper[4947]: I1203 09:51:38.700400 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g5vl5" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerName="registry-server" containerID="cri-o://24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c" gracePeriod=2 Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.310690 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.321681 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-utilities\") pod \"bbf40dba-94f8-4c00-a5d0-afbca197b583\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.321984 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vcgh\" (UniqueName: \"kubernetes.io/projected/bbf40dba-94f8-4c00-a5d0-afbca197b583-kube-api-access-2vcgh\") pod \"bbf40dba-94f8-4c00-a5d0-afbca197b583\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.322062 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-catalog-content\") pod \"bbf40dba-94f8-4c00-a5d0-afbca197b583\" (UID: \"bbf40dba-94f8-4c00-a5d0-afbca197b583\") " Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.322793 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-utilities" (OuterVolumeSpecName: "utilities") pod "bbf40dba-94f8-4c00-a5d0-afbca197b583" (UID: "bbf40dba-94f8-4c00-a5d0-afbca197b583"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.327881 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf40dba-94f8-4c00-a5d0-afbca197b583-kube-api-access-2vcgh" (OuterVolumeSpecName: "kube-api-access-2vcgh") pod "bbf40dba-94f8-4c00-a5d0-afbca197b583" (UID: "bbf40dba-94f8-4c00-a5d0-afbca197b583"). InnerVolumeSpecName "kube-api-access-2vcgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.424272 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vcgh\" (UniqueName: \"kubernetes.io/projected/bbf40dba-94f8-4c00-a5d0-afbca197b583-kube-api-access-2vcgh\") on node \"crc\" DevicePath \"\"" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.424480 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.456987 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bbf40dba-94f8-4c00-a5d0-afbca197b583" (UID: "bbf40dba-94f8-4c00-a5d0-afbca197b583"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.525606 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bbf40dba-94f8-4c00-a5d0-afbca197b583-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.711143 4947 generic.go:334] "Generic (PLEG): container finished" podID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerID="24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c" exitCode=0 Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.711215 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g5vl5" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.711216 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vl5" event={"ID":"bbf40dba-94f8-4c00-a5d0-afbca197b583","Type":"ContainerDied","Data":"24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c"} Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.712608 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g5vl5" event={"ID":"bbf40dba-94f8-4c00-a5d0-afbca197b583","Type":"ContainerDied","Data":"7acaf9b774ce60bcb793f937c37d9a4024f92a0d359ee81e75875358bd403646"} Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.712641 4947 scope.go:117] "RemoveContainer" containerID="24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.758383 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g5vl5"] Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.759698 4947 scope.go:117] "RemoveContainer" containerID="7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.768271 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g5vl5"] Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.802013 4947 scope.go:117] "RemoveContainer" containerID="1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.873117 4947 scope.go:117] "RemoveContainer" containerID="24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c" Dec 03 09:51:39 crc kubenswrapper[4947]: E1203 09:51:39.874827 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c\": container with ID starting with 24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c not found: ID does not exist" containerID="24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.874881 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c"} err="failed to get container status \"24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c\": rpc error: code = NotFound desc = could not find container \"24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c\": container with ID starting with 24cbdefd248a8d1c8fd28c42466bac6abf5b38f244a6d5f1091fa1419d8f1a6c not found: ID does not exist" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.874915 4947 scope.go:117] "RemoveContainer" containerID="7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5" Dec 03 09:51:39 crc kubenswrapper[4947]: E1203 09:51:39.875461 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5\": container with ID starting with 7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5 not found: ID does not exist" containerID="7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.875571 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5"} err="failed to get container status \"7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5\": rpc error: code = NotFound desc = could not find container \"7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5\": container with ID 
starting with 7f4b8102b8d7e6f8b11758b07aaed534ca633712469e02e0418690d123eb27a5 not found: ID does not exist" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.875604 4947 scope.go:117] "RemoveContainer" containerID="1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4" Dec 03 09:51:39 crc kubenswrapper[4947]: E1203 09:51:39.875937 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4\": container with ID starting with 1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4 not found: ID does not exist" containerID="1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4" Dec 03 09:51:39 crc kubenswrapper[4947]: I1203 09:51:39.875966 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4"} err="failed to get container status \"1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4\": rpc error: code = NotFound desc = could not find container \"1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4\": container with ID starting with 1d376298fa663b1b00f6472a4eafbfea10f83755b7cbbed52b6c14f554e822e4 not found: ID does not exist" Dec 03 09:51:41 crc kubenswrapper[4947]: I1203 09:51:41.095215 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" path="/var/lib/kubelet/pods/bbf40dba-94f8-4c00-a5d0-afbca197b583/volumes" Dec 03 09:52:00 crc kubenswrapper[4947]: I1203 09:52:00.087034 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:52:00 crc kubenswrapper[4947]: I1203 
09:52:00.087718 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:52:30 crc kubenswrapper[4947]: I1203 09:52:30.086476 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:52:30 crc kubenswrapper[4947]: I1203 09:52:30.087087 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:52:36 crc kubenswrapper[4947]: I1203 09:52:36.388770 4947 generic.go:334] "Generic (PLEG): container finished" podID="2c41415b-4450-464c-bdcf-16da30cc83bb" containerID="a7362928763435d747ec7855af8b8e364d7b0633ac3a0fe3e5a7bd77eda7f394" exitCode=0 Dec 03 09:52:36 crc kubenswrapper[4947]: I1203 09:52:36.388863 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" event={"ID":"2c41415b-4450-464c-bdcf-16da30cc83bb","Type":"ContainerDied","Data":"a7362928763435d747ec7855af8b8e364d7b0633ac3a0fe3e5a7bd77eda7f394"} Dec 03 09:52:37 crc kubenswrapper[4947]: I1203 09:52:37.857435 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.010953 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-ssh-key\") pod \"2c41415b-4450-464c-bdcf-16da30cc83bb\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.010997 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-0\") pod \"2c41415b-4450-464c-bdcf-16da30cc83bb\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.011064 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-1\") pod \"2c41415b-4450-464c-bdcf-16da30cc83bb\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.011098 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-0\") pod \"2c41415b-4450-464c-bdcf-16da30cc83bb\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.011117 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-inventory\") pod \"2c41415b-4450-464c-bdcf-16da30cc83bb\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.011156 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cells-global-config-0\") pod \"2c41415b-4450-464c-bdcf-16da30cc83bb\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.011191 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell2-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-combined-ca-bundle\") pod \"2c41415b-4450-464c-bdcf-16da30cc83bb\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.011213 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-1\") pod \"2c41415b-4450-464c-bdcf-16da30cc83bb\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.011257 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6wjv\" (UniqueName: \"kubernetes.io/projected/2c41415b-4450-464c-bdcf-16da30cc83bb-kube-api-access-s6wjv\") pod \"2c41415b-4450-464c-bdcf-16da30cc83bb\" (UID: \"2c41415b-4450-464c-bdcf-16da30cc83bb\") " Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.020551 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c41415b-4450-464c-bdcf-16da30cc83bb-kube-api-access-s6wjv" (OuterVolumeSpecName: "kube-api-access-s6wjv") pod "2c41415b-4450-464c-bdcf-16da30cc83bb" (UID: "2c41415b-4450-464c-bdcf-16da30cc83bb"). InnerVolumeSpecName "kube-api-access-s6wjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.021031 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell2-combined-ca-bundle") pod "2c41415b-4450-464c-bdcf-16da30cc83bb" (UID: "2c41415b-4450-464c-bdcf-16da30cc83bb"). InnerVolumeSpecName "nova-cell2-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.042014 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "2c41415b-4450-464c-bdcf-16da30cc83bb" (UID: "2c41415b-4450-464c-bdcf-16da30cc83bb"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.046761 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "2c41415b-4450-464c-bdcf-16da30cc83bb" (UID: "2c41415b-4450-464c-bdcf-16da30cc83bb"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.047115 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2c41415b-4450-464c-bdcf-16da30cc83bb" (UID: "2c41415b-4450-464c-bdcf-16da30cc83bb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.048939 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "2c41415b-4450-464c-bdcf-16da30cc83bb" (UID: "2c41415b-4450-464c-bdcf-16da30cc83bb"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.049116 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-1" (OuterVolumeSpecName: "nova-cell2-compute-config-1") pod "2c41415b-4450-464c-bdcf-16da30cc83bb" (UID: "2c41415b-4450-464c-bdcf-16da30cc83bb"). InnerVolumeSpecName "nova-cell2-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.050922 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-0" (OuterVolumeSpecName: "nova-cell2-compute-config-0") pod "2c41415b-4450-464c-bdcf-16da30cc83bb" (UID: "2c41415b-4450-464c-bdcf-16da30cc83bb"). InnerVolumeSpecName "nova-cell2-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.069825 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-inventory" (OuterVolumeSpecName: "inventory") pod "2c41415b-4450-464c-bdcf-16da30cc83bb" (UID: "2c41415b-4450-464c-bdcf-16da30cc83bb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.113608 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.113634 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell2-compute-config-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.113645 4947 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.113653 4947 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.113683 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.113692 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.113700 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell2-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.113708 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell2-compute-config-1\" (UniqueName: \"kubernetes.io/secret/2c41415b-4450-464c-bdcf-16da30cc83bb-nova-cell2-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.113716 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6wjv\" (UniqueName: \"kubernetes.io/projected/2c41415b-4450-464c-bdcf-16da30cc83bb-kube-api-access-s6wjv\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.416168 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" event={"ID":"2c41415b-4450-464c-bdcf-16da30cc83bb","Type":"ContainerDied","Data":"b111c69a6fcbe1cd2073f5c5484f7fc89da2b576a625bd49c8c4acaa309ada6f"} Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.416229 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b111c69a6fcbe1cd2073f5c5484f7fc89da2b576a625bd49c8c4acaa309ada6f" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.416186 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2" Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.418333 4947 generic.go:334] "Generic (PLEG): container finished" podID="9b2e7f66-e905-447d-b029-e27add784fcc" containerID="c326f48e28a4a1121653d9e837a5fb700b663f5ad2ca915be5b04cea613c290d" exitCode=0 Dec 03 09:52:38 crc kubenswrapper[4947]: I1203 09:52:38.418386 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" event={"ID":"9b2e7f66-e905-447d-b029-e27add784fcc","Type":"ContainerDied","Data":"c326f48e28a4a1121653d9e837a5fb700b663f5ad2ca915be5b04cea613c290d"} Dec 03 09:52:39 crc kubenswrapper[4947]: I1203 09:52:39.910483 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.052671 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-combined-ca-bundle\") pod \"9b2e7f66-e905-447d-b029-e27add784fcc\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.053588 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-0\") pod \"9b2e7f66-e905-447d-b029-e27add784fcc\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.053784 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-1\") pod 
\"9b2e7f66-e905-447d-b029-e27add784fcc\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.053812 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-inventory\") pod \"9b2e7f66-e905-447d-b029-e27add784fcc\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.053846 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-1\") pod \"9b2e7f66-e905-447d-b029-e27add784fcc\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.053868 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-0\") pod \"9b2e7f66-e905-447d-b029-e27add784fcc\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.053892 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-ssh-key\") pod \"9b2e7f66-e905-447d-b029-e27add784fcc\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.053973 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cells-global-config-0\") pod \"9b2e7f66-e905-447d-b029-e27add784fcc\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.054052 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-p5ltm\" (UniqueName: \"kubernetes.io/projected/9b2e7f66-e905-447d-b029-e27add784fcc-kube-api-access-p5ltm\") pod \"9b2e7f66-e905-447d-b029-e27add784fcc\" (UID: \"9b2e7f66-e905-447d-b029-e27add784fcc\") " Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.058704 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "9b2e7f66-e905-447d-b029-e27add784fcc" (UID: "9b2e7f66-e905-447d-b029-e27add784fcc"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.059307 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b2e7f66-e905-447d-b029-e27add784fcc-kube-api-access-p5ltm" (OuterVolumeSpecName: "kube-api-access-p5ltm") pod "9b2e7f66-e905-447d-b029-e27add784fcc" (UID: "9b2e7f66-e905-447d-b029-e27add784fcc"). InnerVolumeSpecName "kube-api-access-p5ltm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.079047 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "9b2e7f66-e905-447d-b029-e27add784fcc" (UID: "9b2e7f66-e905-447d-b029-e27add784fcc"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.086849 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9b2e7f66-e905-447d-b029-e27add784fcc" (UID: "9b2e7f66-e905-447d-b029-e27add784fcc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.088655 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "9b2e7f66-e905-447d-b029-e27add784fcc" (UID: "9b2e7f66-e905-447d-b029-e27add784fcc"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.089030 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "9b2e7f66-e905-447d-b029-e27add784fcc" (UID: "9b2e7f66-e905-447d-b029-e27add784fcc"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.091223 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "9b2e7f66-e905-447d-b029-e27add784fcc" (UID: "9b2e7f66-e905-447d-b029-e27add784fcc"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.097831 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "9b2e7f66-e905-447d-b029-e27add784fcc" (UID: "9b2e7f66-e905-447d-b029-e27add784fcc"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.098184 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-inventory" (OuterVolumeSpecName: "inventory") pod "9b2e7f66-e905-447d-b029-e27add784fcc" (UID: "9b2e7f66-e905-447d-b029-e27add784fcc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.157295 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5ltm\" (UniqueName: \"kubernetes.io/projected/9b2e7f66-e905-447d-b029-e27add784fcc-kube-api-access-p5ltm\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.157330 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.157344 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.157357 4947 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.157370 4947 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.157381 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.157393 4947 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.157404 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9b2e7f66-e905-447d-b029-e27add784fcc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.157415 4947 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/9b2e7f66-e905-447d-b029-e27add784fcc-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.446886 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" event={"ID":"9b2e7f66-e905-447d-b029-e27add784fcc","Type":"ContainerDied","Data":"de2aeebcb7931d45e4c2ffe421d31a2ca0e6898456d4d183ca80aca54a1ab506"} Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.447088 4947 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="de2aeebcb7931d45e4c2ffe421d31a2ca0e6898456d4d183ca80aca54a1ab506" Dec 03 09:52:40 crc kubenswrapper[4947]: I1203 09:52:40.446988 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv" Dec 03 09:53:00 crc kubenswrapper[4947]: I1203 09:53:00.086904 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:53:00 crc kubenswrapper[4947]: I1203 09:53:00.087562 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:53:00 crc kubenswrapper[4947]: I1203 09:53:00.087632 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 09:53:00 crc kubenswrapper[4947]: I1203 09:53:00.088802 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eab5bbfe62ec2d95d68078d7965923cc0022d2b127d4947809beac88e6b7ad80"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:53:00 crc kubenswrapper[4947]: I1203 09:53:00.088896 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" 
containerID="cri-o://eab5bbfe62ec2d95d68078d7965923cc0022d2b127d4947809beac88e6b7ad80" gracePeriod=600 Dec 03 09:53:00 crc kubenswrapper[4947]: I1203 09:53:00.701700 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="eab5bbfe62ec2d95d68078d7965923cc0022d2b127d4947809beac88e6b7ad80" exitCode=0 Dec 03 09:53:00 crc kubenswrapper[4947]: I1203 09:53:00.701793 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"eab5bbfe62ec2d95d68078d7965923cc0022d2b127d4947809beac88e6b7ad80"} Dec 03 09:53:00 crc kubenswrapper[4947]: I1203 09:53:00.702477 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c"} Dec 03 09:53:00 crc kubenswrapper[4947]: I1203 09:53:00.702614 4947 scope.go:117] "RemoveContainer" containerID="c658635ab4eb33d4c1af16bf660c87f44db869c68b483c37348af197f4676fe7" Dec 03 09:54:01 crc kubenswrapper[4947]: I1203 09:54:01.419209 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 09:54:01 crc kubenswrapper[4947]: I1203 09:54:01.419836 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="a213096c-8020-47a7-b5b8-8d18edb3562e" containerName="adoption" containerID="cri-o://a774104fb59861765ccaabd157d8f847500fdc4ded570d2a2ab01b9a6ebf1c08" gracePeriod=30 Dec 03 09:54:31 crc kubenswrapper[4947]: I1203 09:54:31.816894 4947 generic.go:334] "Generic (PLEG): container finished" podID="a213096c-8020-47a7-b5b8-8d18edb3562e" containerID="a774104fb59861765ccaabd157d8f847500fdc4ded570d2a2ab01b9a6ebf1c08" exitCode=137 Dec 03 09:54:31 crc 
kubenswrapper[4947]: I1203 09:54:31.816977 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a213096c-8020-47a7-b5b8-8d18edb3562e","Type":"ContainerDied","Data":"a774104fb59861765ccaabd157d8f847500fdc4ded570d2a2ab01b9a6ebf1c08"} Dec 03 09:54:31 crc kubenswrapper[4947]: I1203 09:54:31.963563 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.075251 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvk7r\" (UniqueName: \"kubernetes.io/projected/a213096c-8020-47a7-b5b8-8d18edb3562e-kube-api-access-cvk7r\") pod \"a213096c-8020-47a7-b5b8-8d18edb3562e\" (UID: \"a213096c-8020-47a7-b5b8-8d18edb3562e\") " Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.076122 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\") pod \"a213096c-8020-47a7-b5b8-8d18edb3562e\" (UID: \"a213096c-8020-47a7-b5b8-8d18edb3562e\") " Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.086480 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a213096c-8020-47a7-b5b8-8d18edb3562e-kube-api-access-cvk7r" (OuterVolumeSpecName: "kube-api-access-cvk7r") pod "a213096c-8020-47a7-b5b8-8d18edb3562e" (UID: "a213096c-8020-47a7-b5b8-8d18edb3562e"). InnerVolumeSpecName "kube-api-access-cvk7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.097672 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7" (OuterVolumeSpecName: "mariadb-data") pod "a213096c-8020-47a7-b5b8-8d18edb3562e" (UID: "a213096c-8020-47a7-b5b8-8d18edb3562e"). InnerVolumeSpecName "pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.178471 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvk7r\" (UniqueName: \"kubernetes.io/projected/a213096c-8020-47a7-b5b8-8d18edb3562e-kube-api-access-cvk7r\") on node \"crc\" DevicePath \"\"" Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.178551 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\") on node \"crc\" " Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.211095 4947 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.211251 4947 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7") on node "crc" Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.280556 4947 reconciler_common.go:293] "Volume detached for volume \"pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-89c0eb3b-9372-4171-abc2-70ebb79f8ff7\") on node \"crc\" DevicePath \"\"" Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.835084 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"a213096c-8020-47a7-b5b8-8d18edb3562e","Type":"ContainerDied","Data":"993cd659883a0a791beb8c481e3da86ce7b53963c3aa78400a85c322886de094"} Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.835155 4947 scope.go:117] "RemoveContainer" containerID="a774104fb59861765ccaabd157d8f847500fdc4ded570d2a2ab01b9a6ebf1c08" Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.835195 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.893689 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 09:54:32 crc kubenswrapper[4947]: I1203 09:54:32.909202 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 09:54:33 crc kubenswrapper[4947]: I1203 09:54:33.102239 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a213096c-8020-47a7-b5b8-8d18edb3562e" path="/var/lib/kubelet/pods/a213096c-8020-47a7-b5b8-8d18edb3562e/volumes" Dec 03 09:54:33 crc kubenswrapper[4947]: I1203 09:54:33.521297 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 03 09:54:33 crc kubenswrapper[4947]: I1203 09:54:33.521654 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="a3a74721-2312-4c7a-aa51-3895521fb264" containerName="adoption" containerID="cri-o://4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943" gracePeriod=30 Dec 03 09:55:00 crc kubenswrapper[4947]: I1203 09:55:00.086000 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:55:00 crc kubenswrapper[4947]: I1203 09:55:00.086533 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.091505 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.151938 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a3a74721-2312-4c7a-aa51-3895521fb264-ovn-data-cert\") pod \"a3a74721-2312-4c7a-aa51-3895521fb264\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.152557 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\") pod \"a3a74721-2312-4c7a-aa51-3895521fb264\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.152618 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dmfp\" (UniqueName: \"kubernetes.io/projected/a3a74721-2312-4c7a-aa51-3895521fb264-kube-api-access-2dmfp\") pod \"a3a74721-2312-4c7a-aa51-3895521fb264\" (UID: \"a3a74721-2312-4c7a-aa51-3895521fb264\") " Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.166466 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3a74721-2312-4c7a-aa51-3895521fb264-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "a3a74721-2312-4c7a-aa51-3895521fb264" (UID: "a3a74721-2312-4c7a-aa51-3895521fb264"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.180130 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a320df5-843d-4c14-b669-5ed2b01cab89" (OuterVolumeSpecName: "ovn-data") pod "a3a74721-2312-4c7a-aa51-3895521fb264" (UID: "a3a74721-2312-4c7a-aa51-3895521fb264"). InnerVolumeSpecName "pvc-4a320df5-843d-4c14-b669-5ed2b01cab89". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.183848 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a74721-2312-4c7a-aa51-3895521fb264-kube-api-access-2dmfp" (OuterVolumeSpecName: "kube-api-access-2dmfp") pod "a3a74721-2312-4c7a-aa51-3895521fb264" (UID: "a3a74721-2312-4c7a-aa51-3895521fb264"). InnerVolumeSpecName "kube-api-access-2dmfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.188946 4947 generic.go:334] "Generic (PLEG): container finished" podID="a3a74721-2312-4c7a-aa51-3895521fb264" containerID="4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943" exitCode=137 Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.189010 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a3a74721-2312-4c7a-aa51-3895521fb264","Type":"ContainerDied","Data":"4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943"} Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.189036 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a3a74721-2312-4c7a-aa51-3895521fb264","Type":"ContainerDied","Data":"ae3259ef96cd797ba24c8c7db315e0866c380c8a8a5d1e94ed7944097219261b"} Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.189030 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.189055 4947 scope.go:117] "RemoveContainer" containerID="4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.274810 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dmfp\" (UniqueName: \"kubernetes.io/projected/a3a74721-2312-4c7a-aa51-3895521fb264-kube-api-access-2dmfp\") on node \"crc\" DevicePath \"\"" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.274853 4947 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a3a74721-2312-4c7a-aa51-3895521fb264-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.274897 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\") on node \"crc\" " Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.282054 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.284268 4947 scope.go:117] "RemoveContainer" containerID="4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943" Dec 03 09:55:04 crc kubenswrapper[4947]: E1203 09:55:04.284824 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943\": container with ID starting with 4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943 not found: ID does not exist" containerID="4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.284866 4947 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943"} err="failed to get container status \"4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943\": rpc error: code = NotFound desc = could not find container \"4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943\": container with ID starting with 4421b16e41f753b457d095ff036848b78058ede638470831e1998120db09e943 not found: ID does not exist" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.293181 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.304688 4947 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.304862 4947 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4a320df5-843d-4c14-b669-5ed2b01cab89" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a320df5-843d-4c14-b669-5ed2b01cab89") on node "crc" Dec 03 09:55:04 crc kubenswrapper[4947]: I1203 09:55:04.377282 4947 reconciler_common.go:293] "Volume detached for volume \"pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4a320df5-843d-4c14-b669-5ed2b01cab89\") on node \"crc\" DevicePath \"\"" Dec 03 09:55:05 crc kubenswrapper[4947]: I1203 09:55:05.099603 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a74721-2312-4c7a-aa51-3895521fb264" path="/var/lib/kubelet/pods/a3a74721-2312-4c7a-aa51-3895521fb264/volumes" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.631684 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 09:55:25 crc kubenswrapper[4947]: E1203 09:55:25.632790 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9b2e7f66-e905-447d-b029-e27add784fcc" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.632813 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b2e7f66-e905-447d-b029-e27add784fcc" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 03 09:55:25 crc kubenswrapper[4947]: E1203 09:55:25.632831 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerName="extract-content" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.632841 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerName="extract-content" Dec 03 09:55:25 crc kubenswrapper[4947]: E1203 09:55:25.632855 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerName="extract-utilities" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.632863 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerName="extract-utilities" Dec 03 09:55:25 crc kubenswrapper[4947]: E1203 09:55:25.632895 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a213096c-8020-47a7-b5b8-8d18edb3562e" containerName="adoption" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.632902 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a213096c-8020-47a7-b5b8-8d18edb3562e" containerName="adoption" Dec 03 09:55:25 crc kubenswrapper[4947]: E1203 09:55:25.632917 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c41415b-4450-464c-bdcf-16da30cc83bb" containerName="nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cell2" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.632929 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c41415b-4450-464c-bdcf-16da30cc83bb" 
containerName="nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cell2" Dec 03 09:55:25 crc kubenswrapper[4947]: E1203 09:55:25.632953 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerName="registry-server" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.632962 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerName="registry-server" Dec 03 09:55:25 crc kubenswrapper[4947]: E1203 09:55:25.632973 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a74721-2312-4c7a-aa51-3895521fb264" containerName="adoption" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.632981 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a74721-2312-4c7a-aa51-3895521fb264" containerName="adoption" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.633286 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf40dba-94f8-4c00-a5d0-afbca197b583" containerName="registry-server" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.633305 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a213096c-8020-47a7-b5b8-8d18edb3562e" containerName="adoption" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.633316 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a74721-2312-4c7a-aa51-3895521fb264" containerName="adoption" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.633332 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c41415b-4450-464c-bdcf-16da30cc83bb" containerName="nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cell2" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.633346 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b2e7f66-e905-447d-b029-e27add784fcc" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 
09:55:25.637745 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.641126 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.642520 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.642520 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kk56l" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.642637 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.651781 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.705479 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.705740 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.705837 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-config-data\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.705913 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.705986 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.706063 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.706113 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qql68\" (UniqueName: \"kubernetes.io/projected/e69a0cda-c7ef-4f16-a380-0d3f9385574e-kube-api-access-qql68\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.706184 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.706296 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.809089 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.809205 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.809270 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qql68\" (UniqueName: \"kubernetes.io/projected/e69a0cda-c7ef-4f16-a380-0d3f9385574e-kube-api-access-qql68\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.809333 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ssh-key\") pod 
\"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.809426 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.809563 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.809668 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.809719 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-config-data\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.809778 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.810298 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.810546 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.811381 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.812571 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-config-data\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.813052 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " 
pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.820091 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.820819 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.825024 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.834379 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qql68\" (UniqueName: \"kubernetes.io/projected/e69a0cda-c7ef-4f16-a380-0d3f9385574e-kube-api-access-qql68\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.849508 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " pod="openstack/tempest-tests-tempest" Dec 03 09:55:25 crc kubenswrapper[4947]: I1203 09:55:25.965216 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 09:55:26 crc kubenswrapper[4947]: I1203 09:55:26.436954 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 09:55:26 crc kubenswrapper[4947]: I1203 09:55:26.455797 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.468457 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bkbl2"] Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.473170 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.481659 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e69a0cda-c7ef-4f16-a380-0d3f9385574e","Type":"ContainerStarted","Data":"0009ee61243f9ea2309320823a931fbc41b0377c37c07392c27bb9deb9539ba2"} Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.487365 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkbl2"] Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.554271 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-utilities\") pod \"community-operators-bkbl2\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.554384 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-catalog-content\") pod \"community-operators-bkbl2\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " 
pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.554623 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xmn7\" (UniqueName: \"kubernetes.io/projected/fb947a9f-686d-4367-b56a-8b4303c7f0cb-kube-api-access-7xmn7\") pod \"community-operators-bkbl2\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.656614 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xmn7\" (UniqueName: \"kubernetes.io/projected/fb947a9f-686d-4367-b56a-8b4303c7f0cb-kube-api-access-7xmn7\") pod \"community-operators-bkbl2\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.656683 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-utilities\") pod \"community-operators-bkbl2\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.656769 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-catalog-content\") pod \"community-operators-bkbl2\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.657254 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-catalog-content\") pod \"community-operators-bkbl2\" (UID: 
\"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.657816 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-utilities\") pod \"community-operators-bkbl2\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.682803 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xmn7\" (UniqueName: \"kubernetes.io/projected/fb947a9f-686d-4367-b56a-8b4303c7f0cb-kube-api-access-7xmn7\") pod \"community-operators-bkbl2\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:27 crc kubenswrapper[4947]: I1203 09:55:27.830166 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:28 crc kubenswrapper[4947]: I1203 09:55:28.345396 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bkbl2"] Dec 03 09:55:28 crc kubenswrapper[4947]: W1203 09:55:28.351411 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb947a9f_686d_4367_b56a_8b4303c7f0cb.slice/crio-30474f1e1cc55d5a0aec8656d78d4eed407e83b9814291e6fff35ed443b83e28 WatchSource:0}: Error finding container 30474f1e1cc55d5a0aec8656d78d4eed407e83b9814291e6fff35ed443b83e28: Status 404 returned error can't find the container with id 30474f1e1cc55d5a0aec8656d78d4eed407e83b9814291e6fff35ed443b83e28 Dec 03 09:55:28 crc kubenswrapper[4947]: I1203 09:55:28.497977 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkbl2" 
event={"ID":"fb947a9f-686d-4367-b56a-8b4303c7f0cb","Type":"ContainerStarted","Data":"30474f1e1cc55d5a0aec8656d78d4eed407e83b9814291e6fff35ed443b83e28"} Dec 03 09:55:29 crc kubenswrapper[4947]: I1203 09:55:29.512941 4947 generic.go:334] "Generic (PLEG): container finished" podID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerID="9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e" exitCode=0 Dec 03 09:55:29 crc kubenswrapper[4947]: I1203 09:55:29.513159 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkbl2" event={"ID":"fb947a9f-686d-4367-b56a-8b4303c7f0cb","Type":"ContainerDied","Data":"9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e"} Dec 03 09:55:30 crc kubenswrapper[4947]: I1203 09:55:30.086473 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:55:30 crc kubenswrapper[4947]: I1203 09:55:30.086545 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:55:34 crc kubenswrapper[4947]: I1203 09:55:34.795381 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkbl2" event={"ID":"fb947a9f-686d-4367-b56a-8b4303c7f0cb","Type":"ContainerStarted","Data":"d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c"} Dec 03 09:55:35 crc kubenswrapper[4947]: I1203 09:55:35.809307 4947 generic.go:334] "Generic (PLEG): container finished" podID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" 
containerID="d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c" exitCode=0 Dec 03 09:55:35 crc kubenswrapper[4947]: I1203 09:55:35.809399 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkbl2" event={"ID":"fb947a9f-686d-4367-b56a-8b4303c7f0cb","Type":"ContainerDied","Data":"d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c"} Dec 03 09:55:36 crc kubenswrapper[4947]: I1203 09:55:36.828383 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkbl2" event={"ID":"fb947a9f-686d-4367-b56a-8b4303c7f0cb","Type":"ContainerStarted","Data":"5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d"} Dec 03 09:55:36 crc kubenswrapper[4947]: I1203 09:55:36.852926 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bkbl2" podStartSLOduration=3.172149264 podStartE2EDuration="9.852898704s" podCreationTimestamp="2025-12-03 09:55:27 +0000 UTC" firstStartedPulling="2025-12-03 09:55:29.515073083 +0000 UTC m=+11190.776027499" lastFinishedPulling="2025-12-03 09:55:36.195822513 +0000 UTC m=+11197.456776939" observedRunningTime="2025-12-03 09:55:36.846974704 +0000 UTC m=+11198.107929140" watchObservedRunningTime="2025-12-03 09:55:36.852898704 +0000 UTC m=+11198.113853130" Dec 03 09:55:37 crc kubenswrapper[4947]: I1203 09:55:37.830518 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:37 crc kubenswrapper[4947]: I1203 09:55:37.830836 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:38 crc kubenswrapper[4947]: I1203 09:55:38.887605 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-bkbl2" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" 
containerName="registry-server" probeResult="failure" output=< Dec 03 09:55:38 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 09:55:38 crc kubenswrapper[4947]: > Dec 03 09:55:47 crc kubenswrapper[4947]: I1203 09:55:47.880653 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:47 crc kubenswrapper[4947]: I1203 09:55:47.942473 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:48 crc kubenswrapper[4947]: I1203 09:55:48.128085 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkbl2"] Dec 03 09:55:48 crc kubenswrapper[4947]: I1203 09:55:48.963731 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bkbl2" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerName="registry-server" containerID="cri-o://5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d" gracePeriod=2 Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.559945 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.616807 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-catalog-content\") pod \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.617015 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xmn7\" (UniqueName: \"kubernetes.io/projected/fb947a9f-686d-4367-b56a-8b4303c7f0cb-kube-api-access-7xmn7\") pod \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.617073 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-utilities\") pod \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\" (UID: \"fb947a9f-686d-4367-b56a-8b4303c7f0cb\") " Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.618543 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-utilities" (OuterVolumeSpecName: "utilities") pod "fb947a9f-686d-4367-b56a-8b4303c7f0cb" (UID: "fb947a9f-686d-4367-b56a-8b4303c7f0cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.624481 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb947a9f-686d-4367-b56a-8b4303c7f0cb-kube-api-access-7xmn7" (OuterVolumeSpecName: "kube-api-access-7xmn7") pod "fb947a9f-686d-4367-b56a-8b4303c7f0cb" (UID: "fb947a9f-686d-4367-b56a-8b4303c7f0cb"). InnerVolumeSpecName "kube-api-access-7xmn7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.664263 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb947a9f-686d-4367-b56a-8b4303c7f0cb" (UID: "fb947a9f-686d-4367-b56a-8b4303c7f0cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.719995 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xmn7\" (UniqueName: \"kubernetes.io/projected/fb947a9f-686d-4367-b56a-8b4303c7f0cb-kube-api-access-7xmn7\") on node \"crc\" DevicePath \"\"" Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.720037 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.720049 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb947a9f-686d-4367-b56a-8b4303c7f0cb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.981166 4947 generic.go:334] "Generic (PLEG): container finished" podID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerID="5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d" exitCode=0 Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.981204 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bkbl2" event={"ID":"fb947a9f-686d-4367-b56a-8b4303c7f0cb","Type":"ContainerDied","Data":"5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d"} Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.981242 4947 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-bkbl2" event={"ID":"fb947a9f-686d-4367-b56a-8b4303c7f0cb","Type":"ContainerDied","Data":"30474f1e1cc55d5a0aec8656d78d4eed407e83b9814291e6fff35ed443b83e28"} Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.981238 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bkbl2" Dec 03 09:55:49 crc kubenswrapper[4947]: I1203 09:55:49.981287 4947 scope.go:117] "RemoveContainer" containerID="5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d" Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 09:55:50.035834 4947 scope.go:117] "RemoveContainer" containerID="d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c" Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 09:55:50.043883 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bkbl2"] Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 09:55:50.064124 4947 scope.go:117] "RemoveContainer" containerID="9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e" Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 09:55:50.065802 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bkbl2"] Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 09:55:50.127580 4947 scope.go:117] "RemoveContainer" containerID="5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d" Dec 03 09:55:50 crc kubenswrapper[4947]: E1203 09:55:50.128092 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d\": container with ID starting with 5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d not found: ID does not exist" containerID="5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d" Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 
09:55:50.128122 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d"} err="failed to get container status \"5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d\": rpc error: code = NotFound desc = could not find container \"5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d\": container with ID starting with 5089edace73713c696583e3295302973ceb3528ecbe59304b8893193c5e0b89d not found: ID does not exist" Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 09:55:50.128142 4947 scope.go:117] "RemoveContainer" containerID="d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c" Dec 03 09:55:50 crc kubenswrapper[4947]: E1203 09:55:50.128458 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c\": container with ID starting with d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c not found: ID does not exist" containerID="d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c" Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 09:55:50.128537 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c"} err="failed to get container status \"d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c\": rpc error: code = NotFound desc = could not find container \"d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c\": container with ID starting with d57fc5ec9b455aff52cf8b70ccd4a03e3341f4c022a8552fdbd01b934a3d712c not found: ID does not exist" Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 09:55:50.128573 4947 scope.go:117] "RemoveContainer" containerID="9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e" Dec 03 09:55:50 crc 
kubenswrapper[4947]: E1203 09:55:50.128888 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e\": container with ID starting with 9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e not found: ID does not exist" containerID="9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e" Dec 03 09:55:50 crc kubenswrapper[4947]: I1203 09:55:50.128918 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e"} err="failed to get container status \"9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e\": rpc error: code = NotFound desc = could not find container \"9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e\": container with ID starting with 9fd6c8b0d78a5f8c3341efb8f555b07260d6012ff9804c9d649c5d67a77c4d2e not found: ID does not exist" Dec 03 09:55:51 crc kubenswrapper[4947]: I1203 09:55:51.099734 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" path="/var/lib/kubelet/pods/fb947a9f-686d-4367-b56a-8b4303c7f0cb/volumes" Dec 03 09:56:00 crc kubenswrapper[4947]: I1203 09:56:00.087029 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:56:00 crc kubenswrapper[4947]: I1203 09:56:00.087747 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 03 09:56:00 crc kubenswrapper[4947]: I1203 09:56:00.087815 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 09:56:00 crc kubenswrapper[4947]: I1203 09:56:00.088904 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:56:00 crc kubenswrapper[4947]: I1203 09:56:00.089014 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" gracePeriod=600 Dec 03 09:56:01 crc kubenswrapper[4947]: I1203 09:56:01.105324 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" exitCode=0 Dec 03 09:56:01 crc kubenswrapper[4947]: I1203 09:56:01.105380 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c"} Dec 03 09:56:01 crc kubenswrapper[4947]: I1203 09:56:01.105709 4947 scope.go:117] "RemoveContainer" containerID="eab5bbfe62ec2d95d68078d7965923cc0022d2b127d4947809beac88e6b7ad80" Dec 03 09:56:12 crc kubenswrapper[4947]: E1203 09:56:12.432514 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:56:12 crc kubenswrapper[4947]: E1203 09:56:12.485632 4947 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:65066e8ca260a75886ae57f157049605" Dec 03 09:56:12 crc kubenswrapper[4947]: E1203 09:56:12.485700 4947 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:65066e8ca260a75886ae57f157049605" Dec 03 09:56:12 crc kubenswrapper[4947]: E1203 09:56:12.485925 4947 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:65066e8ca260a75886ae57f157049605,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qql68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Livene
ssProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(e69a0cda-c7ef-4f16-a380-0d3f9385574e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 09:56:12 crc kubenswrapper[4947]: E1203 09:56:12.487149 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="e69a0cda-c7ef-4f16-a380-0d3f9385574e" Dec 03 09:56:13 crc kubenswrapper[4947]: I1203 09:56:13.260192 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:56:13 crc kubenswrapper[4947]: E1203 09:56:13.260899 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:56:13 crc kubenswrapper[4947]: E1203 09:56:13.261559 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:65066e8ca260a75886ae57f157049605\\\"\"" pod="openstack/tempest-tests-tempest" podUID="e69a0cda-c7ef-4f16-a380-0d3f9385574e" Dec 03 09:56:27 crc kubenswrapper[4947]: I1203 09:56:27.083659 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:56:27 crc kubenswrapper[4947]: E1203 09:56:27.084421 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:56:28 crc kubenswrapper[4947]: I1203 09:56:28.257665 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 09:56:29 crc kubenswrapper[4947]: I1203 09:56:29.472078 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e69a0cda-c7ef-4f16-a380-0d3f9385574e","Type":"ContainerStarted","Data":"55d82ab50b32c454418365b5e89b0068618bf11d602d74c3b65dc5213e617212"} Dec 03 09:56:29 crc kubenswrapper[4947]: I1203 09:56:29.512993 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/tempest-tests-tempest" podStartSLOduration=3.7130483119999997 podStartE2EDuration="1m5.512971282s" podCreationTimestamp="2025-12-03 09:55:24 +0000 UTC" firstStartedPulling="2025-12-03 09:55:26.455461518 +0000 UTC m=+11187.716415944" lastFinishedPulling="2025-12-03 09:56:28.255384488 +0000 UTC m=+11249.516338914" observedRunningTime="2025-12-03 09:56:29.506108166 +0000 UTC m=+11250.767062592" watchObservedRunningTime="2025-12-03 09:56:29.512971282 +0000 UTC m=+11250.773925708" Dec 03 09:56:41 crc kubenswrapper[4947]: I1203 09:56:41.083836 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:56:41 crc kubenswrapper[4947]: E1203 09:56:41.084695 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:56:53 crc kubenswrapper[4947]: I1203 09:56:53.083083 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:56:53 crc kubenswrapper[4947]: E1203 09:56:53.083835 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:57:07 crc kubenswrapper[4947]: I1203 09:57:07.083076 4947 scope.go:117] "RemoveContainer" 
containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:57:07 crc kubenswrapper[4947]: E1203 09:57:07.083784 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:57:21 crc kubenswrapper[4947]: I1203 09:57:21.083242 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:57:21 crc kubenswrapper[4947]: E1203 09:57:21.083947 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:57:35 crc kubenswrapper[4947]: I1203 09:57:35.083642 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:57:35 crc kubenswrapper[4947]: E1203 09:57:35.084367 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:57:46 crc kubenswrapper[4947]: I1203 09:57:46.082675 4947 scope.go:117] 
"RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:57:46 crc kubenswrapper[4947]: E1203 09:57:46.083481 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:57:58 crc kubenswrapper[4947]: I1203 09:57:58.083323 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:57:58 crc kubenswrapper[4947]: E1203 09:57:58.084426 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:58:09 crc kubenswrapper[4947]: I1203 09:58:09.094567 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:58:09 crc kubenswrapper[4947]: E1203 09:58:09.095442 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:58:21 crc kubenswrapper[4947]: I1203 09:58:21.082936 
4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:58:21 crc kubenswrapper[4947]: E1203 09:58:21.083732 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:58:33 crc kubenswrapper[4947]: I1203 09:58:33.088037 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:58:33 crc kubenswrapper[4947]: E1203 09:58:33.089209 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:58:46 crc kubenswrapper[4947]: I1203 09:58:46.083921 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:58:46 crc kubenswrapper[4947]: E1203 09:58:46.084662 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:58:59 crc kubenswrapper[4947]: I1203 
09:58:59.107170 4947 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-64c8bd4d48-79lrk" podUID="4b5182fc-c85f-4e7a-960c-a2aa59cc653b" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 03 09:59:01 crc kubenswrapper[4947]: I1203 09:59:01.088410 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:59:01 crc kubenswrapper[4947]: E1203 09:59:01.088958 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:59:05 crc kubenswrapper[4947]: I1203 09:59:05.988229 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qxf"] Dec 03 09:59:05 crc kubenswrapper[4947]: E1203 09:59:05.989201 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerName="extract-utilities" Dec 03 09:59:05 crc kubenswrapper[4947]: I1203 09:59:05.989217 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerName="extract-utilities" Dec 03 09:59:05 crc kubenswrapper[4947]: E1203 09:59:05.989259 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerName="registry-server" Dec 03 09:59:05 crc kubenswrapper[4947]: I1203 09:59:05.989265 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerName="registry-server" Dec 03 09:59:05 crc kubenswrapper[4947]: E1203 09:59:05.989273 4947 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerName="extract-content" Dec 03 09:59:05 crc kubenswrapper[4947]: I1203 09:59:05.989280 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerName="extract-content" Dec 03 09:59:05 crc kubenswrapper[4947]: I1203 09:59:05.989494 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb947a9f-686d-4367-b56a-8b4303c7f0cb" containerName="registry-server" Dec 03 09:59:05 crc kubenswrapper[4947]: I1203 09:59:05.991215 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.006326 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qxf"] Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.136031 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbc6\" (UniqueName: \"kubernetes.io/projected/9f80dd16-1ab3-463d-a3a7-96297f7a9655-kube-api-access-rzbc6\") pod \"redhat-marketplace-d2qxf\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.136074 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-catalog-content\") pod \"redhat-marketplace-d2qxf\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.136421 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-utilities\") pod 
\"redhat-marketplace-d2qxf\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.238262 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-utilities\") pod \"redhat-marketplace-d2qxf\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.238561 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbc6\" (UniqueName: \"kubernetes.io/projected/9f80dd16-1ab3-463d-a3a7-96297f7a9655-kube-api-access-rzbc6\") pod \"redhat-marketplace-d2qxf\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.238598 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-catalog-content\") pod \"redhat-marketplace-d2qxf\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.238839 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-utilities\") pod \"redhat-marketplace-d2qxf\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.239074 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-catalog-content\") pod \"redhat-marketplace-d2qxf\" (UID: 
\"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.265591 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbc6\" (UniqueName: \"kubernetes.io/projected/9f80dd16-1ab3-463d-a3a7-96297f7a9655-kube-api-access-rzbc6\") pod \"redhat-marketplace-d2qxf\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.320749 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:06 crc kubenswrapper[4947]: I1203 09:59:06.878429 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qxf"] Dec 03 09:59:07 crc kubenswrapper[4947]: I1203 09:59:07.374714 4947 generic.go:334] "Generic (PLEG): container finished" podID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerID="c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42" exitCode=0 Dec 03 09:59:07 crc kubenswrapper[4947]: I1203 09:59:07.374941 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qxf" event={"ID":"9f80dd16-1ab3-463d-a3a7-96297f7a9655","Type":"ContainerDied","Data":"c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42"} Dec 03 09:59:07 crc kubenswrapper[4947]: I1203 09:59:07.374968 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qxf" event={"ID":"9f80dd16-1ab3-463d-a3a7-96297f7a9655","Type":"ContainerStarted","Data":"785acfadd61d0167f8c277a2785f1a436642e0a2c05e462789d9bb6bb4560569"} Dec 03 09:59:08 crc kubenswrapper[4947]: I1203 09:59:08.385996 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qxf" 
event={"ID":"9f80dd16-1ab3-463d-a3a7-96297f7a9655","Type":"ContainerStarted","Data":"72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a"} Dec 03 09:59:09 crc kubenswrapper[4947]: I1203 09:59:09.398360 4947 generic.go:334] "Generic (PLEG): container finished" podID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerID="72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a" exitCode=0 Dec 03 09:59:09 crc kubenswrapper[4947]: I1203 09:59:09.398425 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qxf" event={"ID":"9f80dd16-1ab3-463d-a3a7-96297f7a9655","Type":"ContainerDied","Data":"72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a"} Dec 03 09:59:10 crc kubenswrapper[4947]: I1203 09:59:10.410577 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qxf" event={"ID":"9f80dd16-1ab3-463d-a3a7-96297f7a9655","Type":"ContainerStarted","Data":"3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a"} Dec 03 09:59:10 crc kubenswrapper[4947]: I1203 09:59:10.430124 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d2qxf" podStartSLOduration=2.81388391 podStartE2EDuration="5.430097226s" podCreationTimestamp="2025-12-03 09:59:05 +0000 UTC" firstStartedPulling="2025-12-03 09:59:07.377787698 +0000 UTC m=+11408.638742124" lastFinishedPulling="2025-12-03 09:59:09.994001014 +0000 UTC m=+11411.254955440" observedRunningTime="2025-12-03 09:59:10.428273697 +0000 UTC m=+11411.689228143" watchObservedRunningTime="2025-12-03 09:59:10.430097226 +0000 UTC m=+11411.691051692" Dec 03 09:59:12 crc kubenswrapper[4947]: I1203 09:59:12.083818 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:59:12 crc kubenswrapper[4947]: E1203 09:59:12.084418 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:59:16 crc kubenswrapper[4947]: I1203 09:59:16.322301 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:16 crc kubenswrapper[4947]: I1203 09:59:16.322771 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:16 crc kubenswrapper[4947]: I1203 09:59:16.379053 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:16 crc kubenswrapper[4947]: I1203 09:59:16.538576 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:16 crc kubenswrapper[4947]: I1203 09:59:16.622620 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qxf"] Dec 03 09:59:18 crc kubenswrapper[4947]: I1203 09:59:18.499905 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d2qxf" podUID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerName="registry-server" containerID="cri-o://3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a" gracePeriod=2 Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.199602 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.211973 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-catalog-content\") pod \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.212100 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-utilities\") pod \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.212255 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzbc6\" (UniqueName: \"kubernetes.io/projected/9f80dd16-1ab3-463d-a3a7-96297f7a9655-kube-api-access-rzbc6\") pod \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\" (UID: \"9f80dd16-1ab3-463d-a3a7-96297f7a9655\") " Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.213529 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-utilities" (OuterVolumeSpecName: "utilities") pod "9f80dd16-1ab3-463d-a3a7-96297f7a9655" (UID: "9f80dd16-1ab3-463d-a3a7-96297f7a9655"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.252938 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f80dd16-1ab3-463d-a3a7-96297f7a9655" (UID: "9f80dd16-1ab3-463d-a3a7-96297f7a9655"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.258459 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f80dd16-1ab3-463d-a3a7-96297f7a9655-kube-api-access-rzbc6" (OuterVolumeSpecName: "kube-api-access-rzbc6") pod "9f80dd16-1ab3-463d-a3a7-96297f7a9655" (UID: "9f80dd16-1ab3-463d-a3a7-96297f7a9655"). InnerVolumeSpecName "kube-api-access-rzbc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.313933 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.314261 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzbc6\" (UniqueName: \"kubernetes.io/projected/9f80dd16-1ab3-463d-a3a7-96297f7a9655-kube-api-access-rzbc6\") on node \"crc\" DevicePath \"\"" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.314275 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f80dd16-1ab3-463d-a3a7-96297f7a9655-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.513107 4947 generic.go:334] "Generic (PLEG): container finished" podID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerID="3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a" exitCode=0 Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.513149 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qxf" event={"ID":"9f80dd16-1ab3-463d-a3a7-96297f7a9655","Type":"ContainerDied","Data":"3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a"} Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.513179 4947 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2qxf" event={"ID":"9f80dd16-1ab3-463d-a3a7-96297f7a9655","Type":"ContainerDied","Data":"785acfadd61d0167f8c277a2785f1a436642e0a2c05e462789d9bb6bb4560569"} Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.513202 4947 scope.go:117] "RemoveContainer" containerID="3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.513363 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2qxf" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.543527 4947 scope.go:117] "RemoveContainer" containerID="72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.550349 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qxf"] Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.561252 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2qxf"] Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.567421 4947 scope.go:117] "RemoveContainer" containerID="c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.619016 4947 scope.go:117] "RemoveContainer" containerID="3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a" Dec 03 09:59:19 crc kubenswrapper[4947]: E1203 09:59:19.619923 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a\": container with ID starting with 3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a not found: ID does not exist" containerID="3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 
09:59:19.619967 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a"} err="failed to get container status \"3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a\": rpc error: code = NotFound desc = could not find container \"3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a\": container with ID starting with 3d7d0e877f541e1b31b9d846878c72e2d03679a965b6dd181439d8cb2e56484a not found: ID does not exist" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.620139 4947 scope.go:117] "RemoveContainer" containerID="72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a" Dec 03 09:59:19 crc kubenswrapper[4947]: E1203 09:59:19.626055 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a\": container with ID starting with 72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a not found: ID does not exist" containerID="72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.626257 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a"} err="failed to get container status \"72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a\": rpc error: code = NotFound desc = could not find container \"72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a\": container with ID starting with 72e00cabab7d7fd5f83f7c2ee9c0fbfbf6b232dc766bda15ee683b8d8445995a not found: ID does not exist" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.626417 4947 scope.go:117] "RemoveContainer" containerID="c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42" Dec 03 09:59:19 crc 
kubenswrapper[4947]: E1203 09:59:19.627027 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42\": container with ID starting with c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42 not found: ID does not exist" containerID="c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42" Dec 03 09:59:19 crc kubenswrapper[4947]: I1203 09:59:19.627193 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42"} err="failed to get container status \"c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42\": rpc error: code = NotFound desc = could not find container \"c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42\": container with ID starting with c8a86974909c87bff1c6b98657ae432e0007de95ed785b58381070a348655b42 not found: ID does not exist" Dec 03 09:59:21 crc kubenswrapper[4947]: I1203 09:59:21.095003 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" path="/var/lib/kubelet/pods/9f80dd16-1ab3-463d-a3a7-96297f7a9655/volumes" Dec 03 09:59:24 crc kubenswrapper[4947]: I1203 09:59:24.082921 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:59:24 crc kubenswrapper[4947]: E1203 09:59:24.083561 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:59:39 crc 
kubenswrapper[4947]: I1203 09:59:39.105914 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:59:39 crc kubenswrapper[4947]: E1203 09:59:39.106987 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.108381 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v564f"] Dec 03 09:59:48 crc kubenswrapper[4947]: E1203 09:59:48.109847 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerName="registry-server" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.109872 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerName="registry-server" Dec 03 09:59:48 crc kubenswrapper[4947]: E1203 09:59:48.109950 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerName="extract-content" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.109962 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerName="extract-content" Dec 03 09:59:48 crc kubenswrapper[4947]: E1203 09:59:48.109989 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerName="extract-utilities" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.110000 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" 
containerName="extract-utilities" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.110368 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f80dd16-1ab3-463d-a3a7-96297f7a9655" containerName="registry-server" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.113024 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.124360 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v564f"] Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.267777 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-catalog-content\") pod \"certified-operators-v564f\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.268123 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-utilities\") pod \"certified-operators-v564f\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.268269 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp77d\" (UniqueName: \"kubernetes.io/projected/ee003895-51a3-43d1-9f22-98136f95d921-kube-api-access-kp77d\") pod \"certified-operators-v564f\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.369984 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kp77d\" (UniqueName: \"kubernetes.io/projected/ee003895-51a3-43d1-9f22-98136f95d921-kube-api-access-kp77d\") pod \"certified-operators-v564f\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.370091 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-catalog-content\") pod \"certified-operators-v564f\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.370122 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-utilities\") pod \"certified-operators-v564f\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.370549 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-utilities\") pod \"certified-operators-v564f\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.370790 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-catalog-content\") pod \"certified-operators-v564f\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.390131 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp77d\" (UniqueName: 
\"kubernetes.io/projected/ee003895-51a3-43d1-9f22-98136f95d921-kube-api-access-kp77d\") pod \"certified-operators-v564f\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.438235 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:48 crc kubenswrapper[4947]: I1203 09:59:48.984149 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v564f"] Dec 03 09:59:49 crc kubenswrapper[4947]: I1203 09:59:49.836278 4947 generic.go:334] "Generic (PLEG): container finished" podID="ee003895-51a3-43d1-9f22-98136f95d921" containerID="63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb" exitCode=0 Dec 03 09:59:49 crc kubenswrapper[4947]: I1203 09:59:49.836358 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v564f" event={"ID":"ee003895-51a3-43d1-9f22-98136f95d921","Type":"ContainerDied","Data":"63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb"} Dec 03 09:59:49 crc kubenswrapper[4947]: I1203 09:59:49.836656 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v564f" event={"ID":"ee003895-51a3-43d1-9f22-98136f95d921","Type":"ContainerStarted","Data":"f32983598832017cf7c50c6e23b8adfc82bf3c53ac23e737bab8c0d0d951f178"} Dec 03 09:59:50 crc kubenswrapper[4947]: I1203 09:59:50.083778 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 09:59:50 crc kubenswrapper[4947]: E1203 09:59:50.084060 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 09:59:50 crc kubenswrapper[4947]: I1203 09:59:50.858364 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v564f" event={"ID":"ee003895-51a3-43d1-9f22-98136f95d921","Type":"ContainerStarted","Data":"b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9"} Dec 03 09:59:51 crc kubenswrapper[4947]: I1203 09:59:51.871629 4947 generic.go:334] "Generic (PLEG): container finished" podID="ee003895-51a3-43d1-9f22-98136f95d921" containerID="b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9" exitCode=0 Dec 03 09:59:51 crc kubenswrapper[4947]: I1203 09:59:51.871711 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v564f" event={"ID":"ee003895-51a3-43d1-9f22-98136f95d921","Type":"ContainerDied","Data":"b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9"} Dec 03 09:59:52 crc kubenswrapper[4947]: I1203 09:59:52.885127 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v564f" event={"ID":"ee003895-51a3-43d1-9f22-98136f95d921","Type":"ContainerStarted","Data":"e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7"} Dec 03 09:59:58 crc kubenswrapper[4947]: I1203 09:59:58.439058 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:58 crc kubenswrapper[4947]: I1203 09:59:58.439959 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:58 crc kubenswrapper[4947]: I1203 09:59:58.524653 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:58 crc kubenswrapper[4947]: I1203 09:59:58.553558 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v564f" podStartSLOduration=8.041424002 podStartE2EDuration="10.553542056s" podCreationTimestamp="2025-12-03 09:59:48 +0000 UTC" firstStartedPulling="2025-12-03 09:59:49.839025774 +0000 UTC m=+11451.099980200" lastFinishedPulling="2025-12-03 09:59:52.351143828 +0000 UTC m=+11453.612098254" observedRunningTime="2025-12-03 09:59:52.91901222 +0000 UTC m=+11454.179966646" watchObservedRunningTime="2025-12-03 09:59:58.553542056 +0000 UTC m=+11459.814496502" Dec 03 09:59:59 crc kubenswrapper[4947]: I1203 09:59:59.023331 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v564f" Dec 03 09:59:59 crc kubenswrapper[4947]: I1203 09:59:59.070019 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v564f"] Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.180957 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r"] Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.182881 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.186918 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.187190 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.195331 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r"] Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.345740 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c315334-03c0-4542-b369-99a6c07cbc82-config-volume\") pod \"collect-profiles-29412600-fdj5r\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.346070 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvb9m\" (UniqueName: \"kubernetes.io/projected/6c315334-03c0-4542-b369-99a6c07cbc82-kube-api-access-tvb9m\") pod \"collect-profiles-29412600-fdj5r\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.346326 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c315334-03c0-4542-b369-99a6c07cbc82-secret-volume\") pod \"collect-profiles-29412600-fdj5r\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.448078 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvb9m\" (UniqueName: \"kubernetes.io/projected/6c315334-03c0-4542-b369-99a6c07cbc82-kube-api-access-tvb9m\") pod \"collect-profiles-29412600-fdj5r\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.448596 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c315334-03c0-4542-b369-99a6c07cbc82-secret-volume\") pod \"collect-profiles-29412600-fdj5r\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.448744 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c315334-03c0-4542-b369-99a6c07cbc82-config-volume\") pod \"collect-profiles-29412600-fdj5r\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.450003 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c315334-03c0-4542-b369-99a6c07cbc82-config-volume\") pod \"collect-profiles-29412600-fdj5r\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.458049 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6c315334-03c0-4542-b369-99a6c07cbc82-secret-volume\") pod \"collect-profiles-29412600-fdj5r\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.467130 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvb9m\" (UniqueName: \"kubernetes.io/projected/6c315334-03c0-4542-b369-99a6c07cbc82-kube-api-access-tvb9m\") pod \"collect-profiles-29412600-fdj5r\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.535451 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:00 crc kubenswrapper[4947]: I1203 10:00:00.998419 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v564f" podUID="ee003895-51a3-43d1-9f22-98136f95d921" containerName="registry-server" containerID="cri-o://e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7" gracePeriod=2 Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.034232 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r"] Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.647230 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v564f" Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.791777 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-catalog-content\") pod \"ee003895-51a3-43d1-9f22-98136f95d921\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.791976 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-utilities\") pod \"ee003895-51a3-43d1-9f22-98136f95d921\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.792100 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp77d\" (UniqueName: \"kubernetes.io/projected/ee003895-51a3-43d1-9f22-98136f95d921-kube-api-access-kp77d\") pod \"ee003895-51a3-43d1-9f22-98136f95d921\" (UID: \"ee003895-51a3-43d1-9f22-98136f95d921\") " Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.792761 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-utilities" (OuterVolumeSpecName: "utilities") pod "ee003895-51a3-43d1-9f22-98136f95d921" (UID: "ee003895-51a3-43d1-9f22-98136f95d921"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.794017 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.799937 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee003895-51a3-43d1-9f22-98136f95d921-kube-api-access-kp77d" (OuterVolumeSpecName: "kube-api-access-kp77d") pod "ee003895-51a3-43d1-9f22-98136f95d921" (UID: "ee003895-51a3-43d1-9f22-98136f95d921"). InnerVolumeSpecName "kube-api-access-kp77d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.850434 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee003895-51a3-43d1-9f22-98136f95d921" (UID: "ee003895-51a3-43d1-9f22-98136f95d921"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.895682 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp77d\" (UniqueName: \"kubernetes.io/projected/ee003895-51a3-43d1-9f22-98136f95d921-kube-api-access-kp77d\") on node \"crc\" DevicePath \"\"" Dec 03 10:00:01 crc kubenswrapper[4947]: I1203 10:00:01.895716 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee003895-51a3-43d1-9f22-98136f95d921-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.010026 4947 generic.go:334] "Generic (PLEG): container finished" podID="ee003895-51a3-43d1-9f22-98136f95d921" containerID="e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7" exitCode=0 Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.010087 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v564f" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.010111 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v564f" event={"ID":"ee003895-51a3-43d1-9f22-98136f95d921","Type":"ContainerDied","Data":"e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7"} Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.010147 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v564f" event={"ID":"ee003895-51a3-43d1-9f22-98136f95d921","Type":"ContainerDied","Data":"f32983598832017cf7c50c6e23b8adfc82bf3c53ac23e737bab8c0d0d951f178"} Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.010168 4947 scope.go:117] "RemoveContainer" containerID="e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.012365 4947 generic.go:334] "Generic (PLEG): container 
finished" podID="6c315334-03c0-4542-b369-99a6c07cbc82" containerID="a2fd2c3f257fee0db3d07d89ed4a6d41e817a198b69c264ea1122c439ab11219" exitCode=0 Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.012404 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" event={"ID":"6c315334-03c0-4542-b369-99a6c07cbc82","Type":"ContainerDied","Data":"a2fd2c3f257fee0db3d07d89ed4a6d41e817a198b69c264ea1122c439ab11219"} Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.012463 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" event={"ID":"6c315334-03c0-4542-b369-99a6c07cbc82","Type":"ContainerStarted","Data":"6513d25798f75e724d1ac968c4011fc5dbc6916422784125a1eef6edc663f6e5"} Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.047361 4947 scope.go:117] "RemoveContainer" containerID="b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.087262 4947 scope.go:117] "RemoveContainer" containerID="63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.093440 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v564f"] Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.106554 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v564f"] Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.146243 4947 scope.go:117] "RemoveContainer" containerID="e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7" Dec 03 10:00:02 crc kubenswrapper[4947]: E1203 10:00:02.146923 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7\": container with ID 
starting with e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7 not found: ID does not exist" containerID="e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.146957 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7"} err="failed to get container status \"e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7\": rpc error: code = NotFound desc = could not find container \"e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7\": container with ID starting with e42ecb1644c2922a0db1a7afda0129726f837409a299573f78058cd331f2deb7 not found: ID does not exist" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.146976 4947 scope.go:117] "RemoveContainer" containerID="b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9" Dec 03 10:00:02 crc kubenswrapper[4947]: E1203 10:00:02.147200 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9\": container with ID starting with b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9 not found: ID does not exist" containerID="b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.147221 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9"} err="failed to get container status \"b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9\": rpc error: code = NotFound desc = could not find container \"b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9\": container with ID starting with b6838551bb4e2a5cd44607b80ef08b6b42f8fd3c31b777eb3f22d362507e5cb9 not found: 
ID does not exist" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.147233 4947 scope.go:117] "RemoveContainer" containerID="63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb" Dec 03 10:00:02 crc kubenswrapper[4947]: E1203 10:00:02.147849 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb\": container with ID starting with 63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb not found: ID does not exist" containerID="63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb" Dec 03 10:00:02 crc kubenswrapper[4947]: I1203 10:00:02.147965 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb"} err="failed to get container status \"63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb\": rpc error: code = NotFound desc = could not find container \"63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb\": container with ID starting with 63789307c4c79cd71c6d2303626269d7e21c2182e7ae62e86882aced999540cb not found: ID does not exist" Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.095895 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee003895-51a3-43d1-9f22-98136f95d921" path="/var/lib/kubelet/pods/ee003895-51a3-43d1-9f22-98136f95d921/volumes" Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.594122 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.644938 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvb9m\" (UniqueName: \"kubernetes.io/projected/6c315334-03c0-4542-b369-99a6c07cbc82-kube-api-access-tvb9m\") pod \"6c315334-03c0-4542-b369-99a6c07cbc82\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.645003 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c315334-03c0-4542-b369-99a6c07cbc82-config-volume\") pod \"6c315334-03c0-4542-b369-99a6c07cbc82\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.645228 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c315334-03c0-4542-b369-99a6c07cbc82-secret-volume\") pod \"6c315334-03c0-4542-b369-99a6c07cbc82\" (UID: \"6c315334-03c0-4542-b369-99a6c07cbc82\") " Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.646200 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c315334-03c0-4542-b369-99a6c07cbc82-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c315334-03c0-4542-b369-99a6c07cbc82" (UID: "6c315334-03c0-4542-b369-99a6c07cbc82"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.651482 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c315334-03c0-4542-b369-99a6c07cbc82-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6c315334-03c0-4542-b369-99a6c07cbc82" (UID: "6c315334-03c0-4542-b369-99a6c07cbc82"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.653037 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c315334-03c0-4542-b369-99a6c07cbc82-kube-api-access-tvb9m" (OuterVolumeSpecName: "kube-api-access-tvb9m") pod "6c315334-03c0-4542-b369-99a6c07cbc82" (UID: "6c315334-03c0-4542-b369-99a6c07cbc82"). InnerVolumeSpecName "kube-api-access-tvb9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.747686 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c315334-03c0-4542-b369-99a6c07cbc82-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.747720 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvb9m\" (UniqueName: \"kubernetes.io/projected/6c315334-03c0-4542-b369-99a6c07cbc82-kube-api-access-tvb9m\") on node \"crc\" DevicePath \"\"" Dec 03 10:00:03 crc kubenswrapper[4947]: I1203 10:00:03.747729 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c315334-03c0-4542-b369-99a6c07cbc82-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 10:00:04 crc kubenswrapper[4947]: I1203 10:00:04.066294 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" event={"ID":"6c315334-03c0-4542-b369-99a6c07cbc82","Type":"ContainerDied","Data":"6513d25798f75e724d1ac968c4011fc5dbc6916422784125a1eef6edc663f6e5"} Dec 03 10:00:04 crc kubenswrapper[4947]: I1203 10:00:04.066567 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6513d25798f75e724d1ac968c4011fc5dbc6916422784125a1eef6edc663f6e5" Dec 03 10:00:04 crc kubenswrapper[4947]: I1203 10:00:04.066348 4947 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412600-fdj5r" Dec 03 10:00:04 crc kubenswrapper[4947]: I1203 10:00:04.082779 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 10:00:04 crc kubenswrapper[4947]: E1203 10:00:04.083189 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:00:04 crc kubenswrapper[4947]: I1203 10:00:04.693030 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp"] Dec 03 10:00:04 crc kubenswrapper[4947]: I1203 10:00:04.705364 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412555-c6ggp"] Dec 03 10:00:05 crc kubenswrapper[4947]: I1203 10:00:05.098608 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="817dee6d-50b1-47f2-84f3-d9be2e542a20" path="/var/lib/kubelet/pods/817dee6d-50b1-47f2-84f3-d9be2e542a20/volumes" Dec 03 10:00:15 crc kubenswrapper[4947]: I1203 10:00:15.083788 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 10:00:15 crc kubenswrapper[4947]: E1203 10:00:15.084853 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:00:30 crc kubenswrapper[4947]: I1203 10:00:30.083699 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 10:00:30 crc kubenswrapper[4947]: E1203 10:00:30.086138 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:00:33 crc kubenswrapper[4947]: I1203 10:00:33.969045 4947 scope.go:117] "RemoveContainer" containerID="4a84ea49d2198f18b790483395b0b0749f2ba3e3e7eec91afe91223ab382639c" Dec 03 10:00:42 crc kubenswrapper[4947]: I1203 10:00:42.083657 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 10:00:42 crc kubenswrapper[4947]: E1203 10:00:42.084476 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:00:54 crc kubenswrapper[4947]: I1203 10:00:54.082850 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 10:00:54 crc kubenswrapper[4947]: E1203 10:00:54.083643 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.162566 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412601-5kcwm"] Dec 03 10:01:00 crc kubenswrapper[4947]: E1203 10:01:00.163644 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee003895-51a3-43d1-9f22-98136f95d921" containerName="extract-utilities" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.163658 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee003895-51a3-43d1-9f22-98136f95d921" containerName="extract-utilities" Dec 03 10:01:00 crc kubenswrapper[4947]: E1203 10:01:00.163685 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee003895-51a3-43d1-9f22-98136f95d921" containerName="extract-content" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.163691 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee003895-51a3-43d1-9f22-98136f95d921" containerName="extract-content" Dec 03 10:01:00 crc kubenswrapper[4947]: E1203 10:01:00.163706 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c315334-03c0-4542-b369-99a6c07cbc82" containerName="collect-profiles" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.163712 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c315334-03c0-4542-b369-99a6c07cbc82" containerName="collect-profiles" Dec 03 10:01:00 crc kubenswrapper[4947]: E1203 10:01:00.163741 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee003895-51a3-43d1-9f22-98136f95d921" containerName="registry-server" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.163746 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee003895-51a3-43d1-9f22-98136f95d921" containerName="registry-server" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.163936 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee003895-51a3-43d1-9f22-98136f95d921" containerName="registry-server" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.163967 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c315334-03c0-4542-b369-99a6c07cbc82" containerName="collect-profiles" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.164772 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.188561 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412601-5kcwm"] Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.244006 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jszzk\" (UniqueName: \"kubernetes.io/projected/b461b660-8534-46fa-a7c4-8f98d4a4ee54-kube-api-access-jszzk\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.244066 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-fernet-keys\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.244191 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-combined-ca-bundle\") pod \"keystone-cron-29412601-5kcwm\" (UID: 
\"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.244280 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-config-data\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.346971 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-config-data\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.347093 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jszzk\" (UniqueName: \"kubernetes.io/projected/b461b660-8534-46fa-a7c4-8f98d4a4ee54-kube-api-access-jszzk\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.347127 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-fernet-keys\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.347209 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-combined-ca-bundle\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " 
pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.360323 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-combined-ca-bundle\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.360470 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-config-data\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.360895 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-fernet-keys\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.374018 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jszzk\" (UniqueName: \"kubernetes.io/projected/b461b660-8534-46fa-a7c4-8f98d4a4ee54-kube-api-access-jszzk\") pod \"keystone-cron-29412601-5kcwm\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.486961 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:00 crc kubenswrapper[4947]: I1203 10:01:00.995316 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412601-5kcwm"] Dec 03 10:01:01 crc kubenswrapper[4947]: I1203 10:01:01.842319 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412601-5kcwm" event={"ID":"b461b660-8534-46fa-a7c4-8f98d4a4ee54","Type":"ContainerStarted","Data":"440a5d20775c46c1e3b5dc1a658540c8ef124c6da421799dbc054fe563de76e2"} Dec 03 10:01:01 crc kubenswrapper[4947]: I1203 10:01:01.842655 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412601-5kcwm" event={"ID":"b461b660-8534-46fa-a7c4-8f98d4a4ee54","Type":"ContainerStarted","Data":"3d8ea659a7d56cc87df40d6b30c8e8cdabd41516b80e41b018ce65d47e6c504f"} Dec 03 10:01:01 crc kubenswrapper[4947]: I1203 10:01:01.873327 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412601-5kcwm" podStartSLOduration=1.8733047059999999 podStartE2EDuration="1.873304706s" podCreationTimestamp="2025-12-03 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 10:01:01.862040131 +0000 UTC m=+11523.122994567" watchObservedRunningTime="2025-12-03 10:01:01.873304706 +0000 UTC m=+11523.134259132" Dec 03 10:01:05 crc kubenswrapper[4947]: I1203 10:01:05.891811 4947 generic.go:334] "Generic (PLEG): container finished" podID="b461b660-8534-46fa-a7c4-8f98d4a4ee54" containerID="440a5d20775c46c1e3b5dc1a658540c8ef124c6da421799dbc054fe563de76e2" exitCode=0 Dec 03 10:01:05 crc kubenswrapper[4947]: I1203 10:01:05.892342 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412601-5kcwm" 
event={"ID":"b461b660-8534-46fa-a7c4-8f98d4a4ee54","Type":"ContainerDied","Data":"440a5d20775c46c1e3b5dc1a658540c8ef124c6da421799dbc054fe563de76e2"} Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.485818 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.589420 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-fernet-keys\") pod \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.589689 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-combined-ca-bundle\") pod \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.589807 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-config-data\") pod \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.589922 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jszzk\" (UniqueName: \"kubernetes.io/projected/b461b660-8534-46fa-a7c4-8f98d4a4ee54-kube-api-access-jszzk\") pod \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\" (UID: \"b461b660-8534-46fa-a7c4-8f98d4a4ee54\") " Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.599106 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-fernet-keys" (OuterVolumeSpecName: 
"fernet-keys") pod "b461b660-8534-46fa-a7c4-8f98d4a4ee54" (UID: "b461b660-8534-46fa-a7c4-8f98d4a4ee54"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.599754 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b461b660-8534-46fa-a7c4-8f98d4a4ee54-kube-api-access-jszzk" (OuterVolumeSpecName: "kube-api-access-jszzk") pod "b461b660-8534-46fa-a7c4-8f98d4a4ee54" (UID: "b461b660-8534-46fa-a7c4-8f98d4a4ee54"). InnerVolumeSpecName "kube-api-access-jszzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.625374 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b461b660-8534-46fa-a7c4-8f98d4a4ee54" (UID: "b461b660-8534-46fa-a7c4-8f98d4a4ee54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.661677 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-config-data" (OuterVolumeSpecName: "config-data") pod "b461b660-8534-46fa-a7c4-8f98d4a4ee54" (UID: "b461b660-8534-46fa-a7c4-8f98d4a4ee54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.692792 4947 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.692835 4947 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.692853 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b461b660-8534-46fa-a7c4-8f98d4a4ee54-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.692867 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jszzk\" (UniqueName: \"kubernetes.io/projected/b461b660-8534-46fa-a7c4-8f98d4a4ee54-kube-api-access-jszzk\") on node \"crc\" DevicePath \"\"" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.946127 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412601-5kcwm" event={"ID":"b461b660-8534-46fa-a7c4-8f98d4a4ee54","Type":"ContainerDied","Data":"3d8ea659a7d56cc87df40d6b30c8e8cdabd41516b80e41b018ce65d47e6c504f"} Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.946190 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d8ea659a7d56cc87df40d6b30c8e8cdabd41516b80e41b018ce65d47e6c504f" Dec 03 10:01:07 crc kubenswrapper[4947]: I1203 10:01:07.946317 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412601-5kcwm" Dec 03 10:01:09 crc kubenswrapper[4947]: I1203 10:01:09.097372 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 10:01:09 crc kubenswrapper[4947]: I1203 10:01:09.967287 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"86c3be5e0f3c862dc07dce46bb23c0eadb5c9313006f2004bbb8084e1b25d323"} Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.391570 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b7d5r"] Dec 03 10:02:21 crc kubenswrapper[4947]: E1203 10:02:21.392640 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b461b660-8534-46fa-a7c4-8f98d4a4ee54" containerName="keystone-cron" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.392657 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b461b660-8534-46fa-a7c4-8f98d4a4ee54" containerName="keystone-cron" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.392925 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b461b660-8534-46fa-a7c4-8f98d4a4ee54" containerName="keystone-cron" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.394793 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.415418 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7d5r"] Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.529536 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-utilities\") pod \"redhat-operators-b7d5r\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.529624 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcnfr\" (UniqueName: \"kubernetes.io/projected/03fb1824-8f3c-47be-8373-1a4bd3acaf45-kube-api-access-mcnfr\") pod \"redhat-operators-b7d5r\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.529729 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-catalog-content\") pod \"redhat-operators-b7d5r\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.631844 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-utilities\") pod \"redhat-operators-b7d5r\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.632209 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mcnfr\" (UniqueName: \"kubernetes.io/projected/03fb1824-8f3c-47be-8373-1a4bd3acaf45-kube-api-access-mcnfr\") pod \"redhat-operators-b7d5r\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.632272 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-catalog-content\") pod \"redhat-operators-b7d5r\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.632566 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-utilities\") pod \"redhat-operators-b7d5r\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.632754 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-catalog-content\") pod \"redhat-operators-b7d5r\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.654710 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcnfr\" (UniqueName: \"kubernetes.io/projected/03fb1824-8f3c-47be-8373-1a4bd3acaf45-kube-api-access-mcnfr\") pod \"redhat-operators-b7d5r\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:21 crc kubenswrapper[4947]: I1203 10:02:21.720264 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:22 crc kubenswrapper[4947]: I1203 10:02:22.247808 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7d5r"] Dec 03 10:02:22 crc kubenswrapper[4947]: I1203 10:02:22.754730 4947 generic.go:334] "Generic (PLEG): container finished" podID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerID="5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2" exitCode=0 Dec 03 10:02:22 crc kubenswrapper[4947]: I1203 10:02:22.754826 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7d5r" event={"ID":"03fb1824-8f3c-47be-8373-1a4bd3acaf45","Type":"ContainerDied","Data":"5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2"} Dec 03 10:02:22 crc kubenswrapper[4947]: I1203 10:02:22.755044 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7d5r" event={"ID":"03fb1824-8f3c-47be-8373-1a4bd3acaf45","Type":"ContainerStarted","Data":"d5be0db8a5e5f0158e7afb75324977f59aa14c46a94420f4d4065d6e6638b6b9"} Dec 03 10:02:22 crc kubenswrapper[4947]: I1203 10:02:22.756575 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 10:02:23 crc kubenswrapper[4947]: I1203 10:02:23.773416 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7d5r" event={"ID":"03fb1824-8f3c-47be-8373-1a4bd3acaf45","Type":"ContainerStarted","Data":"fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975"} Dec 03 10:02:27 crc kubenswrapper[4947]: I1203 10:02:27.818857 4947 generic.go:334] "Generic (PLEG): container finished" podID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerID="fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975" exitCode=0 Dec 03 10:02:27 crc kubenswrapper[4947]: I1203 10:02:27.819357 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-b7d5r" event={"ID":"03fb1824-8f3c-47be-8373-1a4bd3acaf45","Type":"ContainerDied","Data":"fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975"} Dec 03 10:02:28 crc kubenswrapper[4947]: I1203 10:02:28.858664 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7d5r" event={"ID":"03fb1824-8f3c-47be-8373-1a4bd3acaf45","Type":"ContainerStarted","Data":"99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207"} Dec 03 10:02:28 crc kubenswrapper[4947]: I1203 10:02:28.884281 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b7d5r" podStartSLOduration=2.262692684 podStartE2EDuration="7.884253761s" podCreationTimestamp="2025-12-03 10:02:21 +0000 UTC" firstStartedPulling="2025-12-03 10:02:22.756355176 +0000 UTC m=+11604.017309602" lastFinishedPulling="2025-12-03 10:02:28.377916253 +0000 UTC m=+11609.638870679" observedRunningTime="2025-12-03 10:02:28.878069254 +0000 UTC m=+11610.139023710" watchObservedRunningTime="2025-12-03 10:02:28.884253761 +0000 UTC m=+11610.145208197" Dec 03 10:02:31 crc kubenswrapper[4947]: I1203 10:02:31.721129 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:31 crc kubenswrapper[4947]: I1203 10:02:31.722739 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:32 crc kubenswrapper[4947]: I1203 10:02:32.778298 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b7d5r" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerName="registry-server" probeResult="failure" output=< Dec 03 10:02:32 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 10:02:32 crc kubenswrapper[4947]: > Dec 03 10:02:41 crc kubenswrapper[4947]: I1203 
10:02:41.778400 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:41 crc kubenswrapper[4947]: I1203 10:02:41.842652 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:42 crc kubenswrapper[4947]: I1203 10:02:42.021423 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b7d5r"] Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.007237 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b7d5r" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerName="registry-server" containerID="cri-o://99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207" gracePeriod=2 Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.663344 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.811963 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-catalog-content\") pod \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.812311 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-utilities\") pod \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.812473 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcnfr\" (UniqueName: 
\"kubernetes.io/projected/03fb1824-8f3c-47be-8373-1a4bd3acaf45-kube-api-access-mcnfr\") pod \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\" (UID: \"03fb1824-8f3c-47be-8373-1a4bd3acaf45\") " Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.812936 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-utilities" (OuterVolumeSpecName: "utilities") pod "03fb1824-8f3c-47be-8373-1a4bd3acaf45" (UID: "03fb1824-8f3c-47be-8373-1a4bd3acaf45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.813543 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.825628 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fb1824-8f3c-47be-8373-1a4bd3acaf45-kube-api-access-mcnfr" (OuterVolumeSpecName: "kube-api-access-mcnfr") pod "03fb1824-8f3c-47be-8373-1a4bd3acaf45" (UID: "03fb1824-8f3c-47be-8373-1a4bd3acaf45"). InnerVolumeSpecName "kube-api-access-mcnfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.915898 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcnfr\" (UniqueName: \"kubernetes.io/projected/03fb1824-8f3c-47be-8373-1a4bd3acaf45-kube-api-access-mcnfr\") on node \"crc\" DevicePath \"\"" Dec 03 10:02:43 crc kubenswrapper[4947]: I1203 10:02:43.929746 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03fb1824-8f3c-47be-8373-1a4bd3acaf45" (UID: "03fb1824-8f3c-47be-8373-1a4bd3acaf45"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.018456 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03fb1824-8f3c-47be-8373-1a4bd3acaf45-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.021714 4947 generic.go:334] "Generic (PLEG): container finished" podID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerID="99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207" exitCode=0 Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.021765 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7d5r" event={"ID":"03fb1824-8f3c-47be-8373-1a4bd3acaf45","Type":"ContainerDied","Data":"99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207"} Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.021779 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b7d5r" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.021797 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7d5r" event={"ID":"03fb1824-8f3c-47be-8373-1a4bd3acaf45","Type":"ContainerDied","Data":"d5be0db8a5e5f0158e7afb75324977f59aa14c46a94420f4d4065d6e6638b6b9"} Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.021820 4947 scope.go:117] "RemoveContainer" containerID="99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.068524 4947 scope.go:117] "RemoveContainer" containerID="fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.086395 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b7d5r"] Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.095660 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b7d5r"] Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.098612 4947 scope.go:117] "RemoveContainer" containerID="5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.144636 4947 scope.go:117] "RemoveContainer" containerID="99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207" Dec 03 10:02:44 crc kubenswrapper[4947]: E1203 10:02:44.145149 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207\": container with ID starting with 99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207 not found: ID does not exist" containerID="99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.145179 4947 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207"} err="failed to get container status \"99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207\": rpc error: code = NotFound desc = could not find container \"99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207\": container with ID starting with 99f79e665852249d39c197929adc08f534961980c9bab6204b484bac2894a207 not found: ID does not exist" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.145198 4947 scope.go:117] "RemoveContainer" containerID="fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975" Dec 03 10:02:44 crc kubenswrapper[4947]: E1203 10:02:44.145607 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975\": container with ID starting with fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975 not found: ID does not exist" containerID="fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.145633 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975"} err="failed to get container status \"fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975\": rpc error: code = NotFound desc = could not find container \"fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975\": container with ID starting with fb139e1e468f1404c82ce102cdf2a76ae1a5e767dc326ee71630deb1c3107975 not found: ID does not exist" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.145652 4947 scope.go:117] "RemoveContainer" containerID="5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2" Dec 03 10:02:44 crc kubenswrapper[4947]: E1203 
10:02:44.145961 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2\": container with ID starting with 5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2 not found: ID does not exist" containerID="5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2" Dec 03 10:02:44 crc kubenswrapper[4947]: I1203 10:02:44.145987 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2"} err="failed to get container status \"5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2\": rpc error: code = NotFound desc = could not find container \"5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2\": container with ID starting with 5f4b342279754ad13d7ee8e61d7ec2402493c9a711751b848a6babb755e48ae2 not found: ID does not exist" Dec 03 10:02:45 crc kubenswrapper[4947]: I1203 10:02:45.094105 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" path="/var/lib/kubelet/pods/03fb1824-8f3c-47be-8373-1a4bd3acaf45/volumes" Dec 03 10:03:30 crc kubenswrapper[4947]: I1203 10:03:30.086694 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:03:30 crc kubenswrapper[4947]: I1203 10:03:30.087300 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 03 10:04:00 crc kubenswrapper[4947]: I1203 10:04:00.086591 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:04:00 crc kubenswrapper[4947]: I1203 10:04:00.087173 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 10:04:30 crc kubenswrapper[4947]: I1203 10:04:30.085947 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:04:30 crc kubenswrapper[4947]: I1203 10:04:30.086551 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 10:04:30 crc kubenswrapper[4947]: I1203 10:04:30.086606 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 10:04:30 crc kubenswrapper[4947]: I1203 10:04:30.087383 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86c3be5e0f3c862dc07dce46bb23c0eadb5c9313006f2004bbb8084e1b25d323"} 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 10:04:30 crc kubenswrapper[4947]: I1203 10:04:30.087486 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://86c3be5e0f3c862dc07dce46bb23c0eadb5c9313006f2004bbb8084e1b25d323" gracePeriod=600 Dec 03 10:04:30 crc kubenswrapper[4947]: I1203 10:04:30.249324 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="86c3be5e0f3c862dc07dce46bb23c0eadb5c9313006f2004bbb8084e1b25d323" exitCode=0 Dec 03 10:04:30 crc kubenswrapper[4947]: I1203 10:04:30.249382 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"86c3be5e0f3c862dc07dce46bb23c0eadb5c9313006f2004bbb8084e1b25d323"} Dec 03 10:04:30 crc kubenswrapper[4947]: I1203 10:04:30.249431 4947 scope.go:117] "RemoveContainer" containerID="7dae83c3591879e95b856e549407233b049791286c776c65ed6eb44dffaa016c" Dec 03 10:04:31 crc kubenswrapper[4947]: I1203 10:04:31.276000 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a"} Dec 03 10:05:37 crc kubenswrapper[4947]: I1203 10:05:37.818835 4947 patch_prober.go:28] interesting pod/console-operator-58897d9998-wpnnz container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded 
while awaiting headers)" start-of-body= Dec 03 10:05:37 crc kubenswrapper[4947]: I1203 10:05:37.819538 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-wpnnz" podUID="afeb9125-522c-4dea-93f1-d3ee2c58da87" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 10:05:38 crc kubenswrapper[4947]: I1203 10:05:38.740951 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="87e9571e-9b03-430a-83a3-2bc809f12a29" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.232696 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hmk8x"] Dec 03 10:05:55 crc kubenswrapper[4947]: E1203 10:05:55.233677 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerName="extract-utilities" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.233691 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerName="extract-utilities" Dec 03 10:05:55 crc kubenswrapper[4947]: E1203 10:05:55.233718 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerName="registry-server" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.233724 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerName="registry-server" Dec 03 10:05:55 crc kubenswrapper[4947]: E1203 10:05:55.233740 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerName="extract-content" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.233746 4947 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerName="extract-content" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.233952 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fb1824-8f3c-47be-8373-1a4bd3acaf45" containerName="registry-server" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.235443 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.246039 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hmk8x"] Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.362463 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wtk4\" (UniqueName: \"kubernetes.io/projected/56609e19-3944-4d51-b788-3ad1ba0450a9-kube-api-access-6wtk4\") pod \"community-operators-hmk8x\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.362635 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-utilities\") pod \"community-operators-hmk8x\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.362908 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-catalog-content\") pod \"community-operators-hmk8x\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.466577 4947 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wtk4\" (UniqueName: \"kubernetes.io/projected/56609e19-3944-4d51-b788-3ad1ba0450a9-kube-api-access-6wtk4\") pod \"community-operators-hmk8x\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.466652 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-utilities\") pod \"community-operators-hmk8x\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.466682 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-catalog-content\") pod \"community-operators-hmk8x\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.467339 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-catalog-content\") pod \"community-operators-hmk8x\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.467338 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-utilities\") pod \"community-operators-hmk8x\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.496657 4947 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6wtk4\" (UniqueName: \"kubernetes.io/projected/56609e19-3944-4d51-b788-3ad1ba0450a9-kube-api-access-6wtk4\") pod \"community-operators-hmk8x\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:55 crc kubenswrapper[4947]: I1203 10:05:55.559944 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:05:56 crc kubenswrapper[4947]: I1203 10:05:56.130819 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hmk8x"] Dec 03 10:05:56 crc kubenswrapper[4947]: W1203 10:05:56.137764 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56609e19_3944_4d51_b788_3ad1ba0450a9.slice/crio-a8d69ac458f54167af040f45bbafba173685babd5a509e7f88d18f57faef2bbd WatchSource:0}: Error finding container a8d69ac458f54167af040f45bbafba173685babd5a509e7f88d18f57faef2bbd: Status 404 returned error can't find the container with id a8d69ac458f54167af040f45bbafba173685babd5a509e7f88d18f57faef2bbd Dec 03 10:05:56 crc kubenswrapper[4947]: I1203 10:05:56.581260 4947 generic.go:334] "Generic (PLEG): container finished" podID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerID="da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab" exitCode=0 Dec 03 10:05:56 crc kubenswrapper[4947]: I1203 10:05:56.581626 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmk8x" event={"ID":"56609e19-3944-4d51-b788-3ad1ba0450a9","Type":"ContainerDied","Data":"da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab"} Dec 03 10:05:56 crc kubenswrapper[4947]: I1203 10:05:56.582445 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmk8x" 
event={"ID":"56609e19-3944-4d51-b788-3ad1ba0450a9","Type":"ContainerStarted","Data":"a8d69ac458f54167af040f45bbafba173685babd5a509e7f88d18f57faef2bbd"} Dec 03 10:05:57 crc kubenswrapper[4947]: I1203 10:05:57.600217 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmk8x" event={"ID":"56609e19-3944-4d51-b788-3ad1ba0450a9","Type":"ContainerStarted","Data":"e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a"} Dec 03 10:05:58 crc kubenswrapper[4947]: I1203 10:05:58.616138 4947 generic.go:334] "Generic (PLEG): container finished" podID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerID="e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a" exitCode=0 Dec 03 10:05:58 crc kubenswrapper[4947]: I1203 10:05:58.616197 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmk8x" event={"ID":"56609e19-3944-4d51-b788-3ad1ba0450a9","Type":"ContainerDied","Data":"e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a"} Dec 03 10:05:59 crc kubenswrapper[4947]: I1203 10:05:59.634368 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmk8x" event={"ID":"56609e19-3944-4d51-b788-3ad1ba0450a9","Type":"ContainerStarted","Data":"a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b"} Dec 03 10:05:59 crc kubenswrapper[4947]: I1203 10:05:59.676596 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hmk8x" podStartSLOduration=2.097373306 podStartE2EDuration="4.676573954s" podCreationTimestamp="2025-12-03 10:05:55 +0000 UTC" firstStartedPulling="2025-12-03 10:05:56.583337809 +0000 UTC m=+11817.844292235" lastFinishedPulling="2025-12-03 10:05:59.162538417 +0000 UTC m=+11820.423492883" observedRunningTime="2025-12-03 10:05:59.661009434 +0000 UTC m=+11820.921963910" watchObservedRunningTime="2025-12-03 10:05:59.676573954 +0000 UTC 
m=+11820.937528380" Dec 03 10:06:05 crc kubenswrapper[4947]: I1203 10:06:05.560564 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:06:05 crc kubenswrapper[4947]: I1203 10:06:05.561189 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:06:05 crc kubenswrapper[4947]: I1203 10:06:05.621078 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:06:05 crc kubenswrapper[4947]: I1203 10:06:05.742581 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.071178 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hmk8x"] Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.071781 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hmk8x" podUID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerName="registry-server" containerID="cri-o://a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b" gracePeriod=2 Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.633673 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.740441 4947 generic.go:334] "Generic (PLEG): container finished" podID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerID="a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b" exitCode=0 Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.740478 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmk8x" event={"ID":"56609e19-3944-4d51-b788-3ad1ba0450a9","Type":"ContainerDied","Data":"a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b"} Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.740538 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hmk8x" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.740555 4947 scope.go:117] "RemoveContainer" containerID="a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.740543 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hmk8x" event={"ID":"56609e19-3944-4d51-b788-3ad1ba0450a9","Type":"ContainerDied","Data":"a8d69ac458f54167af040f45bbafba173685babd5a509e7f88d18f57faef2bbd"} Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.776728 4947 scope.go:117] "RemoveContainer" containerID="e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.780016 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-utilities\") pod \"56609e19-3944-4d51-b788-3ad1ba0450a9\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.780124 4947 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6wtk4\" (UniqueName: \"kubernetes.io/projected/56609e19-3944-4d51-b788-3ad1ba0450a9-kube-api-access-6wtk4\") pod \"56609e19-3944-4d51-b788-3ad1ba0450a9\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.780305 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-catalog-content\") pod \"56609e19-3944-4d51-b788-3ad1ba0450a9\" (UID: \"56609e19-3944-4d51-b788-3ad1ba0450a9\") " Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.781696 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-utilities" (OuterVolumeSpecName: "utilities") pod "56609e19-3944-4d51-b788-3ad1ba0450a9" (UID: "56609e19-3944-4d51-b788-3ad1ba0450a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.786610 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56609e19-3944-4d51-b788-3ad1ba0450a9-kube-api-access-6wtk4" (OuterVolumeSpecName: "kube-api-access-6wtk4") pod "56609e19-3944-4d51-b788-3ad1ba0450a9" (UID: "56609e19-3944-4d51-b788-3ad1ba0450a9"). InnerVolumeSpecName "kube-api-access-6wtk4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.807999 4947 scope.go:117] "RemoveContainer" containerID="da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.840137 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56609e19-3944-4d51-b788-3ad1ba0450a9" (UID: "56609e19-3944-4d51-b788-3ad1ba0450a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.882844 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.882879 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56609e19-3944-4d51-b788-3ad1ba0450a9-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.882893 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wtk4\" (UniqueName: \"kubernetes.io/projected/56609e19-3944-4d51-b788-3ad1ba0450a9-kube-api-access-6wtk4\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.891809 4947 scope.go:117] "RemoveContainer" containerID="a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b" Dec 03 10:06:09 crc kubenswrapper[4947]: E1203 10:06:09.892210 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b\": container with ID starting with 
a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b not found: ID does not exist" containerID="a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.892244 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b"} err="failed to get container status \"a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b\": rpc error: code = NotFound desc = could not find container \"a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b\": container with ID starting with a261f1d92df36ca1240af5c255289ba01e315dd57272752cd2f363e1d6d2467b not found: ID does not exist" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.892269 4947 scope.go:117] "RemoveContainer" containerID="e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a" Dec 03 10:06:09 crc kubenswrapper[4947]: E1203 10:06:09.893028 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a\": container with ID starting with e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a not found: ID does not exist" containerID="e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.893066 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a"} err="failed to get container status \"e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a\": rpc error: code = NotFound desc = could not find container \"e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a\": container with ID starting with e949d756828469762925f0b46517fca8303ea7d50485531139ec5e409ba2535a not found: ID does not 
exist" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.893092 4947 scope.go:117] "RemoveContainer" containerID="da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab" Dec 03 10:06:09 crc kubenswrapper[4947]: E1203 10:06:09.893578 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab\": container with ID starting with da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab not found: ID does not exist" containerID="da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab" Dec 03 10:06:09 crc kubenswrapper[4947]: I1203 10:06:09.893794 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab"} err="failed to get container status \"da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab\": rpc error: code = NotFound desc = could not find container \"da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab\": container with ID starting with da6aa3c21769060387f295d8c796bd7bec91e478de3bc119707928f81f7c34ab not found: ID does not exist" Dec 03 10:06:10 crc kubenswrapper[4947]: I1203 10:06:10.081119 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hmk8x"] Dec 03 10:06:10 crc kubenswrapper[4947]: I1203 10:06:10.090976 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hmk8x"] Dec 03 10:06:11 crc kubenswrapper[4947]: I1203 10:06:11.094178 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56609e19-3944-4d51-b788-3ad1ba0450a9" path="/var/lib/kubelet/pods/56609e19-3944-4d51-b788-3ad1ba0450a9/volumes" Dec 03 10:06:30 crc kubenswrapper[4947]: I1203 10:06:30.087178 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:06:30 crc kubenswrapper[4947]: I1203 10:06:30.088057 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 10:06:46 crc kubenswrapper[4947]: I1203 10:06:46.179630 4947 generic.go:334] "Generic (PLEG): container finished" podID="e69a0cda-c7ef-4f16-a380-0d3f9385574e" containerID="55d82ab50b32c454418365b5e89b0068618bf11d602d74c3b65dc5213e617212" exitCode=0 Dec 03 10:06:46 crc kubenswrapper[4947]: I1203 10:06:46.179767 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e69a0cda-c7ef-4f16-a380-0d3f9385574e","Type":"ContainerDied","Data":"55d82ab50b32c454418365b5e89b0068618bf11d602d74c3b65dc5213e617212"} Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.644659 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.700268 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.700321 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config-secret\") pod \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.700364 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config\") pod \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.700536 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ssh-key\") pod \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.700605 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-workdir\") pod \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.700650 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-temporary\") pod \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.700738 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qql68\" (UniqueName: \"kubernetes.io/projected/e69a0cda-c7ef-4f16-a380-0d3f9385574e-kube-api-access-qql68\") pod \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.700854 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-config-data\") pod \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.700926 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ca-certs\") pod \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\" (UID: \"e69a0cda-c7ef-4f16-a380-0d3f9385574e\") " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.701998 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "e69a0cda-c7ef-4f16-a380-0d3f9385574e" (UID: "e69a0cda-c7ef-4f16-a380-0d3f9385574e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.702205 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-config-data" (OuterVolumeSpecName: "config-data") pod "e69a0cda-c7ef-4f16-a380-0d3f9385574e" (UID: "e69a0cda-c7ef-4f16-a380-0d3f9385574e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.706861 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "e69a0cda-c7ef-4f16-a380-0d3f9385574e" (UID: "e69a0cda-c7ef-4f16-a380-0d3f9385574e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.706953 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69a0cda-c7ef-4f16-a380-0d3f9385574e-kube-api-access-qql68" (OuterVolumeSpecName: "kube-api-access-qql68") pod "e69a0cda-c7ef-4f16-a380-0d3f9385574e" (UID: "e69a0cda-c7ef-4f16-a380-0d3f9385574e"). InnerVolumeSpecName "kube-api-access-qql68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.707408 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "e69a0cda-c7ef-4f16-a380-0d3f9385574e" (UID: "e69a0cda-c7ef-4f16-a380-0d3f9385574e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.741748 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "e69a0cda-c7ef-4f16-a380-0d3f9385574e" (UID: "e69a0cda-c7ef-4f16-a380-0d3f9385574e"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.751087 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e69a0cda-c7ef-4f16-a380-0d3f9385574e" (UID: "e69a0cda-c7ef-4f16-a380-0d3f9385574e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.751416 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e69a0cda-c7ef-4f16-a380-0d3f9385574e" (UID: "e69a0cda-c7ef-4f16-a380-0d3f9385574e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.765307 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e69a0cda-c7ef-4f16-a380-0d3f9385574e" (UID: "e69a0cda-c7ef-4f16-a380-0d3f9385574e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.803243 4947 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.803279 4947 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.803288 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.803322 4947 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.803335 4947 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e69a0cda-c7ef-4f16-a380-0d3f9385574e-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.803347 4947 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e69a0cda-c7ef-4f16-a380-0d3f9385574e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.803357 4947 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.803366 4947 
reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/e69a0cda-c7ef-4f16-a380-0d3f9385574e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.803375 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qql68\" (UniqueName: \"kubernetes.io/projected/e69a0cda-c7ef-4f16-a380-0d3f9385574e-kube-api-access-qql68\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.825324 4947 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 10:06:47 crc kubenswrapper[4947]: I1203 10:06:47.904643 4947 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 10:06:48 crc kubenswrapper[4947]: I1203 10:06:48.207884 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"e69a0cda-c7ef-4f16-a380-0d3f9385574e","Type":"ContainerDied","Data":"0009ee61243f9ea2309320823a931fbc41b0377c37c07392c27bb9deb9539ba2"} Dec 03 10:06:48 crc kubenswrapper[4947]: I1203 10:06:48.207930 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0009ee61243f9ea2309320823a931fbc41b0377c37c07392c27bb9deb9539ba2" Dec 03 10:06:48 crc kubenswrapper[4947]: I1203 10:06:48.207956 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.774775 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 10:06:56 crc kubenswrapper[4947]: E1203 10:06:56.776316 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerName="registry-server" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.776342 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerName="registry-server" Dec 03 10:06:56 crc kubenswrapper[4947]: E1203 10:06:56.776361 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerName="extract-utilities" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.776374 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerName="extract-utilities" Dec 03 10:06:56 crc kubenswrapper[4947]: E1203 10:06:56.776405 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerName="extract-content" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.776414 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerName="extract-content" Dec 03 10:06:56 crc kubenswrapper[4947]: E1203 10:06:56.776427 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69a0cda-c7ef-4f16-a380-0d3f9385574e" containerName="tempest-tests-tempest-tests-runner" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.776435 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69a0cda-c7ef-4f16-a380-0d3f9385574e" containerName="tempest-tests-tempest-tests-runner" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.776750 4947 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="56609e19-3944-4d51-b788-3ad1ba0450a9" containerName="registry-server" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.776765 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69a0cda-c7ef-4f16-a380-0d3f9385574e" containerName="tempest-tests-tempest-tests-runner" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.777822 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.780155 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-kk56l" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.789286 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.905197 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k9qx\" (UniqueName: \"kubernetes.io/projected/e4697a62-089a-4448-be21-9867446728b2-kube-api-access-4k9qx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4697a62-089a-4448-be21-9867446728b2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 10:06:56 crc kubenswrapper[4947]: I1203 10:06:56.905374 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4697a62-089a-4448-be21-9867446728b2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 10:06:57 crc kubenswrapper[4947]: I1203 10:06:57.007338 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k9qx\" (UniqueName: 
\"kubernetes.io/projected/e4697a62-089a-4448-be21-9867446728b2-kube-api-access-4k9qx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4697a62-089a-4448-be21-9867446728b2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 10:06:57 crc kubenswrapper[4947]: I1203 10:06:57.007859 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4697a62-089a-4448-be21-9867446728b2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 10:06:57 crc kubenswrapper[4947]: I1203 10:06:57.008331 4947 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4697a62-089a-4448-be21-9867446728b2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 10:06:57 crc kubenswrapper[4947]: I1203 10:06:57.032545 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k9qx\" (UniqueName: \"kubernetes.io/projected/e4697a62-089a-4448-be21-9867446728b2-kube-api-access-4k9qx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4697a62-089a-4448-be21-9867446728b2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 10:06:57 crc kubenswrapper[4947]: I1203 10:06:57.035695 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e4697a62-089a-4448-be21-9867446728b2\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 10:06:57 
crc kubenswrapper[4947]: I1203 10:06:57.111676 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 10:06:57 crc kubenswrapper[4947]: I1203 10:06:57.692061 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 10:06:57 crc kubenswrapper[4947]: W1203 10:06:57.702342 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4697a62_089a_4448_be21_9867446728b2.slice/crio-ad214c22586bc3598e66c7a50abf05799698336a2e833c9ce62f86f738efa252 WatchSource:0}: Error finding container ad214c22586bc3598e66c7a50abf05799698336a2e833c9ce62f86f738efa252: Status 404 returned error can't find the container with id ad214c22586bc3598e66c7a50abf05799698336a2e833c9ce62f86f738efa252 Dec 03 10:06:58 crc kubenswrapper[4947]: I1203 10:06:58.348934 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e4697a62-089a-4448-be21-9867446728b2","Type":"ContainerStarted","Data":"ad214c22586bc3598e66c7a50abf05799698336a2e833c9ce62f86f738efa252"} Dec 03 10:06:59 crc kubenswrapper[4947]: I1203 10:06:59.371221 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e4697a62-089a-4448-be21-9867446728b2","Type":"ContainerStarted","Data":"ddba198069b3bbd3b05dd8394e3e0fba9aa599878d0991f0077d890ad07c830b"} Dec 03 10:06:59 crc kubenswrapper[4947]: I1203 10:06:59.399627 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.481784184 podStartE2EDuration="3.399608771s" podCreationTimestamp="2025-12-03 10:06:56 +0000 UTC" firstStartedPulling="2025-12-03 10:06:57.706296501 +0000 UTC 
m=+11878.967250927" lastFinishedPulling="2025-12-03 10:06:58.624121088 +0000 UTC m=+11879.885075514" observedRunningTime="2025-12-03 10:06:59.38809203 +0000 UTC m=+11880.649046466" watchObservedRunningTime="2025-12-03 10:06:59.399608771 +0000 UTC m=+11880.660563207" Dec 03 10:07:00 crc kubenswrapper[4947]: I1203 10:07:00.087057 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:07:00 crc kubenswrapper[4947]: I1203 10:07:00.087138 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 10:07:30 crc kubenswrapper[4947]: I1203 10:07:30.086090 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:07:30 crc kubenswrapper[4947]: I1203 10:07:30.086838 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 10:07:30 crc kubenswrapper[4947]: I1203 10:07:30.086891 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 10:07:30 crc 
kubenswrapper[4947]: I1203 10:07:30.087679 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 10:07:30 crc kubenswrapper[4947]: I1203 10:07:30.087774 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" gracePeriod=600 Dec 03 10:07:30 crc kubenswrapper[4947]: E1203 10:07:30.297219 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:07:30 crc kubenswrapper[4947]: I1203 10:07:30.803674 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" exitCode=0 Dec 03 10:07:30 crc kubenswrapper[4947]: I1203 10:07:30.803895 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a"} Dec 03 10:07:30 crc kubenswrapper[4947]: I1203 10:07:30.804280 4947 scope.go:117] "RemoveContainer" 
containerID="86c3be5e0f3c862dc07dce46bb23c0eadb5c9313006f2004bbb8084e1b25d323" Dec 03 10:07:30 crc kubenswrapper[4947]: I1203 10:07:30.805040 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:07:30 crc kubenswrapper[4947]: E1203 10:07:30.805381 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:07:43 crc kubenswrapper[4947]: I1203 10:07:43.083999 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:07:43 crc kubenswrapper[4947]: E1203 10:07:43.084899 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:07:55 crc kubenswrapper[4947]: I1203 10:07:55.084627 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:07:55 crc kubenswrapper[4947]: E1203 10:07:55.085854 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:08:06 crc kubenswrapper[4947]: I1203 10:08:06.082993 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:08:06 crc kubenswrapper[4947]: E1203 10:08:06.083755 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.095702 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-snrdr/must-gather-kjn4s"] Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.102809 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.107885 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-snrdr"/"openshift-service-ca.crt" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.107948 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-snrdr"/"kube-root-ca.crt" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.135926 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-snrdr/must-gather-kjn4s"] Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.309790 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c762483a-2d87-4ac1-8fda-5f4044772e9a-must-gather-output\") pod \"must-gather-kjn4s\" (UID: \"c762483a-2d87-4ac1-8fda-5f4044772e9a\") " pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.310117 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcgrb\" (UniqueName: \"kubernetes.io/projected/c762483a-2d87-4ac1-8fda-5f4044772e9a-kube-api-access-xcgrb\") pod \"must-gather-kjn4s\" (UID: \"c762483a-2d87-4ac1-8fda-5f4044772e9a\") " pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.412074 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c762483a-2d87-4ac1-8fda-5f4044772e9a-must-gather-output\") pod \"must-gather-kjn4s\" (UID: \"c762483a-2d87-4ac1-8fda-5f4044772e9a\") " pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.412139 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xcgrb\" (UniqueName: \"kubernetes.io/projected/c762483a-2d87-4ac1-8fda-5f4044772e9a-kube-api-access-xcgrb\") pod \"must-gather-kjn4s\" (UID: \"c762483a-2d87-4ac1-8fda-5f4044772e9a\") " pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.412552 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c762483a-2d87-4ac1-8fda-5f4044772e9a-must-gather-output\") pod \"must-gather-kjn4s\" (UID: \"c762483a-2d87-4ac1-8fda-5f4044772e9a\") " pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.438144 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcgrb\" (UniqueName: \"kubernetes.io/projected/c762483a-2d87-4ac1-8fda-5f4044772e9a-kube-api-access-xcgrb\") pod \"must-gather-kjn4s\" (UID: \"c762483a-2d87-4ac1-8fda-5f4044772e9a\") " pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:08:13 crc kubenswrapper[4947]: I1203 10:08:13.726434 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:08:14 crc kubenswrapper[4947]: I1203 10:08:14.212386 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-snrdr/must-gather-kjn4s"] Dec 03 10:08:14 crc kubenswrapper[4947]: I1203 10:08:14.220745 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 10:08:14 crc kubenswrapper[4947]: I1203 10:08:14.941871 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/must-gather-kjn4s" event={"ID":"c762483a-2d87-4ac1-8fda-5f4044772e9a","Type":"ContainerStarted","Data":"2c5c05003292d87d2234c77eb8578ccde3ef6cb07355236a52fd10271bb8a68f"} Dec 03 10:08:18 crc kubenswrapper[4947]: I1203 10:08:18.989162 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/must-gather-kjn4s" event={"ID":"c762483a-2d87-4ac1-8fda-5f4044772e9a","Type":"ContainerStarted","Data":"75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e"} Dec 03 10:08:20 crc kubenswrapper[4947]: I1203 10:08:20.002027 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/must-gather-kjn4s" event={"ID":"c762483a-2d87-4ac1-8fda-5f4044772e9a","Type":"ContainerStarted","Data":"3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118"} Dec 03 10:08:20 crc kubenswrapper[4947]: I1203 10:08:20.049317 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-snrdr/must-gather-kjn4s" podStartSLOduration=2.937678981 podStartE2EDuration="7.049288529s" podCreationTimestamp="2025-12-03 10:08:13 +0000 UTC" firstStartedPulling="2025-12-03 10:08:14.220713922 +0000 UTC m=+11955.481668338" lastFinishedPulling="2025-12-03 10:08:18.33232346 +0000 UTC m=+11959.593277886" observedRunningTime="2025-12-03 10:08:20.041250462 +0000 UTC m=+11961.302204898" watchObservedRunningTime="2025-12-03 10:08:20.049288529 +0000 UTC 
m=+11961.310242955" Dec 03 10:08:21 crc kubenswrapper[4947]: I1203 10:08:21.084041 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:08:21 crc kubenswrapper[4947]: E1203 10:08:21.084354 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.509381 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-snrdr/crc-debug-fh5n6"] Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.511198 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.515433 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-snrdr"/"default-dockercfg-87hhk" Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.643310 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67l4m\" (UniqueName: \"kubernetes.io/projected/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-kube-api-access-67l4m\") pod \"crc-debug-fh5n6\" (UID: \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\") " pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.643510 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-host\") pod \"crc-debug-fh5n6\" (UID: \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\") " 
pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.744951 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-host\") pod \"crc-debug-fh5n6\" (UID: \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\") " pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.745041 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67l4m\" (UniqueName: \"kubernetes.io/projected/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-kube-api-access-67l4m\") pod \"crc-debug-fh5n6\" (UID: \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\") " pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.745317 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-host\") pod \"crc-debug-fh5n6\" (UID: \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\") " pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.775293 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67l4m\" (UniqueName: \"kubernetes.io/projected/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-kube-api-access-67l4m\") pod \"crc-debug-fh5n6\" (UID: \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\") " pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:08:23 crc kubenswrapper[4947]: I1203 10:08:23.838160 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:08:23 crc kubenswrapper[4947]: W1203 10:08:23.878140 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75c0cb44_ecc3_4fdd_8576_d205ca5a68aa.slice/crio-192ae906fa44ddb89e87f6f259abeeaffedc01c8293dae20a5ff2da15c4a83fe WatchSource:0}: Error finding container 192ae906fa44ddb89e87f6f259abeeaffedc01c8293dae20a5ff2da15c4a83fe: Status 404 returned error can't find the container with id 192ae906fa44ddb89e87f6f259abeeaffedc01c8293dae20a5ff2da15c4a83fe Dec 03 10:08:24 crc kubenswrapper[4947]: I1203 10:08:24.041120 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/crc-debug-fh5n6" event={"ID":"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa","Type":"ContainerStarted","Data":"192ae906fa44ddb89e87f6f259abeeaffedc01c8293dae20a5ff2da15c4a83fe"} Dec 03 10:08:32 crc kubenswrapper[4947]: I1203 10:08:32.083902 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:08:32 crc kubenswrapper[4947]: E1203 10:08:32.084700 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:08:36 crc kubenswrapper[4947]: I1203 10:08:36.204837 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/crc-debug-fh5n6" event={"ID":"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa","Type":"ContainerStarted","Data":"b5296885cbbacf29b570e03767a2e48dadea54c95bc5678b1e22a45a98265250"} Dec 03 10:08:36 crc kubenswrapper[4947]: I1203 10:08:36.224272 4947 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-snrdr/crc-debug-fh5n6" podStartSLOduration=1.5114370639999999 podStartE2EDuration="13.224248961s" podCreationTimestamp="2025-12-03 10:08:23 +0000 UTC" firstStartedPulling="2025-12-03 10:08:23.880430583 +0000 UTC m=+11965.141385009" lastFinishedPulling="2025-12-03 10:08:35.59324248 +0000 UTC m=+11976.854196906" observedRunningTime="2025-12-03 10:08:36.217773046 +0000 UTC m=+11977.478727482" watchObservedRunningTime="2025-12-03 10:08:36.224248961 +0000 UTC m=+11977.485203387" Dec 03 10:08:47 crc kubenswrapper[4947]: I1203 10:08:47.084438 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:08:47 crc kubenswrapper[4947]: E1203 10:08:47.085146 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:08:59 crc kubenswrapper[4947]: I1203 10:08:59.090009 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:08:59 crc kubenswrapper[4947]: E1203 10:08:59.090790 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:09:14 crc kubenswrapper[4947]: I1203 10:09:14.083376 4947 scope.go:117] 
"RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:09:14 crc kubenswrapper[4947]: E1203 10:09:14.084685 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:09:16 crc kubenswrapper[4947]: I1203 10:09:16.645200 4947 generic.go:334] "Generic (PLEG): container finished" podID="75c0cb44-ecc3-4fdd-8576-d205ca5a68aa" containerID="b5296885cbbacf29b570e03767a2e48dadea54c95bc5678b1e22a45a98265250" exitCode=0 Dec 03 10:09:16 crc kubenswrapper[4947]: I1203 10:09:16.645411 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/crc-debug-fh5n6" event={"ID":"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa","Type":"ContainerDied","Data":"b5296885cbbacf29b570e03767a2e48dadea54c95bc5678b1e22a45a98265250"} Dec 03 10:09:17 crc kubenswrapper[4947]: I1203 10:09:17.787889 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:09:17 crc kubenswrapper[4947]: I1203 10:09:17.873970 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-host\") pod \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\" (UID: \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\") " Dec 03 10:09:17 crc kubenswrapper[4947]: I1203 10:09:17.874044 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67l4m\" (UniqueName: \"kubernetes.io/projected/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-kube-api-access-67l4m\") pod \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\" (UID: \"75c0cb44-ecc3-4fdd-8576-d205ca5a68aa\") " Dec 03 10:09:17 crc kubenswrapper[4947]: I1203 10:09:17.874126 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-host" (OuterVolumeSpecName: "host") pod "75c0cb44-ecc3-4fdd-8576-d205ca5a68aa" (UID: "75c0cb44-ecc3-4fdd-8576-d205ca5a68aa"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 10:09:17 crc kubenswrapper[4947]: I1203 10:09:17.874630 4947 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-host\") on node \"crc\" DevicePath \"\"" Dec 03 10:09:17 crc kubenswrapper[4947]: I1203 10:09:17.881046 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-kube-api-access-67l4m" (OuterVolumeSpecName: "kube-api-access-67l4m") pod "75c0cb44-ecc3-4fdd-8576-d205ca5a68aa" (UID: "75c0cb44-ecc3-4fdd-8576-d205ca5a68aa"). InnerVolumeSpecName "kube-api-access-67l4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:09:17 crc kubenswrapper[4947]: I1203 10:09:17.916096 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-snrdr/crc-debug-fh5n6"] Dec 03 10:09:17 crc kubenswrapper[4947]: I1203 10:09:17.925925 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-snrdr/crc-debug-fh5n6"] Dec 03 10:09:17 crc kubenswrapper[4947]: I1203 10:09:17.977740 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67l4m\" (UniqueName: \"kubernetes.io/projected/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa-kube-api-access-67l4m\") on node \"crc\" DevicePath \"\"" Dec 03 10:09:18 crc kubenswrapper[4947]: I1203 10:09:18.674772 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="192ae906fa44ddb89e87f6f259abeeaffedc01c8293dae20a5ff2da15c4a83fe" Dec 03 10:09:18 crc kubenswrapper[4947]: I1203 10:09:18.674854 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-fh5n6" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.098215 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75c0cb44-ecc3-4fdd-8576-d205ca5a68aa" path="/var/lib/kubelet/pods/75c0cb44-ecc3-4fdd-8576-d205ca5a68aa/volumes" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.128052 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-snrdr/crc-debug-7w9jf"] Dec 03 10:09:19 crc kubenswrapper[4947]: E1203 10:09:19.128918 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75c0cb44-ecc3-4fdd-8576-d205ca5a68aa" containerName="container-00" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.129101 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="75c0cb44-ecc3-4fdd-8576-d205ca5a68aa" containerName="container-00" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.129468 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="75c0cb44-ecc3-4fdd-8576-d205ca5a68aa" containerName="container-00" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.130539 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.132882 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-snrdr"/"default-dockercfg-87hhk" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.208215 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-host\") pod \"crc-debug-7w9jf\" (UID: \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\") " pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.208272 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75bhl\" (UniqueName: \"kubernetes.io/projected/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-kube-api-access-75bhl\") pod \"crc-debug-7w9jf\" (UID: \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\") " pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.311108 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75bhl\" (UniqueName: \"kubernetes.io/projected/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-kube-api-access-75bhl\") pod \"crc-debug-7w9jf\" (UID: \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\") " pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.311577 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-host\") pod \"crc-debug-7w9jf\" (UID: \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\") " pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.311672 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-host\") pod \"crc-debug-7w9jf\" (UID: \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\") " pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.331105 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75bhl\" (UniqueName: \"kubernetes.io/projected/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-kube-api-access-75bhl\") pod \"crc-debug-7w9jf\" (UID: \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\") " pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.452028 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:19 crc kubenswrapper[4947]: I1203 10:09:19.690677 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/crc-debug-7w9jf" event={"ID":"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8","Type":"ContainerStarted","Data":"0661a068a716945c38cfdc855285d016b3b41a61d7a9d9454ac1b65f5fad8652"} Dec 03 10:09:20 crc kubenswrapper[4947]: I1203 10:09:20.706790 4947 generic.go:334] "Generic (PLEG): container finished" podID="675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8" containerID="d240a3a9e2987b7b16d2fdb0351e83496022074fad511bec35c73807fd9f1305" exitCode=0 Dec 03 10:09:20 crc kubenswrapper[4947]: I1203 10:09:20.707750 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/crc-debug-7w9jf" event={"ID":"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8","Type":"ContainerDied","Data":"d240a3a9e2987b7b16d2fdb0351e83496022074fad511bec35c73807fd9f1305"} Dec 03 10:09:21 crc kubenswrapper[4947]: I1203 10:09:21.612411 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-snrdr/crc-debug-7w9jf"] Dec 03 10:09:21 crc kubenswrapper[4947]: I1203 10:09:21.623905 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-snrdr/crc-debug-7w9jf"] Dec 03 10:09:21 crc kubenswrapper[4947]: I1203 10:09:21.828749 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:21 crc kubenswrapper[4947]: I1203 10:09:21.977631 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-host\") pod \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\" (UID: \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\") " Dec 03 10:09:21 crc kubenswrapper[4947]: I1203 10:09:21.977762 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-host" (OuterVolumeSpecName: "host") pod "675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8" (UID: "675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 10:09:21 crc kubenswrapper[4947]: I1203 10:09:21.978366 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75bhl\" (UniqueName: \"kubernetes.io/projected/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-kube-api-access-75bhl\") pod \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\" (UID: \"675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8\") " Dec 03 10:09:21 crc kubenswrapper[4947]: I1203 10:09:21.980647 4947 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-host\") on node \"crc\" DevicePath \"\"" Dec 03 10:09:21 crc kubenswrapper[4947]: I1203 10:09:21.990738 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-kube-api-access-75bhl" (OuterVolumeSpecName: "kube-api-access-75bhl") pod "675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8" (UID: "675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8"). 
InnerVolumeSpecName "kube-api-access-75bhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.083483 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75bhl\" (UniqueName: \"kubernetes.io/projected/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8-kube-api-access-75bhl\") on node \"crc\" DevicePath \"\"" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.728959 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0661a068a716945c38cfdc855285d016b3b41a61d7a9d9454ac1b65f5fad8652" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.729024 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-7w9jf" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.801543 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-snrdr/crc-debug-9gpmv"] Dec 03 10:09:22 crc kubenswrapper[4947]: E1203 10:09:22.802001 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8" containerName="container-00" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.802019 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8" containerName="container-00" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.802236 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8" containerName="container-00" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.802977 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.805054 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-snrdr"/"default-dockercfg-87hhk" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.900655 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k58rq\" (UniqueName: \"kubernetes.io/projected/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-kube-api-access-k58rq\") pod \"crc-debug-9gpmv\" (UID: \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\") " pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:22 crc kubenswrapper[4947]: I1203 10:09:22.901031 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-host\") pod \"crc-debug-9gpmv\" (UID: \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\") " pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.003359 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k58rq\" (UniqueName: \"kubernetes.io/projected/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-kube-api-access-k58rq\") pod \"crc-debug-9gpmv\" (UID: \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\") " pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.003429 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-host\") pod \"crc-debug-9gpmv\" (UID: \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\") " pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.003647 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-host\") pod \"crc-debug-9gpmv\" (UID: \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\") " pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.034278 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k58rq\" (UniqueName: \"kubernetes.io/projected/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-kube-api-access-k58rq\") pod \"crc-debug-9gpmv\" (UID: \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\") " pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.095716 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8" path="/var/lib/kubelet/pods/675e2e54-3cd7-4eeb-86b5-2f0fe4bd6eb8/volumes" Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.120033 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.751262 4947 generic.go:334] "Generic (PLEG): container finished" podID="b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8" containerID="1967aa1439c4196e3f161e7c2cec7db7b7f47bbfa469c8db06828bd6b1fa4198" exitCode=0 Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.751326 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/crc-debug-9gpmv" event={"ID":"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8","Type":"ContainerDied","Data":"1967aa1439c4196e3f161e7c2cec7db7b7f47bbfa469c8db06828bd6b1fa4198"} Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.751639 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/crc-debug-9gpmv" event={"ID":"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8","Type":"ContainerStarted","Data":"2d3380a9602196782fffccb00542c2bcc00a98298c78953b8397c44c74b036c9"} Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.798869 4947 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-must-gather-snrdr/crc-debug-9gpmv"] Dec 03 10:09:23 crc kubenswrapper[4947]: I1203 10:09:23.809665 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-snrdr/crc-debug-9gpmv"] Dec 03 10:09:24 crc kubenswrapper[4947]: I1203 10:09:24.893371 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:24 crc kubenswrapper[4947]: I1203 10:09:24.938322 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-host\") pod \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\" (UID: \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\") " Dec 03 10:09:24 crc kubenswrapper[4947]: I1203 10:09:24.938475 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k58rq\" (UniqueName: \"kubernetes.io/projected/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-kube-api-access-k58rq\") pod \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\" (UID: \"b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8\") " Dec 03 10:09:24 crc kubenswrapper[4947]: I1203 10:09:24.938703 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-host" (OuterVolumeSpecName: "host") pod "b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8" (UID: "b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 10:09:24 crc kubenswrapper[4947]: I1203 10:09:24.939068 4947 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-host\") on node \"crc\" DevicePath \"\"" Dec 03 10:09:24 crc kubenswrapper[4947]: I1203 10:09:24.944121 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-kube-api-access-k58rq" (OuterVolumeSpecName: "kube-api-access-k58rq") pod "b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8" (UID: "b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8"). InnerVolumeSpecName "kube-api-access-k58rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:09:25 crc kubenswrapper[4947]: I1203 10:09:25.041091 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k58rq\" (UniqueName: \"kubernetes.io/projected/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8-kube-api-access-k58rq\") on node \"crc\" DevicePath \"\"" Dec 03 10:09:25 crc kubenswrapper[4947]: I1203 10:09:25.084219 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:09:25 crc kubenswrapper[4947]: E1203 10:09:25.084995 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:09:25 crc kubenswrapper[4947]: I1203 10:09:25.099281 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8" path="/var/lib/kubelet/pods/b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8/volumes" Dec 03 10:09:25 crc 
kubenswrapper[4947]: I1203 10:09:25.775736 4947 scope.go:117] "RemoveContainer" containerID="1967aa1439c4196e3f161e7c2cec7db7b7f47bbfa469c8db06828bd6b1fa4198" Dec 03 10:09:25 crc kubenswrapper[4947]: I1203 10:09:25.775858 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snrdr/crc-debug-9gpmv" Dec 03 10:09:36 crc kubenswrapper[4947]: I1203 10:09:36.083240 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:09:36 crc kubenswrapper[4947]: E1203 10:09:36.084234 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:09:50 crc kubenswrapper[4947]: I1203 10:09:50.085153 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:09:50 crc kubenswrapper[4947]: E1203 10:09:50.086284 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.232975 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9bszr"] Dec 03 10:10:00 crc kubenswrapper[4947]: E1203 10:10:00.234341 4947 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8" containerName="container-00" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.234362 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8" containerName="container-00" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.234736 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88b5c1c-13eb-4617-ad7b-9c71ad77d2a8" containerName="container-00" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.237256 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.247728 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bszr"] Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.311039 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-utilities\") pod \"redhat-marketplace-9bszr\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.311159 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-catalog-content\") pod \"redhat-marketplace-9bszr\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.311457 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzxcw\" (UniqueName: \"kubernetes.io/projected/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-kube-api-access-xzxcw\") pod \"redhat-marketplace-9bszr\" (UID: 
\"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.413774 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-utilities\") pod \"redhat-marketplace-9bszr\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.413846 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-catalog-content\") pod \"redhat-marketplace-9bszr\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.413976 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzxcw\" (UniqueName: \"kubernetes.io/projected/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-kube-api-access-xzxcw\") pod \"redhat-marketplace-9bszr\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.414688 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-utilities\") pod \"redhat-marketplace-9bszr\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.414765 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-catalog-content\") pod \"redhat-marketplace-9bszr\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " 
pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.439085 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzxcw\" (UniqueName: \"kubernetes.io/projected/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-kube-api-access-xzxcw\") pod \"redhat-marketplace-9bszr\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:00 crc kubenswrapper[4947]: I1203 10:10:00.583040 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:01 crc kubenswrapper[4947]: I1203 10:10:01.180064 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bszr"] Dec 03 10:10:01 crc kubenswrapper[4947]: I1203 10:10:01.597311 4947 generic.go:334] "Generic (PLEG): container finished" podID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerID="b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f" exitCode=0 Dec 03 10:10:01 crc kubenswrapper[4947]: I1203 10:10:01.597405 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bszr" event={"ID":"02678e09-a6ce-4ad1-a0df-5fcea2e344fd","Type":"ContainerDied","Data":"b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f"} Dec 03 10:10:01 crc kubenswrapper[4947]: I1203 10:10:01.597625 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bszr" event={"ID":"02678e09-a6ce-4ad1-a0df-5fcea2e344fd","Type":"ContainerStarted","Data":"93638aff4202959fe32b2755c1930e592c6e89a03b155953667ef95ba00646b9"} Dec 03 10:10:02 crc kubenswrapper[4947]: I1203 10:10:02.085619 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:10:02 crc kubenswrapper[4947]: E1203 10:10:02.086047 4947 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:10:03 crc kubenswrapper[4947]: I1203 10:10:03.630867 4947 generic.go:334] "Generic (PLEG): container finished" podID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerID="9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a" exitCode=0 Dec 03 10:10:03 crc kubenswrapper[4947]: I1203 10:10:03.630971 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bszr" event={"ID":"02678e09-a6ce-4ad1-a0df-5fcea2e344fd","Type":"ContainerDied","Data":"9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a"} Dec 03 10:10:04 crc kubenswrapper[4947]: I1203 10:10:04.647912 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bszr" event={"ID":"02678e09-a6ce-4ad1-a0df-5fcea2e344fd","Type":"ContainerStarted","Data":"e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257"} Dec 03 10:10:04 crc kubenswrapper[4947]: I1203 10:10:04.672206 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9bszr" podStartSLOduration=2.128343277 podStartE2EDuration="4.672183006s" podCreationTimestamp="2025-12-03 10:10:00 +0000 UTC" firstStartedPulling="2025-12-03 10:10:01.599517126 +0000 UTC m=+12062.860471552" lastFinishedPulling="2025-12-03 10:10:04.143356855 +0000 UTC m=+12065.404311281" observedRunningTime="2025-12-03 10:10:04.667990933 +0000 UTC m=+12065.928945369" watchObservedRunningTime="2025-12-03 10:10:04.672183006 +0000 UTC m=+12065.933137432" Dec 03 10:10:10 crc kubenswrapper[4947]: I1203 10:10:10.584120 4947 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:10 crc kubenswrapper[4947]: I1203 10:10:10.584578 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:10 crc kubenswrapper[4947]: I1203 10:10:10.657569 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:10 crc kubenswrapper[4947]: I1203 10:10:10.760308 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:10 crc kubenswrapper[4947]: I1203 10:10:10.897119 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bszr"] Dec 03 10:10:12 crc kubenswrapper[4947]: I1203 10:10:12.735451 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9bszr" podUID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerName="registry-server" containerID="cri-o://e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257" gracePeriod=2 Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.270656 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.291914 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzxcw\" (UniqueName: \"kubernetes.io/projected/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-kube-api-access-xzxcw\") pod \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.292359 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-utilities\") pod \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.292405 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-catalog-content\") pod \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\" (UID: \"02678e09-a6ce-4ad1-a0df-5fcea2e344fd\") " Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.293064 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-utilities" (OuterVolumeSpecName: "utilities") pod "02678e09-a6ce-4ad1-a0df-5fcea2e344fd" (UID: "02678e09-a6ce-4ad1-a0df-5fcea2e344fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.297539 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-kube-api-access-xzxcw" (OuterVolumeSpecName: "kube-api-access-xzxcw") pod "02678e09-a6ce-4ad1-a0df-5fcea2e344fd" (UID: "02678e09-a6ce-4ad1-a0df-5fcea2e344fd"). InnerVolumeSpecName "kube-api-access-xzxcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.313943 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02678e09-a6ce-4ad1-a0df-5fcea2e344fd" (UID: "02678e09-a6ce-4ad1-a0df-5fcea2e344fd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.394735 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.394769 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.394779 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzxcw\" (UniqueName: \"kubernetes.io/projected/02678e09-a6ce-4ad1-a0df-5fcea2e344fd-kube-api-access-xzxcw\") on node \"crc\" DevicePath \"\"" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.750667 4947 generic.go:334] "Generic (PLEG): container finished" podID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerID="e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257" exitCode=0 Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.750744 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9bszr" event={"ID":"02678e09-a6ce-4ad1-a0df-5fcea2e344fd","Type":"ContainerDied","Data":"e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257"} Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.750776 4947 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-9bszr" event={"ID":"02678e09-a6ce-4ad1-a0df-5fcea2e344fd","Type":"ContainerDied","Data":"93638aff4202959fe32b2755c1930e592c6e89a03b155953667ef95ba00646b9"} Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.750795 4947 scope.go:117] "RemoveContainer" containerID="e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.750936 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9bszr" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.799570 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bszr"] Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.800955 4947 scope.go:117] "RemoveContainer" containerID="9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.811768 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9bszr"] Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.833322 4947 scope.go:117] "RemoveContainer" containerID="b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.887913 4947 scope.go:117] "RemoveContainer" containerID="e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257" Dec 03 10:10:13 crc kubenswrapper[4947]: E1203 10:10:13.888700 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257\": container with ID starting with e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257 not found: ID does not exist" containerID="e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.888755 4947 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257"} err="failed to get container status \"e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257\": rpc error: code = NotFound desc = could not find container \"e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257\": container with ID starting with e36e13ea1a81ed847ed769e1a4f305e31fc0353ac5dc7f1633cce8eecc583257 not found: ID does not exist" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.888787 4947 scope.go:117] "RemoveContainer" containerID="9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a" Dec 03 10:10:13 crc kubenswrapper[4947]: E1203 10:10:13.889141 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a\": container with ID starting with 9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a not found: ID does not exist" containerID="9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.889185 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a"} err="failed to get container status \"9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a\": rpc error: code = NotFound desc = could not find container \"9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a\": container with ID starting with 9540f9e5752083f3d12adcfe30bffb038e462da34ced64b4595ec8cb82d6d02a not found: ID does not exist" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.889265 4947 scope.go:117] "RemoveContainer" containerID="b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f" Dec 03 10:10:13 crc kubenswrapper[4947]: E1203 
10:10:13.889564 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f\": container with ID starting with b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f not found: ID does not exist" containerID="b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f" Dec 03 10:10:13 crc kubenswrapper[4947]: I1203 10:10:13.889596 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f"} err="failed to get container status \"b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f\": rpc error: code = NotFound desc = could not find container \"b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f\": container with ID starting with b04695283d50641c59771dc20ca804d2ed3d0d26d5d3e6e6bca6ea8ffb75e46f not found: ID does not exist" Dec 03 10:10:14 crc kubenswrapper[4947]: I1203 10:10:14.082942 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:10:14 crc kubenswrapper[4947]: E1203 10:10:14.083368 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:10:15 crc kubenswrapper[4947]: I1203 10:10:15.104553 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" path="/var/lib/kubelet/pods/02678e09-a6ce-4ad1-a0df-5fcea2e344fd/volumes" Dec 03 10:10:26 crc kubenswrapper[4947]: I1203 10:10:26.082737 
4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:10:26 crc kubenswrapper[4947]: E1203 10:10:26.083602 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.135411 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d601641a-4a35-480d-a171-e058a567adcd/init-config-reloader/0.log" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.329256 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d601641a-4a35-480d-a171-e058a567adcd/config-reloader/0.log" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.343783 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d601641a-4a35-480d-a171-e058a567adcd/alertmanager/0.log" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.580073 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_d601641a-4a35-480d-a171-e058a567adcd/init-config-reloader/0.log" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.702808 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_5f2e2ca8-0e8a-40eb-92f6-838120ef08c0/aodh-api/0.log" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.741025 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_5f2e2ca8-0e8a-40eb-92f6-838120ef08c0/aodh-listener/0.log" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.784133 4947 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_5f2e2ca8-0e8a-40eb-92f6-838120ef08c0/aodh-notifier/0.log" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.792263 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_5f2e2ca8-0e8a-40eb-92f6-838120ef08c0/aodh-evaluator/0.log" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.932967 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-76dd78b6c8-phtkk_3e86fc66-942a-4e83-8969-6d18c93ba3e3/barbican-api/0.log" Dec 03 10:10:27 crc kubenswrapper[4947]: I1203 10:10:27.950926 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-76dd78b6c8-phtkk_3e86fc66-942a-4e83-8969-6d18c93ba3e3/barbican-api-log/0.log" Dec 03 10:10:28 crc kubenswrapper[4947]: I1203 10:10:28.224289 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6789c6d659-8b7jf_f055623c-9e1a-4793-a9d0-fc56e71d8df5/barbican-worker/0.log" Dec 03 10:10:28 crc kubenswrapper[4947]: I1203 10:10:28.232981 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-74b59955b-vlhfv_13615835-dfc2-4dd0-8dc3-518683077f9f/barbican-keystone-listener/0.log" Dec 03 10:10:28 crc kubenswrapper[4947]: I1203 10:10:28.508998 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6789c6d659-8b7jf_f055623c-9e1a-4793-a9d0-fc56e71d8df5/barbican-worker-log/0.log" Dec 03 10:10:28 crc kubenswrapper[4947]: I1203 10:10:28.539636 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-hv6px_490f2f91-8ec4-42da-bee9-65ebd38a7492/bootstrap-openstack-openstack-cell1/0.log" Dec 03 10:10:28 crc kubenswrapper[4947]: I1203 10:10:28.574391 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-74b59955b-vlhfv_13615835-dfc2-4dd0-8dc3-518683077f9f/barbican-keystone-listener-log/0.log" Dec 03 10:10:28 crc kubenswrapper[4947]: I1203 10:10:28.744415 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell2-d84ng_fe124b64-001a-435a-8096-764e2a71097b/bootstrap-openstack-openstack-cell2/0.log" Dec 03 10:10:28 crc kubenswrapper[4947]: I1203 10:10:28.828933 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87e9571e-9b03-430a-83a3-2bc809f12a29/ceilometer-central-agent/0.log" Dec 03 10:10:28 crc kubenswrapper[4947]: I1203 10:10:28.966431 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87e9571e-9b03-430a-83a3-2bc809f12a29/proxy-httpd/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.060105 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87e9571e-9b03-430a-83a3-2bc809f12a29/ceilometer-notification-agent/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.063105 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_87e9571e-9b03-430a-83a3-2bc809f12a29/sg-core/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.235928 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e617284-a25d-439b-ac2d-5b795a63ea06/cinder-api/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.273553 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_7e617284-a25d-439b-ac2d-5b795a63ea06/cinder-api-log/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.416660 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e73b990b-2218-4b3e-9f09-3200fc4d668d/cinder-scheduler/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.538154 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-vxc8z_a6949c8f-209b-45db-9297-e7de78baa4ca/configure-network-openstack-openstack-cell1/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.560597 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e73b990b-2218-4b3e-9f09-3200fc4d668d/probe/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.761452 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell2-nswm8_6607a030-ff9e-4b09-b42b-11c78be5d094/configure-network-openstack-openstack-cell2/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.782002 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-j4sjz_a6d93b30-a643-4185-975a-a650351e70d5/configure-os-openstack-openstack-cell1/0.log" Dec 03 10:10:29 crc kubenswrapper[4947]: I1203 10:10:29.994056 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell2-fmlct_29fab9b2-d93c-432c-8563-f28b9af0e313/configure-os-openstack-openstack-cell2/0.log" Dec 03 10:10:30 crc kubenswrapper[4947]: I1203 10:10:30.028805 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7558866fff-lkt2j_6a4c0c95-ce41-4f47-8900-52da56018f73/init/0.log" Dec 03 10:10:30 crc kubenswrapper[4947]: I1203 10:10:30.262232 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7558866fff-lkt2j_6a4c0c95-ce41-4f47-8900-52da56018f73/init/0.log" Dec 03 10:10:30 crc kubenswrapper[4947]: I1203 10:10:30.332759 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-4b7ns_e168aa9d-0251-46ec-8237-596594385a28/download-cache-openstack-openstack-cell1/0.log" Dec 03 10:10:30 crc kubenswrapper[4947]: I1203 10:10:30.358409 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-7558866fff-lkt2j_6a4c0c95-ce41-4f47-8900-52da56018f73/dnsmasq-dns/0.log" Dec 03 10:10:30 crc kubenswrapper[4947]: I1203 10:10:30.538112 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell2-l4vbx_53b2f919-6f0d-475b-a5fe-c59abec0ccbb/download-cache-openstack-openstack-cell2/0.log" Dec 03 10:10:30 crc kubenswrapper[4947]: I1203 10:10:30.647838 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_50c34849-5902-436a-965e-5de5d52d6853/glance-httpd/0.log" Dec 03 10:10:30 crc kubenswrapper[4947]: I1203 10:10:30.662295 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_50c34849-5902-436a-965e-5de5d52d6853/glance-log/0.log" Dec 03 10:10:30 crc kubenswrapper[4947]: I1203 10:10:30.783036 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_05aacc20-a81e-4e25-88f4-518cf128bcab/glance-httpd/0.log" Dec 03 10:10:30 crc kubenswrapper[4947]: I1203 10:10:30.872961 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_05aacc20-a81e-4e25-88f4-518cf128bcab/glance-log/0.log" Dec 03 10:10:31 crc kubenswrapper[4947]: I1203 10:10:31.004014 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-5f576c8c7-hqbg9_d4ff2437-317a-4e48-9cb7-d001f05ccbee/heat-api/0.log" Dec 03 10:10:31 crc kubenswrapper[4947]: I1203 10:10:31.127431 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-b598dd84c-gtjdb_6cef161a-7885-4753-9cbe-8ee4d59ebc94/heat-cfnapi/0.log" Dec 03 10:10:31 crc kubenswrapper[4947]: I1203 10:10:31.160334 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-5dbdc8fbbd-cxvpc_ab8a377e-bb3b-4cc3-afb8-d7cbe44a3e06/heat-engine/0.log" Dec 03 10:10:31 crc kubenswrapper[4947]: I1203 10:10:31.372136 
4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-55945f8875-6c2qk_3ed26802-4951-4e81-bf89-7fec1e488b7b/horizon/0.log" Dec 03 10:10:31 crc kubenswrapper[4947]: I1203 10:10:31.425319 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-qkqhk_9079c043-f2ef-421d-b8cd-81dea5769f02/install-certs-openstack-openstack-cell1/0.log" Dec 03 10:10:31 crc kubenswrapper[4947]: I1203 10:10:31.441426 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-55945f8875-6c2qk_3ed26802-4951-4e81-bf89-7fec1e488b7b/horizon-log/0.log" Dec 03 10:10:31 crc kubenswrapper[4947]: I1203 10:10:31.846776 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell2-42v76_4c493b7d-12ef-47ed-825b-8de463ffb17b/install-certs-openstack-openstack-cell2/0.log" Dec 03 10:10:31 crc kubenswrapper[4947]: I1203 10:10:31.891380 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-9szzl_3bb31b8a-1ed3-421b-a9a7-65ae531c90dd/install-os-openstack-openstack-cell1/0.log" Dec 03 10:10:31 crc kubenswrapper[4947]: I1203 10:10:31.961682 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell2-5x8jx_fce62e9e-4b15-4a9b-9ac9-da544976d2fe/install-os-openstack-openstack-cell2/0.log" Dec 03 10:10:32 crc kubenswrapper[4947]: I1203 10:10:32.176935 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412541-j7np7_7c09f9ec-b2ae-4620-9815-20949c1c08ba/keystone-cron/0.log" Dec 03 10:10:32 crc kubenswrapper[4947]: I1203 10:10:32.287358 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412601-5kcwm_b461b660-8534-46fa-a7c4-8f98d4a4ee54/keystone-cron/0.log" Dec 03 10:10:32 crc kubenswrapper[4947]: I1203 10:10:32.462270 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_bd56123a-a7b6-431e-aaa2-375bff6c9627/kube-state-metrics/0.log" Dec 03 10:10:32 crc kubenswrapper[4947]: I1203 10:10:32.535623 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7f49ddfb6d-86f4p_4261d958-be0d-4748-801a-6a08686d5e46/keystone-api/0.log" Dec 03 10:10:32 crc kubenswrapper[4947]: I1203 10:10:32.545212 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-6dmhf_1207aa81-9a4e-4189-b19c-65b393c24b4c/libvirt-openstack-openstack-cell1/0.log" Dec 03 10:10:32 crc kubenswrapper[4947]: I1203 10:10:32.695571 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell2-knn2l_9ea8f5d8-aea6-44e9-995f-537fd9e2f655/libvirt-openstack-openstack-cell2/0.log" Dec 03 10:10:33 crc kubenswrapper[4947]: I1203 10:10:33.139930 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-r5zr9_aa6d0967-a3d2-4da9-8ee8-c7318ac8d49f/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 03 10:10:33 crc kubenswrapper[4947]: I1203 10:10:33.146048 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b4c4b69d9-lgsnm_94a693aa-cb18-4936-8a2a-2a7fcc5feec4/neutron-httpd/0.log" Dec 03 10:10:33 crc kubenswrapper[4947]: I1203 10:10:33.398152 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell2-bhnzg_93ccd72a-194e-4207-bdfc-ed594d06f81c/neutron-dhcp-openstack-openstack-cell2/0.log" Dec 03 10:10:33 crc kubenswrapper[4947]: I1203 10:10:33.481303 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7b4c4b69d9-lgsnm_94a693aa-cb18-4936-8a2a-2a7fcc5feec4/neutron-api/0.log" Dec 03 10:10:33 crc kubenswrapper[4947]: I1203 10:10:33.636745 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-5pxn4_a4f58536-7db6-4b2a-89c7-a043893e4543/neutron-metadata-openstack-openstack-cell1/0.log" Dec 03 10:10:33 crc kubenswrapper[4947]: I1203 10:10:33.776270 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell2-ksjwl_231eb955-7b76-4bc9-b221-df73a0d8aae2/neutron-metadata-openstack-openstack-cell2/0.log" Dec 03 10:10:33 crc kubenswrapper[4947]: I1203 10:10:33.861934 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-dzdfr_1b9f4a70-dc36-4a8f-a879-3d74365bd0fb/neutron-sriov-openstack-openstack-cell1/0.log" Dec 03 10:10:34 crc kubenswrapper[4947]: I1203 10:10:34.131142 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell2-2g4j6_35f362a4-fcac-4d8b-a76b-558353ead5e5/neutron-sriov-openstack-openstack-cell2/0.log" Dec 03 10:10:34 crc kubenswrapper[4947]: I1203 10:10:34.270239 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b901c287-ea30-475d-8f6b-96ccd6463604/nova-api-api/0.log" Dec 03 10:10:34 crc kubenswrapper[4947]: I1203 10:10:34.527790 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_d04ce99a-4178-4396-b6d3-c6485432d85f/nova-cell0-conductor-conductor/0.log" Dec 03 10:10:34 crc kubenswrapper[4947]: I1203 10:10:34.647710 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b901c287-ea30-475d-8f6b-96ccd6463604/nova-api-log/0.log" Dec 03 10:10:34 crc kubenswrapper[4947]: I1203 10:10:34.722238 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_abc69d54-9ae1-4797-b8c8-9beb82eacff3/nova-cell1-conductor-conductor/0.log" Dec 03 10:10:34 crc kubenswrapper[4947]: I1203 10:10:34.741603 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_ab6dc8c6-2e15-4572-8793-c862fc651be5/memcached/0.log" Dec 03 10:10:34 crc kubenswrapper[4947]: I1203 10:10:34.823089 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_4948e0fe-dd3d-451b-9f21-92a047aef4a1/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 10:10:34 crc kubenswrapper[4947]: I1203 10:10:34.990694 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellv6mxv_9b2e7f66-e905-447d-b029-e27add784fcc/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 03 10:10:35 crc kubenswrapper[4947]: I1203 10:10:35.079890 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-qz8md_fad89668-b52f-4a0c-a347-08242c3d566d/nova-cell1-openstack-openstack-cell1/0.log" Dec 03 10:10:35 crc kubenswrapper[4947]: I1203 10:10:35.224781 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell2-conductor-0_acc4c853-250e-435c-9c44-3d026fd5e8ca/nova-cell2-conductor-conductor/0.log" Dec 03 10:10:35 crc kubenswrapper[4947]: I1203 10:10:35.494234 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell2-novncproxy-0_fada3a26-6dc9-44dc-9662-e6e2111cbf72/nova-cell2-novncproxy-novncproxy/0.log" Dec 03 10:10:35 crc kubenswrapper[4947]: I1203 10:10:35.686964 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cellzlkr2_2c41415b-4450-464c-bdcf-16da30cc83bb/nova-cell2-openstack-nova-compute-ffu-cell2-openstack-cell2/0.log" Dec 03 10:10:35 crc kubenswrapper[4947]: I1203 10:10:35.760799 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell2-openstack-openstack-cell2-nlwj4_6366976b-5812-49e2-84dd-6ff069eecd14/nova-cell2-openstack-openstack-cell2/0.log" Dec 03 10:10:35 crc kubenswrapper[4947]: I1203 
10:10:35.902245 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell3-conductor-0_765d2e96-579a-4b09-8df0-eee649a5620b/nova-cell3-conductor-conductor/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.005951 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell3-novncproxy-0_7957cf89-0ff1-4dcf-841f-735eb853dd8c/nova-cell3-novncproxy-novncproxy/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.067482 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6f498eea-ba21-4c7e-8fc0-761ebb08a859/nova-metadata-log/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.210539 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_6f498eea-ba21-4c7e-8fc0-761ebb08a859/nova-metadata-metadata/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.336358 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_d59dc211-4f20-4e7b-8e31-a387f4e4467d/nova-scheduler-scheduler/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.356057 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ec58c8e4-6d4f-4c79-a474-98ca677bc508/mysql-bootstrap/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.517666 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ec58c8e4-6d4f-4c79-a474-98ca677bc508/galera/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.542750 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell2-galera-0_01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5/mysql-bootstrap/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.576685 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ec58c8e4-6d4f-4c79-a474-98ca677bc508/mysql-bootstrap/0.log" Dec 03 10:10:36 crc 
kubenswrapper[4947]: I1203 10:10:36.780720 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell2-galera-0_01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5/mysql-bootstrap/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.791054 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell3-galera-0_e7f8614a-2b9a-43e5-ae0b-2ca7b2749819/mysql-bootstrap/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.795462 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell2-galera-0_01d9f67e-d65a-4f13-9bc6-3f7fcd68d8c5/galera/0.log" Dec 03 10:10:36 crc kubenswrapper[4947]: I1203 10:10:36.982716 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell3-galera-0_e7f8614a-2b9a-43e5-ae0b-2ca7b2749819/mysql-bootstrap/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.032142 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7cf44fd8-460f-4a2e-8e17-4b34f5158c24/mysql-bootstrap/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.101693 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell3-galera-0_e7f8614a-2b9a-43e5-ae0b-2ca7b2749819/galera/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.235270 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7cf44fd8-460f-4a2e-8e17-4b34f5158c24/mysql-bootstrap/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.268651 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7cf44fd8-460f-4a2e-8e17-4b34f5158c24/galera/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.302648 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3de32e8a-5c89-4684-b401-cdac764e2b5b/openstackclient/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.447853 4947 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_523c147f-a1b7-44cb-bd8b-3c917311b905/openstack-network-exporter/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.471430 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_523c147f-a1b7-44cb-bd8b-3c917311b905/ovn-northd/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.532052 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-zmkh5_45bf01ea-193c-4b97-85a1-135a04051991/ovn-openstack-openstack-cell1/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.704043 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell2-hk42j_793dd67d-fca6-4985-86d1-bd542f7e84e3/ovn-openstack-openstack-cell2/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.738095 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9372c287-0c89-4690-8d86-74825aa7e960/openstack-network-exporter/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.794386 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9372c287-0c89-4690-8d86-74825aa7e960/ovsdbserver-nb/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.905252 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_8fb8263d-b49d-43b1-8133-88972776e3a1/openstack-network-exporter/0.log" Dec 03 10:10:37 crc kubenswrapper[4947]: I1203 10:10:37.912335 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_8fb8263d-b49d-43b1-8133-88972776e3a1/ovsdbserver-nb/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.026441 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_621edb84-a88e-4aa5-9be0-1c50edbe0c80/openstack-network-exporter/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 
10:10:38.092821 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_621edb84-a88e-4aa5-9be0-1c50edbe0c80/ovsdbserver-nb/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.124264 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf/openstack-network-exporter/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.213237 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b67d8e2f-5a5f-446c-8285-2d7ea07cfeaf/ovsdbserver-sb/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.309607 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_b935af14-28eb-4ae6-b66a-c7405f1f55f5/openstack-network-exporter/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.366316 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_b935af14-28eb-4ae6-b66a-c7405f1f55f5/ovsdbserver-sb/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.492664 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_e23e1eaf-4865-4a29-b5db-00de006317dc/ovsdbserver-sb/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.503068 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_e23e1eaf-4865-4a29-b5db-00de006317dc/openstack-network-exporter/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.693529 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59fd9c98dd-nkt77_49837ac9-0624-4e5a-ad0a-8ed6eed35e8d/placement-api/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.817518 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59fd9c98dd-nkt77_49837ac9-0624-4e5a-ad0a-8ed6eed35e8d/placement-log/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.824009 4947 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cdv6g5_6b81d648-89b9-4c0b-b312-4a667b806f59/pre-adoption-validation-openstack-pre-adoption-openstack-cell2/0.log" Dec 03 10:10:38 crc kubenswrapper[4947]: I1203 10:10:38.944091 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cp4hql_7d2e53d2-0d5b-4895-802b-83538cc2fb92/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.023708 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8647ce59-dce8-4205-9ba9-2f5dfbbea9bb/init-config-reloader/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.092224 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:10:39 crc kubenswrapper[4947]: E1203 10:10:39.092517 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.317142 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8647ce59-dce8-4205-9ba9-2f5dfbbea9bb/init-config-reloader/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.334714 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8647ce59-dce8-4205-9ba9-2f5dfbbea9bb/config-reloader/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.397535 4947 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8647ce59-dce8-4205-9ba9-2f5dfbbea9bb/prometheus/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.437633 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_8647ce59-dce8-4205-9ba9-2f5dfbbea9bb/thanos-sidecar/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.506739 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a0056672-0468-47c1-a7d6-5c5479eef74e/setup-container/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.680769 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a0056672-0468-47c1-a7d6-5c5479eef74e/rabbitmq/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.708657 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a0056672-0468-47c1-a7d6-5c5479eef74e/setup-container/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.723710 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell2-server-0_2d4140e8-1bc3-4ee0-9355-5135833ce0d8/setup-container/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.877630 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell2-server-0_2d4140e8-1bc3-4ee0-9355-5135833ce0d8/setup-container/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.945045 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell2-server-0_2d4140e8-1bc3-4ee0-9355-5135833ce0d8/rabbitmq/0.log" Dec 03 10:10:39 crc kubenswrapper[4947]: I1203 10:10:39.951372 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell3-server-0_687851ea-a5db-409c-9562-726c0d59a375/setup-container/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.127269 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell3-server-0_687851ea-a5db-409c-9562-726c0d59a375/setup-container/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.168859 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a895404f-5bdf-4f7a-999f-8312a567c1d5/setup-container/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.183513 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell3-server-0_687851ea-a5db-409c-9562-726c0d59a375/rabbitmq/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.343453 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a895404f-5bdf-4f7a-999f-8312a567c1d5/setup-container/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.380905 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-2t7q6_295f9a9c-86ec-41c0-8a8f-ca4bbf72fc11/reboot-os-openstack-openstack-cell1/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.563128 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a895404f-5bdf-4f7a-999f-8312a567c1d5/rabbitmq/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.564335 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell2-pqmhv_44b8db5a-143f-459e-be84-316ae4a51d47/reboot-os-openstack-openstack-cell2/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.693526 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-fxwdw_6534752f-c509-4d12-9123-599ba4eb1da7/run-os-openstack-openstack-cell1/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.745097 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell2-qtsdm_1aed45c4-3852-4919-b581-d82a139c3f03/run-os-openstack-openstack-cell2/0.log" Dec 03 
10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.871783 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-qcps8_f1af51fb-9081-44b1-9408-7d267894140c/ssh-known-hosts-openstack/0.log" Dec 03 10:10:40 crc kubenswrapper[4947]: I1203 10:10:40.991715 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-64c8bd4d48-79lrk_4b5182fc-c85f-4e7a-960c-a2aa59cc653b/proxy-server/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.084354 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-64c8bd4d48-79lrk_4b5182fc-c85f-4e7a-960c-a2aa59cc653b/proxy-httpd/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.116055 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-jn9hw_e870e8ba-007a-48cd-bfbb-fa8051fabf33/swift-ring-rebalance/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.282799 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-49rkq_50db48fc-571d-4587-a07e-64ed238f82a9/telemetry-openstack-openstack-cell1/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.359314 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell2-sbspq_42d0e62c-8bf7-4830-a450-d8cef6764abe/telemetry-openstack-openstack-cell2/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.453046 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_e69a0cda-c7ef-4f16-a380-0d3f9385574e/tempest-tests-tempest-tests-runner/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.523662 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e4697a62-089a-4448-be21-9867446728b2/test-operator-logs-container/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.626746 4947 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-6tdnr_0c3e5b3c-7650-4636-963d-a108c90aab4c/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.790103 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell2-cnn9l_1033702d-9d8f-40b5-8435-2e4cb6850bed/tripleo-cleanup-tripleo-cleanup-openstack-cell2/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.895600 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-2r4v7_5067a94c-5435-42d8-9e27-e5dfd3abe35f/validate-network-openstack-openstack-cell1/0.log" Dec 03 10:10:41 crc kubenswrapper[4947]: I1203 10:10:41.969990 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell2-h4hrv_76762737-e806-4f14-bde0-1a8daa074767/validate-network-openstack-openstack-cell2/0.log" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.512090 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z7pfv"] Dec 03 10:10:50 crc kubenswrapper[4947]: E1203 10:10:50.513277 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerName="registry-server" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.513295 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerName="registry-server" Dec 03 10:10:50 crc kubenswrapper[4947]: E1203 10:10:50.513321 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerName="extract-content" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.513328 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" 
containerName="extract-content" Dec 03 10:10:50 crc kubenswrapper[4947]: E1203 10:10:50.513342 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerName="extract-utilities" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.513350 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerName="extract-utilities" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.513688 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="02678e09-a6ce-4ad1-a0df-5fcea2e344fd" containerName="registry-server" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.515688 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.534951 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7pfv"] Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.547130 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdfgp\" (UniqueName: \"kubernetes.io/projected/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-kube-api-access-vdfgp\") pod \"certified-operators-z7pfv\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.547216 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-catalog-content\") pod \"certified-operators-z7pfv\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.547601 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-utilities\") pod \"certified-operators-z7pfv\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.649721 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-utilities\") pod \"certified-operators-z7pfv\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.649846 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdfgp\" (UniqueName: \"kubernetes.io/projected/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-kube-api-access-vdfgp\") pod \"certified-operators-z7pfv\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.649876 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-catalog-content\") pod \"certified-operators-z7pfv\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.650263 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-utilities\") pod \"certified-operators-z7pfv\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.652654 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-catalog-content\") pod \"certified-operators-z7pfv\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.668470 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdfgp\" (UniqueName: \"kubernetes.io/projected/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-kube-api-access-vdfgp\") pod \"certified-operators-z7pfv\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:50 crc kubenswrapper[4947]: I1203 10:10:50.860075 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:10:51 crc kubenswrapper[4947]: I1203 10:10:51.414327 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z7pfv"] Dec 03 10:10:52 crc kubenswrapper[4947]: I1203 10:10:52.139632 4947 generic.go:334] "Generic (PLEG): container finished" podID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerID="0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65" exitCode=0 Dec 03 10:10:52 crc kubenswrapper[4947]: I1203 10:10:52.139686 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7pfv" event={"ID":"881ec715-1c98-4c39-a9bc-0c78b6cc48ad","Type":"ContainerDied","Data":"0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65"} Dec 03 10:10:52 crc kubenswrapper[4947]: I1203 10:10:52.139714 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7pfv" event={"ID":"881ec715-1c98-4c39-a9bc-0c78b6cc48ad","Type":"ContainerStarted","Data":"cdf2ff4466553df26f5f780993d69edbcb24d5f21b0da1dd0e0f30f03a7cd64c"} Dec 03 10:10:54 crc kubenswrapper[4947]: I1203 10:10:54.084061 4947 scope.go:117] "RemoveContainer" 
containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:10:54 crc kubenswrapper[4947]: E1203 10:10:54.085036 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:10:54 crc kubenswrapper[4947]: I1203 10:10:54.168384 4947 generic.go:334] "Generic (PLEG): container finished" podID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerID="cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699" exitCode=0 Dec 03 10:10:54 crc kubenswrapper[4947]: I1203 10:10:54.168440 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7pfv" event={"ID":"881ec715-1c98-4c39-a9bc-0c78b6cc48ad","Type":"ContainerDied","Data":"cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699"} Dec 03 10:10:55 crc kubenswrapper[4947]: I1203 10:10:55.183228 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7pfv" event={"ID":"881ec715-1c98-4c39-a9bc-0c78b6cc48ad","Type":"ContainerStarted","Data":"81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381"} Dec 03 10:10:55 crc kubenswrapper[4947]: I1203 10:10:55.204542 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z7pfv" podStartSLOduration=2.622389653 podStartE2EDuration="5.204506136s" podCreationTimestamp="2025-12-03 10:10:50 +0000 UTC" firstStartedPulling="2025-12-03 10:10:52.142420731 +0000 UTC m=+12113.403375157" lastFinishedPulling="2025-12-03 10:10:54.724537214 +0000 UTC m=+12115.985491640" observedRunningTime="2025-12-03 10:10:55.201105224 
+0000 UTC m=+12116.462059680" watchObservedRunningTime="2025-12-03 10:10:55.204506136 +0000 UTC m=+12116.465460562" Dec 03 10:11:00 crc kubenswrapper[4947]: I1203 10:11:00.860974 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:11:00 crc kubenswrapper[4947]: I1203 10:11:00.862432 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:11:00 crc kubenswrapper[4947]: I1203 10:11:00.917239 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:11:01 crc kubenswrapper[4947]: I1203 10:11:01.312404 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:11:01 crc kubenswrapper[4947]: I1203 10:11:01.360485 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7pfv"] Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.271348 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z7pfv" podUID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerName="registry-server" containerID="cri-o://81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381" gracePeriod=2 Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.774454 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.842833 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-utilities\") pod \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.843008 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-catalog-content\") pod \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.843092 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdfgp\" (UniqueName: \"kubernetes.io/projected/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-kube-api-access-vdfgp\") pod \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\" (UID: \"881ec715-1c98-4c39-a9bc-0c78b6cc48ad\") " Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.843878 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-utilities" (OuterVolumeSpecName: "utilities") pod "881ec715-1c98-4c39-a9bc-0c78b6cc48ad" (UID: "881ec715-1c98-4c39-a9bc-0c78b6cc48ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.849392 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-kube-api-access-vdfgp" (OuterVolumeSpecName: "kube-api-access-vdfgp") pod "881ec715-1c98-4c39-a9bc-0c78b6cc48ad" (UID: "881ec715-1c98-4c39-a9bc-0c78b6cc48ad"). InnerVolumeSpecName "kube-api-access-vdfgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.911602 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "881ec715-1c98-4c39-a9bc-0c78b6cc48ad" (UID: "881ec715-1c98-4c39-a9bc-0c78b6cc48ad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.946333 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.946377 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdfgp\" (UniqueName: \"kubernetes.io/projected/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-kube-api-access-vdfgp\") on node \"crc\" DevicePath \"\"" Dec 03 10:11:03 crc kubenswrapper[4947]: I1203 10:11:03.946387 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/881ec715-1c98-4c39-a9bc-0c78b6cc48ad-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.284360 4947 generic.go:334] "Generic (PLEG): container finished" podID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerID="81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381" exitCode=0 Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.284410 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z7pfv" event={"ID":"881ec715-1c98-4c39-a9bc-0c78b6cc48ad","Type":"ContainerDied","Data":"81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381"} Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.284441 4947 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-z7pfv" event={"ID":"881ec715-1c98-4c39-a9bc-0c78b6cc48ad","Type":"ContainerDied","Data":"cdf2ff4466553df26f5f780993d69edbcb24d5f21b0da1dd0e0f30f03a7cd64c"} Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.284460 4947 scope.go:117] "RemoveContainer" containerID="81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.284416 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z7pfv" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.326131 4947 scope.go:117] "RemoveContainer" containerID="cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.331118 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z7pfv"] Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.342620 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z7pfv"] Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.351230 4947 scope.go:117] "RemoveContainer" containerID="0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.409995 4947 scope.go:117] "RemoveContainer" containerID="81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381" Dec 03 10:11:04 crc kubenswrapper[4947]: E1203 10:11:04.410295 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381\": container with ID starting with 81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381 not found: ID does not exist" containerID="81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 
10:11:04.410324 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381"} err="failed to get container status \"81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381\": rpc error: code = NotFound desc = could not find container \"81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381\": container with ID starting with 81ea700022c93db3c042b773d48f13070fe14ba8c47163e7f19d88919d438381 not found: ID does not exist" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.410344 4947 scope.go:117] "RemoveContainer" containerID="cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699" Dec 03 10:11:04 crc kubenswrapper[4947]: E1203 10:11:04.410631 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699\": container with ID starting with cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699 not found: ID does not exist" containerID="cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.410660 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699"} err="failed to get container status \"cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699\": rpc error: code = NotFound desc = could not find container \"cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699\": container with ID starting with cc6b4c72801850b54d39e2faabb2ab3a12d825d575921de55d89b0c74b033699 not found: ID does not exist" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.410676 4947 scope.go:117] "RemoveContainer" containerID="0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65" Dec 03 10:11:04 crc 
kubenswrapper[4947]: E1203 10:11:04.410890 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65\": container with ID starting with 0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65 not found: ID does not exist" containerID="0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65" Dec 03 10:11:04 crc kubenswrapper[4947]: I1203 10:11:04.410918 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65"} err="failed to get container status \"0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65\": rpc error: code = NotFound desc = could not find container \"0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65\": container with ID starting with 0c80fa3aa139e6e2561c0bebc1adf574c72df17c28495a4381ca2658e1715a65 not found: ID does not exist" Dec 03 10:11:05 crc kubenswrapper[4947]: I1203 10:11:05.101226 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" path="/var/lib/kubelet/pods/881ec715-1c98-4c39-a9bc-0c78b6cc48ad/volumes" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.055542 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz_b7144393-2451-4a57-8755-5d18064bec1a/util/0.log" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.317447 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz_b7144393-2451-4a57-8755-5d18064bec1a/util/0.log" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.342095 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz_b7144393-2451-4a57-8755-5d18064bec1a/pull/0.log" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.357544 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz_b7144393-2451-4a57-8755-5d18064bec1a/pull/0.log" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.562676 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz_b7144393-2451-4a57-8755-5d18064bec1a/pull/0.log" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.653844 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz_b7144393-2451-4a57-8755-5d18064bec1a/util/0.log" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.660160 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_98dc3bd0b5c63de8bc52e3558b9d3e72fafafb6fd127fd2510d22068649jmbz_b7144393-2451-4a57-8755-5d18064bec1a/extract/0.log" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.828718 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-26vkh_888543d1-0ade-4e16-8965-a0ecb6fd65a7/kube-rbac-proxy/0.log" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.868795 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-g9h4f_83997d19-6166-445a-a2bd-15acf15fa18d/kube-rbac-proxy/0.log" Dec 03 10:11:06 crc kubenswrapper[4947]: I1203 10:11:06.960202 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-26vkh_888543d1-0ade-4e16-8965-a0ecb6fd65a7/manager/0.log" Dec 03 10:11:07 crc 
kubenswrapper[4947]: I1203 10:11:07.089426 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:11:07 crc kubenswrapper[4947]: E1203 10:11:07.089761 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:11:07 crc kubenswrapper[4947]: I1203 10:11:07.144152 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-g9h4f_83997d19-6166-445a-a2bd-15acf15fa18d/manager/0.log" Dec 03 10:11:07 crc kubenswrapper[4947]: I1203 10:11:07.228734 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-mg8v4_e266debc-b844-4b60-bbbf-c038b61a7ab8/manager/0.log" Dec 03 10:11:07 crc kubenswrapper[4947]: I1203 10:11:07.237384 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-mg8v4_e266debc-b844-4b60-bbbf-c038b61a7ab8/kube-rbac-proxy/0.log" Dec 03 10:11:07 crc kubenswrapper[4947]: I1203 10:11:07.361590 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-nvsv7_528fb24c-b835-446a-84c8-fce6b4e4815c/kube-rbac-proxy/0.log" Dec 03 10:11:07 crc kubenswrapper[4947]: I1203 10:11:07.534334 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-nvsv7_528fb24c-b835-446a-84c8-fce6b4e4815c/manager/0.log" Dec 03 10:11:07 crc kubenswrapper[4947]: I1203 10:11:07.588918 4947 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-hfz6c_1e069b5f-1572-4c42-b34e-74c49b4b6940/kube-rbac-proxy/0.log" Dec 03 10:11:07 crc kubenswrapper[4947]: I1203 10:11:07.607034 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-hfz6c_1e069b5f-1572-4c42-b34e-74c49b4b6940/manager/0.log" Dec 03 10:11:07 crc kubenswrapper[4947]: I1203 10:11:07.994014 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gl8qx_13d55077-1827-43a0-a985-85db61855cb3/kube-rbac-proxy/0.log" Dec 03 10:11:08 crc kubenswrapper[4947]: I1203 10:11:08.016574 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-gl8qx_13d55077-1827-43a0-a985-85db61855cb3/manager/0.log" Dec 03 10:11:08 crc kubenswrapper[4947]: I1203 10:11:08.211305 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-htpcw_b30f0b12-f386-4968-98fd-e2272aa1b2f9/kube-rbac-proxy/0.log" Dec 03 10:11:08 crc kubenswrapper[4947]: I1203 10:11:08.249728 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-lt6cp_59be11f8-72a2-45cc-b690-951bda0d87be/kube-rbac-proxy/0.log" Dec 03 10:11:08 crc kubenswrapper[4947]: I1203 10:11:08.511722 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-lt6cp_59be11f8-72a2-45cc-b690-951bda0d87be/manager/0.log" Dec 03 10:11:08 crc kubenswrapper[4947]: I1203 10:11:08.533695 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-htpcw_b30f0b12-f386-4968-98fd-e2272aa1b2f9/manager/0.log" Dec 03 10:11:08 crc 
kubenswrapper[4947]: I1203 10:11:08.597662 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-qzhrh_fd59ba56-51b7-4260-9b1f-e3bee0916e06/kube-rbac-proxy/0.log" Dec 03 10:11:08 crc kubenswrapper[4947]: I1203 10:11:08.758442 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-qzhrh_fd59ba56-51b7-4260-9b1f-e3bee0916e06/manager/0.log" Dec 03 10:11:08 crc kubenswrapper[4947]: I1203 10:11:08.799145 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-bdgzf_fbf40e80-a5ef-41d4-ad63-b060d52be33f/kube-rbac-proxy/0.log" Dec 03 10:11:08 crc kubenswrapper[4947]: I1203 10:11:08.826796 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-bdgzf_fbf40e80-a5ef-41d4-ad63-b060d52be33f/manager/0.log" Dec 03 10:11:08 crc kubenswrapper[4947]: I1203 10:11:08.976564 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-hflxr_b537173f-b7c8-426d-bf40-8bb6ece17177/kube-rbac-proxy/0.log" Dec 03 10:11:09 crc kubenswrapper[4947]: I1203 10:11:09.064047 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-hflxr_b537173f-b7c8-426d-bf40-8bb6ece17177/manager/0.log" Dec 03 10:11:09 crc kubenswrapper[4947]: I1203 10:11:09.178417 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-n9p7d_f67211f5-3446-471e-8124-52d1d18dadbe/kube-rbac-proxy/0.log" Dec 03 10:11:09 crc kubenswrapper[4947]: I1203 10:11:09.348637 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-lwmcn_0681e281-d288-430f-b175-1d5c36593c9a/kube-rbac-proxy/0.log" Dec 03 10:11:09 crc kubenswrapper[4947]: I1203 10:11:09.350922 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-n9p7d_f67211f5-3446-471e-8124-52d1d18dadbe/manager/0.log" Dec 03 10:11:09 crc kubenswrapper[4947]: I1203 10:11:09.646866 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-lwmcn_0681e281-d288-430f-b175-1d5c36593c9a/manager/0.log" Dec 03 10:11:09 crc kubenswrapper[4947]: I1203 10:11:09.680854 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mh7hb_0f8b1d0b-522a-413b-a689-39044ec47286/manager/0.log" Dec 03 10:11:09 crc kubenswrapper[4947]: I1203 10:11:09.690220 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-mh7hb_0f8b1d0b-522a-413b-a689-39044ec47286/kube-rbac-proxy/0.log" Dec 03 10:11:09 crc kubenswrapper[4947]: I1203 10:11:09.857527 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55d86b6686l6m2r_de69b4ac-b81b-4074-8e69-2ec717ecd70b/kube-rbac-proxy/0.log" Dec 03 10:11:09 crc kubenswrapper[4947]: I1203 10:11:09.878897 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55d86b6686l6m2r_de69b4ac-b81b-4074-8e69-2ec717ecd70b/manager/0.log" Dec 03 10:11:10 crc kubenswrapper[4947]: I1203 10:11:10.221920 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7dd5c7bb7c-m52kd_b6f8f031-fa7a-4e17-88ae-3a27974fa5f1/operator/0.log" Dec 03 10:11:10 crc kubenswrapper[4947]: I1203 
10:11:10.492173 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-b5bpz_b548d942-a26b-47aa-b352-23afd3148288/kube-rbac-proxy/0.log" Dec 03 10:11:10 crc kubenswrapper[4947]: I1203 10:11:10.518709 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cxbp6_8309e8b6-d88f-4bba-bdcf-d52ab10570aa/registry-server/0.log" Dec 03 10:11:10 crc kubenswrapper[4947]: I1203 10:11:10.662377 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-b5bpz_b548d942-a26b-47aa-b352-23afd3148288/manager/0.log" Dec 03 10:11:10 crc kubenswrapper[4947]: I1203 10:11:10.733654 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-llv5g_0703a4f3-6732-44a1-b690-fcea6eb2228d/kube-rbac-proxy/0.log" Dec 03 10:11:10 crc kubenswrapper[4947]: I1203 10:11:10.835384 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-llv5g_0703a4f3-6732-44a1-b690-fcea6eb2228d/manager/0.log" Dec 03 10:11:11 crc kubenswrapper[4947]: I1203 10:11:11.050476 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-72sgg_e8e59da3-af36-42c7-9c78-98608089eaea/kube-rbac-proxy/0.log" Dec 03 10:11:11 crc kubenswrapper[4947]: I1203 10:11:11.103956 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-sk2lv_ee4fc346-902a-4be8-9bf4-081b58b2c547/operator/0.log" Dec 03 10:11:11 crc kubenswrapper[4947]: I1203 10:11:11.215169 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-72sgg_e8e59da3-af36-42c7-9c78-98608089eaea/manager/0.log" Dec 03 10:11:11 crc 
kubenswrapper[4947]: I1203 10:11:11.288504 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-gdq5s_6999c194-78a5-48db-9e56-59f65d9e11c1/kube-rbac-proxy/0.log" Dec 03 10:11:11 crc kubenswrapper[4947]: I1203 10:11:11.528831 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-9bpvr_76116ad0-e325-41c1-a25e-9089331c52ba/kube-rbac-proxy/0.log" Dec 03 10:11:11 crc kubenswrapper[4947]: I1203 10:11:11.561776 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-9bpvr_76116ad0-e325-41c1-a25e-9089331c52ba/manager/0.log" Dec 03 10:11:11 crc kubenswrapper[4947]: I1203 10:11:11.584187 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-gdq5s_6999c194-78a5-48db-9e56-59f65d9e11c1/manager/0.log" Dec 03 10:11:11 crc kubenswrapper[4947]: I1203 10:11:11.874519 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-vksgw_bab921ef-a66f-48fe-87fc-fa040bc09b2e/manager/0.log" Dec 03 10:11:11 crc kubenswrapper[4947]: I1203 10:11:11.940744 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-vksgw_bab921ef-a66f-48fe-87fc-fa040bc09b2e/kube-rbac-proxy/0.log" Dec 03 10:11:12 crc kubenswrapper[4947]: I1203 10:11:12.787911 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-9f56fc979-wpbkc_549a3cff-42c6-45ea-8e4c-36c4aa29457c/manager/0.log" Dec 03 10:11:19 crc kubenswrapper[4947]: I1203 10:11:19.091319 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:11:19 crc kubenswrapper[4947]: E1203 
10:11:19.092073 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:11:32 crc kubenswrapper[4947]: I1203 10:11:32.784384 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ljd4p_81985cd6-f5a4-46f3-9c71-448dc4f3bee6/control-plane-machine-set-operator/0.log" Dec 03 10:11:32 crc kubenswrapper[4947]: I1203 10:11:32.856295 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-klrbv_272dc0cc-9856-4375-b9f3-0ce1b543f30f/kube-rbac-proxy/0.log" Dec 03 10:11:32 crc kubenswrapper[4947]: I1203 10:11:32.900744 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-klrbv_272dc0cc-9856-4375-b9f3-0ce1b543f30f/machine-api-operator/0.log" Dec 03 10:11:34 crc kubenswrapper[4947]: I1203 10:11:34.083888 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:11:34 crc kubenswrapper[4947]: E1203 10:11:34.084226 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:11:45 crc kubenswrapper[4947]: I1203 10:11:45.416046 4947 log.go:25] "Finished parsing log 
file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-grghg_b3404cf1-3794-4aeb-badb-55bede44e49e/cert-manager-controller/0.log" Dec 03 10:11:45 crc kubenswrapper[4947]: I1203 10:11:45.604745 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-p55bx_4dcf4bb6-050a-4327-bff0-d1e10bf87782/cert-manager-cainjector/0.log" Dec 03 10:11:45 crc kubenswrapper[4947]: I1203 10:11:45.657947 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-zk967_c04fa588-2ca2-4f54-abe9-167468be5bab/cert-manager-webhook/0.log" Dec 03 10:11:47 crc kubenswrapper[4947]: I1203 10:11:47.083730 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:11:47 crc kubenswrapper[4947]: E1203 10:11:47.084043 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:11:58 crc kubenswrapper[4947]: I1203 10:11:58.445125 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-w2fjr_05b1f704-be0b-42bd-bee0-533712c2fa3b/nmstate-console-plugin/0.log" Dec 03 10:11:58 crc kubenswrapper[4947]: I1203 10:11:58.674263 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-gnfbv_4eaa6896-fdad-4d14-bca7-a155e8ddfa63/nmstate-handler/0.log" Dec 03 10:11:58 crc kubenswrapper[4947]: I1203 10:11:58.744965 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8smps_acc10835-1647-4727-9bda-15a99886aec1/kube-rbac-proxy/0.log" Dec 03 10:11:58 crc kubenswrapper[4947]: I1203 10:11:58.783831 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-8smps_acc10835-1647-4727-9bda-15a99886aec1/nmstate-metrics/0.log" Dec 03 10:11:58 crc kubenswrapper[4947]: I1203 10:11:58.926421 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-nvjl2_81eb265d-a63b-40bb-be3e-9d7ed1012a86/nmstate-operator/0.log" Dec 03 10:11:59 crc kubenswrapper[4947]: I1203 10:11:59.027646 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-vbwhn_ddd645aa-f1f2-4657-acde-beac87387ecc/nmstate-webhook/0.log" Dec 03 10:12:02 crc kubenswrapper[4947]: I1203 10:12:02.082572 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:12:02 crc kubenswrapper[4947]: E1203 10:12:02.083387 4947 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.013081 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-smdq8_dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74/kube-rbac-proxy/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.089481 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:12:15 crc kubenswrapper[4947]: E1203 10:12:15.101232 4947 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qv8tj_openshift-machine-config-operator(8384efba-0256-458d-8aab-627ad76e643e)\"" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.288317 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-frr-files/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.425271 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-frr-files/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.461073 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-reloader/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.480419 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-smdq8_dc6d1608-f8d6-4a0d-ace5-cad1cf4d2e74/controller/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.529162 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-metrics/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.596277 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-reloader/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.805259 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-metrics/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.813004 4947 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-frr-files/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.845150 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-reloader/0.log" Dec 03 10:12:15 crc kubenswrapper[4947]: I1203 10:12:15.855891 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-metrics/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.063653 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-frr-files/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.107352 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/controller/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.142695 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-reloader/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.160468 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/cp-metrics/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.367419 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/kube-rbac-proxy/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.384325 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/frr-metrics/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.405047 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/kube-rbac-proxy-frr/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.619551 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/reloader/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.659693 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-lhpmz_b3a50a88-0743-4bc2-831c-65de7fbf4bb5/frr-k8s-webhook-server/0.log" Dec 03 10:12:16 crc kubenswrapper[4947]: I1203 10:12:16.878417 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6844cc4bd8-5bv87_e6cdefe4-13ac-46b3-8b10-dfdd75ece90a/manager/0.log" Dec 03 10:12:17 crc kubenswrapper[4947]: I1203 10:12:17.185047 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-swvdq_84e9b3ca-43fe-49fb-b14c-3837ce889acb/kube-rbac-proxy/0.log" Dec 03 10:12:17 crc kubenswrapper[4947]: I1203 10:12:17.199911 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-646746987c-7qg2b_502b6d02-4507-4965-b356-64edddc1b97b/webhook-server/0.log" Dec 03 10:12:18 crc kubenswrapper[4947]: I1203 10:12:18.253043 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-swvdq_84e9b3ca-43fe-49fb-b14c-3837ce889acb/speaker/0.log" Dec 03 10:12:19 crc kubenswrapper[4947]: I1203 10:12:19.897555 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4jbhl_8704521a-a01e-42a0-a985-dcaa9756de9f/frr/0.log" Dec 03 10:12:31 crc kubenswrapper[4947]: I1203 10:12:31.083067 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:12:32 crc kubenswrapper[4947]: I1203 10:12:32.021428 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr_f692755d-a958-4fc0-9908-0c088cb8b85a/util/0.log" Dec 03 10:12:32 crc kubenswrapper[4947]: I1203 10:12:32.303005 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr_f692755d-a958-4fc0-9908-0c088cb8b85a/util/0.log" Dec 03 10:12:32 crc kubenswrapper[4947]: I1203 10:12:32.400448 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr_f692755d-a958-4fc0-9908-0c088cb8b85a/pull/0.log" Dec 03 10:12:32 crc kubenswrapper[4947]: I1203 10:12:32.403068 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr_f692755d-a958-4fc0-9908-0c088cb8b85a/pull/0.log" Dec 03 10:12:32 crc kubenswrapper[4947]: I1203 10:12:32.541769 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr_f692755d-a958-4fc0-9908-0c088cb8b85a/util/0.log" Dec 03 10:12:32 crc kubenswrapper[4947]: I1203 10:12:32.568325 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr_f692755d-a958-4fc0-9908-0c088cb8b85a/pull/0.log" Dec 03 10:12:32 crc kubenswrapper[4947]: I1203 10:12:32.590622 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931azmgrr_f692755d-a958-4fc0-9908-0c088cb8b85a/extract/0.log" Dec 03 10:12:32 crc kubenswrapper[4947]: I1203 10:12:32.814037 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76_50f2761d-ae32-4e29-9404-a1d5eb5141ef/util/0.log" Dec 03 
10:12:32 crc kubenswrapper[4947]: I1203 10:12:32.979100 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76_50f2761d-ae32-4e29-9404-a1d5eb5141ef/util/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.022113 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76_50f2761d-ae32-4e29-9404-a1d5eb5141ef/pull/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.023398 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76_50f2761d-ae32-4e29-9404-a1d5eb5141ef/pull/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.157919 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76_50f2761d-ae32-4e29-9404-a1d5eb5141ef/pull/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.246748 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76_50f2761d-ae32-4e29-9404-a1d5eb5141ef/extract/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.279851 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"18952965d88233ea1cd500ad7f5c57f60779f5beacd002313da06142d2b92b6a"} Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.365752 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh_1f7d61d9-044a-4bf6-8993-7e95e6dc289b/util/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.466794 4947 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fb2n76_50f2761d-ae32-4e29-9404-a1d5eb5141ef/util/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.646209 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh_1f7d61d9-044a-4bf6-8993-7e95e6dc289b/pull/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.646314 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh_1f7d61d9-044a-4bf6-8993-7e95e6dc289b/pull/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.663453 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh_1f7d61d9-044a-4bf6-8993-7e95e6dc289b/util/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.849812 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh_1f7d61d9-044a-4bf6-8993-7e95e6dc289b/pull/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.935169 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh_1f7d61d9-044a-4bf6-8993-7e95e6dc289b/util/0.log" Dec 03 10:12:33 crc kubenswrapper[4947]: I1203 10:12:33.936453 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210874xh_1f7d61d9-044a-4bf6-8993-7e95e6dc289b/extract/0.log" Dec 03 10:12:34 crc kubenswrapper[4947]: I1203 10:12:34.031955 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l_2091a52a-0cd3-4b46-93d8-efacf220dd22/util/0.log" Dec 03 10:12:34 crc kubenswrapper[4947]: I1203 10:12:34.262978 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l_2091a52a-0cd3-4b46-93d8-efacf220dd22/pull/0.log" Dec 03 10:12:34 crc kubenswrapper[4947]: I1203 10:12:34.280408 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l_2091a52a-0cd3-4b46-93d8-efacf220dd22/util/0.log" Dec 03 10:12:34 crc kubenswrapper[4947]: I1203 10:12:34.298602 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l_2091a52a-0cd3-4b46-93d8-efacf220dd22/pull/0.log" Dec 03 10:12:34 crc kubenswrapper[4947]: I1203 10:12:34.467863 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l_2091a52a-0cd3-4b46-93d8-efacf220dd22/util/0.log" Dec 03 10:12:34 crc kubenswrapper[4947]: I1203 10:12:34.469663 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l_2091a52a-0cd3-4b46-93d8-efacf220dd22/pull/0.log" Dec 03 10:12:34 crc kubenswrapper[4947]: I1203 10:12:34.566319 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83p4j4l_2091a52a-0cd3-4b46-93d8-efacf220dd22/extract/0.log" Dec 03 10:12:34 crc kubenswrapper[4947]: I1203 10:12:34.903846 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mg856_54ba1b59-3e43-4354-bd44-006cfb0b69a2/extract-utilities/0.log" Dec 03 10:12:35 crc 
kubenswrapper[4947]: I1203 10:12:35.147338 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mg856_54ba1b59-3e43-4354-bd44-006cfb0b69a2/extract-utilities/0.log" Dec 03 10:12:35 crc kubenswrapper[4947]: I1203 10:12:35.174981 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mg856_54ba1b59-3e43-4354-bd44-006cfb0b69a2/extract-content/0.log" Dec 03 10:12:35 crc kubenswrapper[4947]: I1203 10:12:35.175088 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mg856_54ba1b59-3e43-4354-bd44-006cfb0b69a2/extract-content/0.log" Dec 03 10:12:35 crc kubenswrapper[4947]: I1203 10:12:35.363416 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mg856_54ba1b59-3e43-4354-bd44-006cfb0b69a2/extract-content/0.log" Dec 03 10:12:35 crc kubenswrapper[4947]: I1203 10:12:35.367593 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mg856_54ba1b59-3e43-4354-bd44-006cfb0b69a2/extract-utilities/0.log" Dec 03 10:12:35 crc kubenswrapper[4947]: I1203 10:12:35.625303 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-mg856_54ba1b59-3e43-4354-bd44-006cfb0b69a2/registry-server/0.log" Dec 03 10:12:35 crc kubenswrapper[4947]: I1203 10:12:35.684742 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7hgc_b8028669-dbfd-4d2b-bdfb-440f156211b9/extract-utilities/0.log" Dec 03 10:12:35 crc kubenswrapper[4947]: I1203 10:12:35.772130 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7hgc_b8028669-dbfd-4d2b-bdfb-440f156211b9/extract-utilities/0.log" Dec 03 10:12:35 crc kubenswrapper[4947]: I1203 10:12:35.813474 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-m7hgc_b8028669-dbfd-4d2b-bdfb-440f156211b9/extract-content/0.log" Dec 03 10:12:35 crc kubenswrapper[4947]: I1203 10:12:35.833651 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7hgc_b8028669-dbfd-4d2b-bdfb-440f156211b9/extract-content/0.log" Dec 03 10:12:36 crc kubenswrapper[4947]: I1203 10:12:36.017826 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7hgc_b8028669-dbfd-4d2b-bdfb-440f156211b9/extract-utilities/0.log" Dec 03 10:12:36 crc kubenswrapper[4947]: I1203 10:12:36.024843 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7hgc_b8028669-dbfd-4d2b-bdfb-440f156211b9/extract-content/0.log" Dec 03 10:12:36 crc kubenswrapper[4947]: I1203 10:12:36.069471 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-j8fz6_258aafb0-b050-4b10-b8b5-339e387f81ae/marketplace-operator/0.log" Dec 03 10:12:36 crc kubenswrapper[4947]: I1203 10:12:36.319667 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j45sc_780d43c8-0dea-47ad-95cc-26801572c76d/extract-utilities/0.log" Dec 03 10:12:36 crc kubenswrapper[4947]: I1203 10:12:36.509275 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j45sc_780d43c8-0dea-47ad-95cc-26801572c76d/extract-content/0.log" Dec 03 10:12:36 crc kubenswrapper[4947]: I1203 10:12:36.537170 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j45sc_780d43c8-0dea-47ad-95cc-26801572c76d/extract-content/0.log" Dec 03 10:12:36 crc kubenswrapper[4947]: I1203 10:12:36.592000 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-j45sc_780d43c8-0dea-47ad-95cc-26801572c76d/extract-utilities/0.log" Dec 03 10:12:36 crc kubenswrapper[4947]: I1203 10:12:36.801090 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j45sc_780d43c8-0dea-47ad-95cc-26801572c76d/extract-utilities/0.log" Dec 03 10:12:36 crc kubenswrapper[4947]: I1203 10:12:36.857947 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j45sc_780d43c8-0dea-47ad-95cc-26801572c76d/extract-content/0.log" Dec 03 10:12:37 crc kubenswrapper[4947]: I1203 10:12:37.042182 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7pxlc_104bf72c-b8e1-4daf-834e-42342cd02813/extract-utilities/0.log" Dec 03 10:12:37 crc kubenswrapper[4947]: I1203 10:12:37.257529 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7pxlc_104bf72c-b8e1-4daf-834e-42342cd02813/extract-content/0.log" Dec 03 10:12:37 crc kubenswrapper[4947]: I1203 10:12:37.296735 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7pxlc_104bf72c-b8e1-4daf-834e-42342cd02813/extract-utilities/0.log" Dec 03 10:12:37 crc kubenswrapper[4947]: I1203 10:12:37.370112 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7pxlc_104bf72c-b8e1-4daf-834e-42342cd02813/extract-content/0.log" Dec 03 10:12:37 crc kubenswrapper[4947]: I1203 10:12:37.529649 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-j45sc_780d43c8-0dea-47ad-95cc-26801572c76d/registry-server/0.log" Dec 03 10:12:37 crc kubenswrapper[4947]: I1203 10:12:37.579349 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7pxlc_104bf72c-b8e1-4daf-834e-42342cd02813/extract-utilities/0.log" 
Dec 03 10:12:37 crc kubenswrapper[4947]: I1203 10:12:37.603198 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7pxlc_104bf72c-b8e1-4daf-834e-42342cd02813/extract-content/0.log" Dec 03 10:12:37 crc kubenswrapper[4947]: I1203 10:12:37.874936 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m7hgc_b8028669-dbfd-4d2b-bdfb-440f156211b9/registry-server/0.log" Dec 03 10:12:38 crc kubenswrapper[4947]: I1203 10:12:38.185999 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-7pxlc_104bf72c-b8e1-4daf-834e-42342cd02813/registry-server/0.log" Dec 03 10:12:51 crc kubenswrapper[4947]: I1203 10:12:51.048389 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-vg27w_dc90cebb-4774-4b00-bd82-852b5c5af24d/prometheus-operator/0.log" Dec 03 10:12:51 crc kubenswrapper[4947]: I1203 10:12:51.247754 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6f69fbf5bf-wbfw7_79345650-86f1-4fed-a0af-1a1cdb2fe403/prometheus-operator-admission-webhook/0.log" Dec 03 10:12:51 crc kubenswrapper[4947]: I1203 10:12:51.307101 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6f69fbf5bf-zppkk_a0c92f55-578e-4ab0-8a31-7a3df001a16c/prometheus-operator-admission-webhook/0.log" Dec 03 10:12:51 crc kubenswrapper[4947]: I1203 10:12:51.444881 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-d9hvf_5c86178b-2737-47e6-8146-3f508757d0ba/operator/0.log" Dec 03 10:12:51 crc kubenswrapper[4947]: I1203 10:12:51.503058 4947 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-kj52t_13b9e98a-c5f9-4ce7-b114-bfd48c42c147/perses-operator/0.log" Dec 03 10:14:35 crc kubenswrapper[4947]: I1203 10:14:35.658180 4947 scope.go:117] "RemoveContainer" containerID="b5296885cbbacf29b570e03767a2e48dadea54c95bc5678b1e22a45a98265250" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.085987 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.086522 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.153982 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9"] Dec 03 10:15:00 crc kubenswrapper[4947]: E1203 10:15:00.154569 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerName="extract-content" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.154587 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerName="extract-content" Dec 03 10:15:00 crc kubenswrapper[4947]: E1203 10:15:00.154605 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerName="registry-server" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.154612 4947 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerName="registry-server" Dec 03 10:15:00 crc kubenswrapper[4947]: E1203 10:15:00.154634 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerName="extract-utilities" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.154640 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerName="extract-utilities" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.154851 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="881ec715-1c98-4c39-a9bc-0c78b6cc48ad" containerName="registry-server" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.155705 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.161052 4947 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.161100 4947 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.165308 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9"] Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.268795 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-secret-volume\") pod \"collect-profiles-29412615-8rjk9\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.268897 4947 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b5rp\" (UniqueName: \"kubernetes.io/projected/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-kube-api-access-2b5rp\") pod \"collect-profiles-29412615-8rjk9\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.268970 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-config-volume\") pod \"collect-profiles-29412615-8rjk9\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.370607 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b5rp\" (UniqueName: \"kubernetes.io/projected/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-kube-api-access-2b5rp\") pod \"collect-profiles-29412615-8rjk9\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.370963 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-config-volume\") pod \"collect-profiles-29412615-8rjk9\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.371090 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-secret-volume\") pod \"collect-profiles-29412615-8rjk9\" (UID: 
\"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.371897 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-config-volume\") pod \"collect-profiles-29412615-8rjk9\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.388552 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-secret-volume\") pod \"collect-profiles-29412615-8rjk9\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.397907 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b5rp\" (UniqueName: \"kubernetes.io/projected/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-kube-api-access-2b5rp\") pod \"collect-profiles-29412615-8rjk9\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.494041 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:00 crc kubenswrapper[4947]: I1203 10:15:00.943405 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9"] Dec 03 10:15:01 crc kubenswrapper[4947]: I1203 10:15:01.298360 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" event={"ID":"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6","Type":"ContainerStarted","Data":"9b632ec2755977b97c17a03ace94a909b9a784af86587195dbb3044b92c7b1ca"} Dec 03 10:15:01 crc kubenswrapper[4947]: I1203 10:15:01.298653 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" event={"ID":"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6","Type":"ContainerStarted","Data":"d8175b007e4685d08f309db29cfff6e19fdd3927a0014f663b233aa7c032d739"} Dec 03 10:15:01 crc kubenswrapper[4947]: I1203 10:15:01.304109 4947 generic.go:334] "Generic (PLEG): container finished" podID="c762483a-2d87-4ac1-8fda-5f4044772e9a" containerID="75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e" exitCode=0 Dec 03 10:15:01 crc kubenswrapper[4947]: I1203 10:15:01.304163 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-snrdr/must-gather-kjn4s" event={"ID":"c762483a-2d87-4ac1-8fda-5f4044772e9a","Type":"ContainerDied","Data":"75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e"} Dec 03 10:15:01 crc kubenswrapper[4947]: I1203 10:15:01.304963 4947 scope.go:117] "RemoveContainer" containerID="75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e" Dec 03 10:15:01 crc kubenswrapper[4947]: I1203 10:15:01.328061 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" podStartSLOduration=1.328042913 
podStartE2EDuration="1.328042913s" podCreationTimestamp="2025-12-03 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 10:15:01.313632633 +0000 UTC m=+12362.574587069" watchObservedRunningTime="2025-12-03 10:15:01.328042913 +0000 UTC m=+12362.588997339" Dec 03 10:15:02 crc kubenswrapper[4947]: I1203 10:15:02.123894 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-snrdr_must-gather-kjn4s_c762483a-2d87-4ac1-8fda-5f4044772e9a/gather/0.log" Dec 03 10:15:02 crc kubenswrapper[4947]: I1203 10:15:02.319420 4947 generic.go:334] "Generic (PLEG): container finished" podID="b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6" containerID="9b632ec2755977b97c17a03ace94a909b9a784af86587195dbb3044b92c7b1ca" exitCode=0 Dec 03 10:15:02 crc kubenswrapper[4947]: I1203 10:15:02.319572 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" event={"ID":"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6","Type":"ContainerDied","Data":"9b632ec2755977b97c17a03ace94a909b9a784af86587195dbb3044b92c7b1ca"} Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.866677 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.876124 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b5rp\" (UniqueName: \"kubernetes.io/projected/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-kube-api-access-2b5rp\") pod \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.876226 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-secret-volume\") pod \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.876524 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-config-volume\") pod \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\" (UID: \"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6\") " Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.877863 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-config-volume" (OuterVolumeSpecName: "config-volume") pod "b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6" (UID: "b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.888821 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-kube-api-access-2b5rp" (OuterVolumeSpecName: "kube-api-access-2b5rp") pod "b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6" (UID: "b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6"). 
InnerVolumeSpecName "kube-api-access-2b5rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.889045 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6" (UID: "b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.977702 4947 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.977735 4947 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 10:15:03 crc kubenswrapper[4947]: I1203 10:15:03.977745 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b5rp\" (UniqueName: \"kubernetes.io/projected/b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6-kube-api-access-2b5rp\") on node \"crc\" DevicePath \"\"" Dec 03 10:15:04 crc kubenswrapper[4947]: I1203 10:15:04.340455 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" event={"ID":"b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6","Type":"ContainerDied","Data":"d8175b007e4685d08f309db29cfff6e19fdd3927a0014f663b233aa7c032d739"} Dec 03 10:15:04 crc kubenswrapper[4947]: I1203 10:15:04.340541 4947 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8175b007e4685d08f309db29cfff6e19fdd3927a0014f663b233aa7c032d739" Dec 03 10:15:04 crc kubenswrapper[4947]: I1203 10:15:04.340565 4947 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412615-8rjk9" Dec 03 10:15:04 crc kubenswrapper[4947]: I1203 10:15:04.426853 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff"] Dec 03 10:15:04 crc kubenswrapper[4947]: I1203 10:15:04.445546 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412570-fj6ff"] Dec 03 10:15:05 crc kubenswrapper[4947]: I1203 10:15:05.098408 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef663b37-7b25-4a1d-81e4-e33b06d90a30" path="/var/lib/kubelet/pods/ef663b37-7b25-4a1d-81e4-e33b06d90a30/volumes" Dec 03 10:15:11 crc kubenswrapper[4947]: I1203 10:15:11.559003 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-snrdr/must-gather-kjn4s"] Dec 03 10:15:11 crc kubenswrapper[4947]: I1203 10:15:11.561098 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-snrdr/must-gather-kjn4s" podUID="c762483a-2d87-4ac1-8fda-5f4044772e9a" containerName="copy" containerID="cri-o://3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118" gracePeriod=2 Dec 03 10:15:11 crc kubenswrapper[4947]: I1203 10:15:11.577131 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-snrdr/must-gather-kjn4s"] Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.061936 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-snrdr_must-gather-kjn4s_c762483a-2d87-4ac1-8fda-5f4044772e9a/copy/0.log" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.062694 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.175012 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c762483a-2d87-4ac1-8fda-5f4044772e9a-must-gather-output\") pod \"c762483a-2d87-4ac1-8fda-5f4044772e9a\" (UID: \"c762483a-2d87-4ac1-8fda-5f4044772e9a\") " Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.175644 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgrb\" (UniqueName: \"kubernetes.io/projected/c762483a-2d87-4ac1-8fda-5f4044772e9a-kube-api-access-xcgrb\") pod \"c762483a-2d87-4ac1-8fda-5f4044772e9a\" (UID: \"c762483a-2d87-4ac1-8fda-5f4044772e9a\") " Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.180812 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c762483a-2d87-4ac1-8fda-5f4044772e9a-kube-api-access-xcgrb" (OuterVolumeSpecName: "kube-api-access-xcgrb") pod "c762483a-2d87-4ac1-8fda-5f4044772e9a" (UID: "c762483a-2d87-4ac1-8fda-5f4044772e9a"). InnerVolumeSpecName "kube-api-access-xcgrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.278709 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgrb\" (UniqueName: \"kubernetes.io/projected/c762483a-2d87-4ac1-8fda-5f4044772e9a-kube-api-access-xcgrb\") on node \"crc\" DevicePath \"\"" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.386770 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c762483a-2d87-4ac1-8fda-5f4044772e9a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "c762483a-2d87-4ac1-8fda-5f4044772e9a" (UID: "c762483a-2d87-4ac1-8fda-5f4044772e9a"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.434312 4947 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-snrdr_must-gather-kjn4s_c762483a-2d87-4ac1-8fda-5f4044772e9a/copy/0.log" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.434961 4947 generic.go:334] "Generic (PLEG): container finished" podID="c762483a-2d87-4ac1-8fda-5f4044772e9a" containerID="3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118" exitCode=143 Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.435129 4947 scope.go:117] "RemoveContainer" containerID="3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.435344 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-snrdr/must-gather-kjn4s" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.480042 4947 scope.go:117] "RemoveContainer" containerID="75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.485570 4947 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c762483a-2d87-4ac1-8fda-5f4044772e9a-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.596346 4947 scope.go:117] "RemoveContainer" containerID="3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118" Dec 03 10:15:12 crc kubenswrapper[4947]: E1203 10:15:12.597182 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118\": container with ID starting with 3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118 not found: ID does not exist" 
containerID="3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.597233 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118"} err="failed to get container status \"3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118\": rpc error: code = NotFound desc = could not find container \"3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118\": container with ID starting with 3ed1ee23eeb3693feaafdd8abae6768cc2c5771a4fc4582c9185088f62717118 not found: ID does not exist" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.597266 4947 scope.go:117] "RemoveContainer" containerID="75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e" Dec 03 10:15:12 crc kubenswrapper[4947]: E1203 10:15:12.597668 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e\": container with ID starting with 75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e not found: ID does not exist" containerID="75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e" Dec 03 10:15:12 crc kubenswrapper[4947]: I1203 10:15:12.597698 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e"} err="failed to get container status \"75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e\": rpc error: code = NotFound desc = could not find container \"75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e\": container with ID starting with 75c93397c4e43e690ab83e54f6a436540bedb58a727ddb09631599dffc79945e not found: ID does not exist" Dec 03 10:15:13 crc kubenswrapper[4947]: I1203 10:15:13.094849 4947 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c762483a-2d87-4ac1-8fda-5f4044772e9a" path="/var/lib/kubelet/pods/c762483a-2d87-4ac1-8fda-5f4044772e9a/volumes" Dec 03 10:15:30 crc kubenswrapper[4947]: I1203 10:15:30.086242 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:15:30 crc kubenswrapper[4947]: I1203 10:15:30.086737 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 10:15:35 crc kubenswrapper[4947]: I1203 10:15:35.739743 4947 scope.go:117] "RemoveContainer" containerID="65dcc688c64324e9dd2fadfdccb34b59f41935e7a00e4a414bce422cae9c40ae" Dec 03 10:15:35 crc kubenswrapper[4947]: I1203 10:15:35.766990 4947 scope.go:117] "RemoveContainer" containerID="d240a3a9e2987b7b16d2fdb0351e83496022074fad511bec35c73807fd9f1305" Dec 03 10:16:00 crc kubenswrapper[4947]: I1203 10:16:00.086563 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:16:00 crc kubenswrapper[4947]: I1203 10:16:00.087271 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 03 10:16:00 crc kubenswrapper[4947]: I1203 10:16:00.087323 4947 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" Dec 03 10:16:00 crc kubenswrapper[4947]: I1203 10:16:00.088056 4947 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"18952965d88233ea1cd500ad7f5c57f60779f5beacd002313da06142d2b92b6a"} pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 10:16:00 crc kubenswrapper[4947]: I1203 10:16:00.088127 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" containerID="cri-o://18952965d88233ea1cd500ad7f5c57f60779f5beacd002313da06142d2b92b6a" gracePeriod=600 Dec 03 10:16:01 crc kubenswrapper[4947]: I1203 10:16:01.025662 4947 generic.go:334] "Generic (PLEG): container finished" podID="8384efba-0256-458d-8aab-627ad76e643e" containerID="18952965d88233ea1cd500ad7f5c57f60779f5beacd002313da06142d2b92b6a" exitCode=0 Dec 03 10:16:01 crc kubenswrapper[4947]: I1203 10:16:01.025714 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerDied","Data":"18952965d88233ea1cd500ad7f5c57f60779f5beacd002313da06142d2b92b6a"} Dec 03 10:16:01 crc kubenswrapper[4947]: I1203 10:16:01.026063 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" 
event={"ID":"8384efba-0256-458d-8aab-627ad76e643e","Type":"ContainerStarted","Data":"2cffe1abef8f7f55a7a04635ac1f4df4c0651db778ad04c5bf42428b1237cc2a"} Dec 03 10:16:01 crc kubenswrapper[4947]: I1203 10:16:01.026092 4947 scope.go:117] "RemoveContainer" containerID="5084cd433ef72c8432f50a22d6d90c9cf1090da36e4077036b9e4ebdb2113a0a" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.729037 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-89w6j"] Dec 03 10:16:28 crc kubenswrapper[4947]: E1203 10:16:28.729954 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c762483a-2d87-4ac1-8fda-5f4044772e9a" containerName="gather" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.729966 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c762483a-2d87-4ac1-8fda-5f4044772e9a" containerName="gather" Dec 03 10:16:28 crc kubenswrapper[4947]: E1203 10:16:28.729988 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c762483a-2d87-4ac1-8fda-5f4044772e9a" containerName="copy" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.729994 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="c762483a-2d87-4ac1-8fda-5f4044772e9a" containerName="copy" Dec 03 10:16:28 crc kubenswrapper[4947]: E1203 10:16:28.730013 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6" containerName="collect-profiles" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.730019 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6" containerName="collect-profiles" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.730201 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c762483a-2d87-4ac1-8fda-5f4044772e9a" containerName="gather" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.730216 4947 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b49ee4f8-6657-42e2-8d07-f63cfb7f1cd6" containerName="collect-profiles" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.730233 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="c762483a-2d87-4ac1-8fda-5f4044772e9a" containerName="copy" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.732067 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.745066 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89w6j"] Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.855364 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-catalog-content\") pod \"community-operators-89w6j\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.855439 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-utilities\") pod \"community-operators-89w6j\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.855829 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65g57\" (UniqueName: \"kubernetes.io/projected/86c2e069-1b19-44d5-81e7-64b526349b2d-kube-api-access-65g57\") pod \"community-operators-89w6j\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.958844 4947 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-catalog-content\") pod \"community-operators-89w6j\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.958912 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-utilities\") pod \"community-operators-89w6j\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.958989 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65g57\" (UniqueName: \"kubernetes.io/projected/86c2e069-1b19-44d5-81e7-64b526349b2d-kube-api-access-65g57\") pod \"community-operators-89w6j\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.959483 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-catalog-content\") pod \"community-operators-89w6j\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.959629 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-utilities\") pod \"community-operators-89w6j\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:28 crc kubenswrapper[4947]: I1203 10:16:28.982854 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-65g57\" (UniqueName: \"kubernetes.io/projected/86c2e069-1b19-44d5-81e7-64b526349b2d-kube-api-access-65g57\") pod \"community-operators-89w6j\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:29 crc kubenswrapper[4947]: I1203 10:16:29.054013 4947 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:29 crc kubenswrapper[4947]: I1203 10:16:29.687556 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89w6j"] Dec 03 10:16:29 crc kubenswrapper[4947]: W1203 10:16:29.694332 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86c2e069_1b19_44d5_81e7_64b526349b2d.slice/crio-5f78b776b85ce0c1c011e4daa0dfbae10ec60b0faa5101f4fe99c734a12e7de9 WatchSource:0}: Error finding container 5f78b776b85ce0c1c011e4daa0dfbae10ec60b0faa5101f4fe99c734a12e7de9: Status 404 returned error can't find the container with id 5f78b776b85ce0c1c011e4daa0dfbae10ec60b0faa5101f4fe99c734a12e7de9 Dec 03 10:16:30 crc kubenswrapper[4947]: I1203 10:16:30.377232 4947 generic.go:334] "Generic (PLEG): container finished" podID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerID="2423932a8622a6e829e57596d31b781a248cc599bd9e14c44666af1a055bbeb3" exitCode=0 Dec 03 10:16:30 crc kubenswrapper[4947]: I1203 10:16:30.377294 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89w6j" event={"ID":"86c2e069-1b19-44d5-81e7-64b526349b2d","Type":"ContainerDied","Data":"2423932a8622a6e829e57596d31b781a248cc599bd9e14c44666af1a055bbeb3"} Dec 03 10:16:30 crc kubenswrapper[4947]: I1203 10:16:30.378177 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89w6j" 
event={"ID":"86c2e069-1b19-44d5-81e7-64b526349b2d","Type":"ContainerStarted","Data":"5f78b776b85ce0c1c011e4daa0dfbae10ec60b0faa5101f4fe99c734a12e7de9"} Dec 03 10:16:30 crc kubenswrapper[4947]: I1203 10:16:30.379527 4947 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 10:16:31 crc kubenswrapper[4947]: I1203 10:16:31.390662 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89w6j" event={"ID":"86c2e069-1b19-44d5-81e7-64b526349b2d","Type":"ContainerStarted","Data":"1625926ef813f00ddbdb327a7681fa2ca87921b9618f5db26cb5081125c3c87a"} Dec 03 10:16:32 crc kubenswrapper[4947]: I1203 10:16:32.403544 4947 generic.go:334] "Generic (PLEG): container finished" podID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerID="1625926ef813f00ddbdb327a7681fa2ca87921b9618f5db26cb5081125c3c87a" exitCode=0 Dec 03 10:16:32 crc kubenswrapper[4947]: I1203 10:16:32.403590 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89w6j" event={"ID":"86c2e069-1b19-44d5-81e7-64b526349b2d","Type":"ContainerDied","Data":"1625926ef813f00ddbdb327a7681fa2ca87921b9618f5db26cb5081125c3c87a"} Dec 03 10:16:33 crc kubenswrapper[4947]: I1203 10:16:33.415600 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89w6j" event={"ID":"86c2e069-1b19-44d5-81e7-64b526349b2d","Type":"ContainerStarted","Data":"49d738ccd33bf45204bd989bdabeeac7ffad1b62b0c9b4b3b7c3b17ac344eb74"} Dec 03 10:16:33 crc kubenswrapper[4947]: I1203 10:16:33.449814 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-89w6j" podStartSLOduration=2.850501996 podStartE2EDuration="5.449794923s" podCreationTimestamp="2025-12-03 10:16:28 +0000 UTC" firstStartedPulling="2025-12-03 10:16:30.379300412 +0000 UTC m=+12451.640254838" lastFinishedPulling="2025-12-03 10:16:32.978593339 +0000 UTC 
m=+12454.239547765" observedRunningTime="2025-12-03 10:16:33.444813999 +0000 UTC m=+12454.705768455" watchObservedRunningTime="2025-12-03 10:16:33.449794923 +0000 UTC m=+12454.710749349" Dec 03 10:16:39 crc kubenswrapper[4947]: I1203 10:16:39.059441 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:39 crc kubenswrapper[4947]: I1203 10:16:39.059903 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:39 crc kubenswrapper[4947]: I1203 10:16:39.124226 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:39 crc kubenswrapper[4947]: I1203 10:16:39.540077 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:39 crc kubenswrapper[4947]: I1203 10:16:39.596377 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89w6j"] Dec 03 10:16:41 crc kubenswrapper[4947]: I1203 10:16:41.501804 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-89w6j" podUID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerName="registry-server" containerID="cri-o://49d738ccd33bf45204bd989bdabeeac7ffad1b62b0c9b4b3b7c3b17ac344eb74" gracePeriod=2 Dec 03 10:16:42 crc kubenswrapper[4947]: I1203 10:16:42.516761 4947 generic.go:334] "Generic (PLEG): container finished" podID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerID="49d738ccd33bf45204bd989bdabeeac7ffad1b62b0c9b4b3b7c3b17ac344eb74" exitCode=0 Dec 03 10:16:42 crc kubenswrapper[4947]: I1203 10:16:42.516815 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89w6j" 
event={"ID":"86c2e069-1b19-44d5-81e7-64b526349b2d","Type":"ContainerDied","Data":"49d738ccd33bf45204bd989bdabeeac7ffad1b62b0c9b4b3b7c3b17ac344eb74"} Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.155910 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.185922 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-catalog-content\") pod \"86c2e069-1b19-44d5-81e7-64b526349b2d\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.186062 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65g57\" (UniqueName: \"kubernetes.io/projected/86c2e069-1b19-44d5-81e7-64b526349b2d-kube-api-access-65g57\") pod \"86c2e069-1b19-44d5-81e7-64b526349b2d\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.186161 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-utilities\") pod \"86c2e069-1b19-44d5-81e7-64b526349b2d\" (UID: \"86c2e069-1b19-44d5-81e7-64b526349b2d\") " Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.187903 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-utilities" (OuterVolumeSpecName: "utilities") pod "86c2e069-1b19-44d5-81e7-64b526349b2d" (UID: "86c2e069-1b19-44d5-81e7-64b526349b2d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.196764 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c2e069-1b19-44d5-81e7-64b526349b2d-kube-api-access-65g57" (OuterVolumeSpecName: "kube-api-access-65g57") pod "86c2e069-1b19-44d5-81e7-64b526349b2d" (UID: "86c2e069-1b19-44d5-81e7-64b526349b2d"). InnerVolumeSpecName "kube-api-access-65g57". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.243086 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86c2e069-1b19-44d5-81e7-64b526349b2d" (UID: "86c2e069-1b19-44d5-81e7-64b526349b2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.288437 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.288750 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86c2e069-1b19-44d5-81e7-64b526349b2d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.288819 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65g57\" (UniqueName: \"kubernetes.io/projected/86c2e069-1b19-44d5-81e7-64b526349b2d-kube-api-access-65g57\") on node \"crc\" DevicePath \"\"" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.528562 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89w6j" 
event={"ID":"86c2e069-1b19-44d5-81e7-64b526349b2d","Type":"ContainerDied","Data":"5f78b776b85ce0c1c011e4daa0dfbae10ec60b0faa5101f4fe99c734a12e7de9"} Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.528629 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89w6j" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.528662 4947 scope.go:117] "RemoveContainer" containerID="49d738ccd33bf45204bd989bdabeeac7ffad1b62b0c9b4b3b7c3b17ac344eb74" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.556942 4947 scope.go:117] "RemoveContainer" containerID="1625926ef813f00ddbdb327a7681fa2ca87921b9618f5db26cb5081125c3c87a" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.573362 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89w6j"] Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.597217 4947 scope.go:117] "RemoveContainer" containerID="2423932a8622a6e829e57596d31b781a248cc599bd9e14c44666af1a055bbeb3" Dec 03 10:16:43 crc kubenswrapper[4947]: I1203 10:16:43.610575 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-89w6j"] Dec 03 10:16:45 crc kubenswrapper[4947]: I1203 10:16:45.103615 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c2e069-1b19-44d5-81e7-64b526349b2d" path="/var/lib/kubelet/pods/86c2e069-1b19-44d5-81e7-64b526349b2d/volumes" Dec 03 10:18:00 crc kubenswrapper[4947]: I1203 10:18:00.085877 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 10:18:00 crc kubenswrapper[4947]: I1203 10:18:00.086439 4947 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 10:18:20 crc kubenswrapper[4947]: I1203 10:18:20.919415 4947 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fbh55"] Dec 03 10:18:20 crc kubenswrapper[4947]: E1203 10:18:20.926388 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerName="registry-server" Dec 03 10:18:20 crc kubenswrapper[4947]: I1203 10:18:20.926442 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerName="registry-server" Dec 03 10:18:20 crc kubenswrapper[4947]: E1203 10:18:20.926525 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerName="extract-utilities" Dec 03 10:18:20 crc kubenswrapper[4947]: I1203 10:18:20.926537 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerName="extract-utilities" Dec 03 10:18:20 crc kubenswrapper[4947]: E1203 10:18:20.926591 4947 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerName="extract-content" Dec 03 10:18:20 crc kubenswrapper[4947]: I1203 10:18:20.926603 4947 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerName="extract-content" Dec 03 10:18:20 crc kubenswrapper[4947]: I1203 10:18:20.926996 4947 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c2e069-1b19-44d5-81e7-64b526349b2d" containerName="registry-server" Dec 03 10:18:20 crc kubenswrapper[4947]: I1203 10:18:20.929650 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:20 crc kubenswrapper[4947]: I1203 10:18:20.960560 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbh55"] Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.020738 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5znz5\" (UniqueName: \"kubernetes.io/projected/b0991512-ec23-4d1b-9daf-c1082e52fb12-kube-api-access-5znz5\") pod \"redhat-operators-fbh55\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.020831 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-utilities\") pod \"redhat-operators-fbh55\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.020879 4947 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-catalog-content\") pod \"redhat-operators-fbh55\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.122817 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-utilities\") pod \"redhat-operators-fbh55\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.122965 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-catalog-content\") pod \"redhat-operators-fbh55\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.123474 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-utilities\") pod \"redhat-operators-fbh55\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.123576 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-catalog-content\") pod \"redhat-operators-fbh55\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.123814 4947 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5znz5\" (UniqueName: \"kubernetes.io/projected/b0991512-ec23-4d1b-9daf-c1082e52fb12-kube-api-access-5znz5\") pod \"redhat-operators-fbh55\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.146244 4947 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5znz5\" (UniqueName: \"kubernetes.io/projected/b0991512-ec23-4d1b-9daf-c1082e52fb12-kube-api-access-5znz5\") pod \"redhat-operators-fbh55\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.278045 4947 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:21 crc kubenswrapper[4947]: W1203 10:18:21.831673 4947 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0991512_ec23_4d1b_9daf_c1082e52fb12.slice/crio-3ac91b03590b5309de18533a579f417f8a5493210ffb8fcc43cf800533e2804e WatchSource:0}: Error finding container 3ac91b03590b5309de18533a579f417f8a5493210ffb8fcc43cf800533e2804e: Status 404 returned error can't find the container with id 3ac91b03590b5309de18533a579f417f8a5493210ffb8fcc43cf800533e2804e Dec 03 10:18:21 crc kubenswrapper[4947]: I1203 10:18:21.836384 4947 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fbh55"] Dec 03 10:18:22 crc kubenswrapper[4947]: I1203 10:18:22.822177 4947 generic.go:334] "Generic (PLEG): container finished" podID="b0991512-ec23-4d1b-9daf-c1082e52fb12" containerID="6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412" exitCode=0 Dec 03 10:18:22 crc kubenswrapper[4947]: I1203 10:18:22.822408 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbh55" event={"ID":"b0991512-ec23-4d1b-9daf-c1082e52fb12","Type":"ContainerDied","Data":"6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412"} Dec 03 10:18:22 crc kubenswrapper[4947]: I1203 10:18:22.822508 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbh55" event={"ID":"b0991512-ec23-4d1b-9daf-c1082e52fb12","Type":"ContainerStarted","Data":"3ac91b03590b5309de18533a579f417f8a5493210ffb8fcc43cf800533e2804e"} Dec 03 10:18:23 crc kubenswrapper[4947]: I1203 10:18:23.840389 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbh55" 
event={"ID":"b0991512-ec23-4d1b-9daf-c1082e52fb12","Type":"ContainerStarted","Data":"d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b"} Dec 03 10:18:25 crc kubenswrapper[4947]: I1203 10:18:25.865079 4947 generic.go:334] "Generic (PLEG): container finished" podID="b0991512-ec23-4d1b-9daf-c1082e52fb12" containerID="d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b" exitCode=0 Dec 03 10:18:25 crc kubenswrapper[4947]: I1203 10:18:25.865153 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbh55" event={"ID":"b0991512-ec23-4d1b-9daf-c1082e52fb12","Type":"ContainerDied","Data":"d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b"} Dec 03 10:18:26 crc kubenswrapper[4947]: I1203 10:18:26.881842 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbh55" event={"ID":"b0991512-ec23-4d1b-9daf-c1082e52fb12","Type":"ContainerStarted","Data":"27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85"} Dec 03 10:18:26 crc kubenswrapper[4947]: I1203 10:18:26.916407 4947 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fbh55" podStartSLOduration=3.371943674 podStartE2EDuration="6.916391635s" podCreationTimestamp="2025-12-03 10:18:20 +0000 UTC" firstStartedPulling="2025-12-03 10:18:22.825742483 +0000 UTC m=+12564.086696929" lastFinishedPulling="2025-12-03 10:18:26.370190444 +0000 UTC m=+12567.631144890" observedRunningTime="2025-12-03 10:18:26.914947876 +0000 UTC m=+12568.175902312" watchObservedRunningTime="2025-12-03 10:18:26.916391635 +0000 UTC m=+12568.177346061" Dec 03 10:18:30 crc kubenswrapper[4947]: I1203 10:18:30.086537 4947 patch_prober.go:28] interesting pod/machine-config-daemon-qv8tj container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 03 10:18:30 crc kubenswrapper[4947]: I1203 10:18:30.088681 4947 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qv8tj" podUID="8384efba-0256-458d-8aab-627ad76e643e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 10:18:31 crc kubenswrapper[4947]: I1203 10:18:31.278378 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:31 crc kubenswrapper[4947]: I1203 10:18:31.278575 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:32 crc kubenswrapper[4947]: I1203 10:18:32.338290 4947 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fbh55" podUID="b0991512-ec23-4d1b-9daf-c1082e52fb12" containerName="registry-server" probeResult="failure" output=< Dec 03 10:18:32 crc kubenswrapper[4947]: timeout: failed to connect service ":50051" within 1s Dec 03 10:18:32 crc kubenswrapper[4947]: > Dec 03 10:18:41 crc kubenswrapper[4947]: I1203 10:18:41.395145 4947 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:41 crc kubenswrapper[4947]: I1203 10:18:41.491017 4947 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:41 crc kubenswrapper[4947]: I1203 10:18:41.652734 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbh55"] Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.109435 4947 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fbh55" 
podUID="b0991512-ec23-4d1b-9daf-c1082e52fb12" containerName="registry-server" containerID="cri-o://27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85" gracePeriod=2 Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.599773 4947 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.741989 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-utilities\") pod \"b0991512-ec23-4d1b-9daf-c1082e52fb12\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.742409 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5znz5\" (UniqueName: \"kubernetes.io/projected/b0991512-ec23-4d1b-9daf-c1082e52fb12-kube-api-access-5znz5\") pod \"b0991512-ec23-4d1b-9daf-c1082e52fb12\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.742551 4947 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-catalog-content\") pod \"b0991512-ec23-4d1b-9daf-c1082e52fb12\" (UID: \"b0991512-ec23-4d1b-9daf-c1082e52fb12\") " Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.743271 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-utilities" (OuterVolumeSpecName: "utilities") pod "b0991512-ec23-4d1b-9daf-c1082e52fb12" (UID: "b0991512-ec23-4d1b-9daf-c1082e52fb12"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.748187 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0991512-ec23-4d1b-9daf-c1082e52fb12-kube-api-access-5znz5" (OuterVolumeSpecName: "kube-api-access-5znz5") pod "b0991512-ec23-4d1b-9daf-c1082e52fb12" (UID: "b0991512-ec23-4d1b-9daf-c1082e52fb12"). InnerVolumeSpecName "kube-api-access-5znz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.755011 4947 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.755039 4947 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5znz5\" (UniqueName: \"kubernetes.io/projected/b0991512-ec23-4d1b-9daf-c1082e52fb12-kube-api-access-5znz5\") on node \"crc\" DevicePath \"\"" Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.854629 4947 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0991512-ec23-4d1b-9daf-c1082e52fb12" (UID: "b0991512-ec23-4d1b-9daf-c1082e52fb12"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 10:18:43 crc kubenswrapper[4947]: I1203 10:18:43.856979 4947 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0991512-ec23-4d1b-9daf-c1082e52fb12-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.126000 4947 generic.go:334] "Generic (PLEG): container finished" podID="b0991512-ec23-4d1b-9daf-c1082e52fb12" containerID="27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85" exitCode=0 Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.126070 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbh55" event={"ID":"b0991512-ec23-4d1b-9daf-c1082e52fb12","Type":"ContainerDied","Data":"27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85"} Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.126111 4947 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fbh55" event={"ID":"b0991512-ec23-4d1b-9daf-c1082e52fb12","Type":"ContainerDied","Data":"3ac91b03590b5309de18533a579f417f8a5493210ffb8fcc43cf800533e2804e"} Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.126125 4947 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fbh55" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.126134 4947 scope.go:117] "RemoveContainer" containerID="27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.177991 4947 scope.go:117] "RemoveContainer" containerID="d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.192144 4947 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fbh55"] Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.206274 4947 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fbh55"] Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.213757 4947 scope.go:117] "RemoveContainer" containerID="6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.264583 4947 scope.go:117] "RemoveContainer" containerID="27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85" Dec 03 10:18:44 crc kubenswrapper[4947]: E1203 10:18:44.265089 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85\": container with ID starting with 27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85 not found: ID does not exist" containerID="27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.265125 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85"} err="failed to get container status \"27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85\": rpc error: code = NotFound desc = could not find container 
\"27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85\": container with ID starting with 27738f46f5deea6d92c0264da1c41710b2b9040aaeeceed64344865a6bd52f85 not found: ID does not exist" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.265148 4947 scope.go:117] "RemoveContainer" containerID="d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b" Dec 03 10:18:44 crc kubenswrapper[4947]: E1203 10:18:44.265520 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b\": container with ID starting with d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b not found: ID does not exist" containerID="d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.265554 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b"} err="failed to get container status \"d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b\": rpc error: code = NotFound desc = could not find container \"d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b\": container with ID starting with d3966a0230f1700bf284e413a5b8ce15a31ec3558cbd1db41647f9a2eb3ad37b not found: ID does not exist" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.265599 4947 scope.go:117] "RemoveContainer" containerID="6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412" Dec 03 10:18:44 crc kubenswrapper[4947]: E1203 10:18:44.265876 4947 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412\": container with ID starting with 6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412 not found: ID does not exist" 
containerID="6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412" Dec 03 10:18:44 crc kubenswrapper[4947]: I1203 10:18:44.265927 4947 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412"} err="failed to get container status \"6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412\": rpc error: code = NotFound desc = could not find container \"6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412\": container with ID starting with 6f97f6b29c8c999f82454c085564ef963e72810b8a1ab083b1e2d4db7122b412 not found: ID does not exist" Dec 03 10:18:45 crc kubenswrapper[4947]: I1203 10:18:45.105308 4947 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0991512-ec23-4d1b-9daf-c1082e52fb12" path="/var/lib/kubelet/pods/b0991512-ec23-4d1b-9daf-c1082e52fb12/volumes"